Boosting with Noisy Data: Some Views from Statistical Theory

Wenxin Jiang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

This letter is a comprehensive account of some recent findings about AdaBoost in the presence of noisy data, approached from the perspective of statistical theory. We start from the basic assumption of weak hypotheses used in AdaBoost and study its validity and its implications for the generalization error. We recommend studying the generalization error and comparing it to the optimal Bayes error when data are noisy. Analytic examples are provided to show that running the unmodified AdaBoost forever leads to overfitting. On the other hand, there exist regularized versions of AdaBoost that are consistent, in the sense that the resulting prediction will approximately attain the optimal Bayes performance in the limit of large training samples.
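As a rough illustration of the abstract's claim (this is not code from the paper), the sketch below runs plain AdaBoost with decision stumps on a one-dimensional toy problem with 20% label noise, so the Bayes error is 0.2 by construction. Comparing a small number of rounds with a large one shows the tendency of the unregularized procedure to drift away from the Bayes error as rounds accumulate; capping the rounds (early stopping) stands in here for the regularized variants the letter analyzes. The data-generating process, the stump weak learner, and all names are illustrative assumptions.

# Minimal AdaBoost sketch for the noisy-data setting described in the abstract.
# Assumptions for illustration only: 1-D threshold data with independent 20%
# label noise (Bayes error = 0.2) and decision stumps as weak hypotheses.
import numpy as np

rng = np.random.default_rng(0)

def make_noisy_data(n, noise=0.2):
    # Clean labels are sign(x); each label flips independently with prob. `noise`,
    # so the best possible (Bayes) test error is `noise`.
    x = rng.uniform(-1, 1, size=n)
    y = np.where(x > 0, 1, -1)
    y[rng.random(n) < noise] *= -1
    return x, y

def best_stump(x, y, w):
    # Exhaustive search for the stump h(x) = s * sign(x - t) with the
    # smallest weighted training error.
    best = (np.inf, 0.0, 1)
    for t in np.unique(x):
        for s in (1, -1):
            pred = s * np.sign(x - t)
            pred[pred == 0] = s
            err = np.sum(w * (pred != y))
            if err < best[0]:
                best = (err, t, s)
    return best

def adaboost(x, y, rounds):
    # Plain (unregularized) AdaBoost: reweight the sample after each round,
    # up-weighting the points the current stump misclassifies.
    n = len(x)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        err, t, s = best_stump(x, y, w)
        err = max(err, 1e-12)                 # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = s * np.sign(x - t)
        pred[pred == 0] = s
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, t, s))
    return ensemble

def predict(ensemble, x):
    # Weighted majority vote of the stumps.
    f = np.zeros_like(x, dtype=float)
    for alpha, t, s in ensemble:
        pred = s * np.sign(x - t)
        f += alpha * np.where(pred == 0, s, pred)
    return np.where(f >= 0, 1, -1)

x_tr, y_tr = make_noisy_data(200, noise=0.2)
x_te, y_te = make_noisy_data(5000, noise=0.2)
for rounds in (5, 500):                       # early stop vs. many rounds
    ens = adaboost(x_tr, y_tr, rounds)
    test_err = np.mean(predict(ens, x_te) != y_te)
    print(f"rounds={rounds:4d}  test error={test_err:.3f}  (Bayes error = 0.200)")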

Original language: English (US)
Pages (from-to): 789-810
Number of pages: 22
Journal: Neural Computation
Volume: 16
Issue number: 4
DOIs
State: Published - Apr 2004

ASJC Scopus subject areas

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience

