How to make AdaBoost.M1 work for weak base classifiers by changing only one line of the code

Günther Eibl, Karl Peter Pfeiffer

Research output: Contribution to conference › Paper › peer-review

Abstract

If one has a multiclass classification problem and wants to boost a multiclass base classifier, AdaBoost.M1 is a well-known and widely applied boosting algorithm. However, AdaBoost.M1 does not work if the base classifier is too weak. We show that by modifying only one line of AdaBoost.M1 one can make it usable for weak base classifiers, too. The resulting classifier, AdaBoost.M1W, is guaranteed to minimize an upper bound on a performance measure, called the guessing error, as long as the base classifier is better than random guessing. The usability of AdaBoost.M1W is clearly demonstrated experimentally.
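The abstract does not spell out which line changes; in a standard AdaBoost.M1 training loop the natural candidate is the formula for the base-classifier voting weight. The following sketch is only an illustration of that reading: it assumes (not stated in the abstract) that the modification replaces log((1 - err) / err) with log((K - 1)(1 - err) / err), so the weight stays positive exactly when the weighted error beats K-class random guessing, i.e. err < (K - 1)/K, instead of err < 1/2. The `base_learner` interface and integer class labels 0..K-1 are hypothetical choices made for the example.

```python
import numpy as np


def adaboost_m1w(X, y, base_learner, n_rounds, n_classes):
    """Boosting loop; only the line computing alpha differs from AdaBoost.M1.

    base_learner is assumed to be a callable that fits on (X, y) with
    per-example sample weights and returns an object with a .predict method.
    """
    n = len(y)
    weights = np.full(n, 1.0 / n)  # start with uniform example weights
    models, alphas = [], []
    for _ in range(n_rounds):
        model = base_learner(X, y, sample_weight=weights)
        miss = model.predict(X) != y
        err = np.dot(weights, miss)  # weighted training error
        # Stop if the base classifier is perfect or no better than
        # K-class random guessing (error >= (K-1)/K).
        if err <= 0 or err >= (n_classes - 1) / n_classes:
            break
        # AdaBoost.M1:   alpha = np.log((1 - err) / err)
        # AdaBoost.M1W (the single changed line, as assumed here):
        alpha = np.log((n_classes - 1) * (1 - err) / err)
        weights *= np.exp(alpha * miss)  # up-weight misclassified examples
        weights /= weights.sum()
        models.append(model)
        alphas.append(alpha)
    return models, alphas


def predict(models, alphas, X, n_classes):
    """Weighted majority vote over the boosted base classifiers."""
    votes = np.zeros((len(X), n_classes))
    for model, alpha in zip(models, alphas):
        votes[np.arange(len(X)), model.predict(X)] += alpha
    return votes.argmax(axis=1)
```

With this form of alpha, the same update rule that fails in AdaBoost.M1 once the weighted error exceeds 1/2 still assigns positive voting weight to any base classifier that merely beats random guessing, which matches the condition stated in the abstract.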
Original language: English
Pages: 72-83
Publication status: Published - 2003
Event: European Conference on Machine Learning - Helsinki, Finland
Duration: 12 Aug 2002 → …

Conference

Conference: European Conference on Machine Learning
Country/Territory: Finland
Period: 12/08/02 → …

Classification according to Österreichische Systematik der Wissenschaftszweige (ÖFOS 2012)

  • 102019 Machine learning

Applied Research Level (ARL)

  • ARL Level 2 - Description of the application of a principle

Research focus/foci

  • Not applicable
