Abstract
If one has a multiclass classification problem and wants to boost a multiclass base classifier, AdaBoost.M1 is a well-known and widely applied boosting algorithm. However, AdaBoost.M1 does not work if the base classifier is too weak. We show that by modifying only one line of AdaBoost.M1, one can make it usable for weak base classifiers as well. The resulting classifier, AdaBoost.M1W, is guaranteed to minimize an upper bound on a performance measure called the guessing error, as long as the base classifier is better than random guessing. The usefulness of AdaBoost.M1W is clearly demonstrated experimentally.
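The "one line" modification mentioned in the abstract can be illustrated with a small sketch. The Python code below is an illustrative assumption, not the paper's reference implementation: it assumes the changed line is the classifier-weight update, where an added ln(K − 1) term relaxes the admissibility condition on the base classifier from weighted error below 1/2 to weighted error below 1 − 1/K, i.e. merely better than random guessing over K classes. All names (`boost_m1w`, `predict`) and the choice of decision stumps as the weak base classifier are hypothetical.

```python
# Hedged sketch of an AdaBoost.M1-style boosting loop for K classes.
# Assumption: the "one line" change is the alpha update with an extra ln(K - 1),
# so the base classifier only needs error < 1 - 1/K instead of error < 1/2.
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def boost_m1w(X, y, n_rounds=50):
    """Illustrative multiclass boosting sketch (labels assumed to be 0..K-1)."""
    y = np.asarray(y)
    n = len(y)
    K = len(np.unique(y))
    w = np.full(n, 1.0 / n)              # example weights, initially uniform
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)   # deliberately weak base classifier
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        miss = pred != y
        err = np.dot(w, miss)            # weighted training error of this round
        if err >= 1.0 - 1.0 / K:         # no better than random guessing: stop
            break
        # Plain AdaBoost.M1 would use alpha = ln((1 - err) / err) and need err < 1/2;
        # the assumed modification adds ln(K - 1), so err < 1 - 1/K suffices.
        alpha = np.log((1.0 - err) / max(err, 1e-12)) + np.log(K - 1.0)
        w *= np.exp(alpha * miss)        # up-weight misclassified examples
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas, K


def predict(learners, alphas, K, X):
    """Weighted vote over the base classifiers (assumes integer labels 0..K-1)."""
    votes = np.zeros((len(X), K))
    for stump, alpha in zip(learners, alphas):
        votes[np.arange(len(X)), stump.predict(X).astype(int)] += alpha
    return votes.argmax(axis=1)
```

With decision stumps on a many-class problem, the per-round error typically exceeds 1/2, so the unmodified AdaBoost.M1 loop would stop immediately; under the assumed weight update the ensemble can continue as long as the stump beats the 1 − 1/K random-guessing baseline.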
| Original language | English |
|---|---|
| Pages | 72-83 |
| Publication status | Published - 2003 |
| Event | European Conference on Machine Learning, Helsinki, Finland (Duration: 12 Aug 2002 → …) |
Conference
| Conference | European Conference on Machine Learning |
|---|---|
| Country/Territory | Finland |
| Period | 12/08/02 → … |
Classification according to Österreichische Systematik der Wissenschaftszweige (ÖFOS 2012)
- 102019 Machine learning
Applied Research Level (ARL)
- ARL Level 2 - Description of the application of a principle
Research focus/foci
- Not applicable