I’m presenting an overview/tutorial on AdaBoost (the vote was 3-2-1: AdaBoost, Bayes nets, unlabeled data). I’ll give some background, cover the basic AdaBoost algorithm, present some theoretical results, discuss practical issues such as the complexity of the weak learner and noisy data, and describe some of the AdaBoost variants. I’ll focus on confidence-weighted AdaBoost in the two-class case. Recommended reading is any one of the following, all of which can be found here.
Robert E. Schapire.
The boosting approach to machine learning: An overview.
In MSRI Workshop on Nonlinear Estimation and Classification, 2002.
Jerome Friedman, Trevor Hastie and Robert Tibshirani.
Additive logistic regression: a statistical view of boosting.
The Annals of Statistics, 28(2):337-407, April 2000.
Robert E. Schapire and Yoram Singer.
Improved boosting algorithms using confidence-rated predictions.
Machine Learning, 37(3):297-336, 1999.
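To fix ideas before the talk, here is a minimal sketch of the basic two-class (discrete) AdaBoost algorithm with decision stumps as the weak learner. The toy 1-D dataset, function names, and stump representation are all illustrative, not from any of the papers above; the confidence-weighted variant covered in the talk differs in letting the weak hypotheses output real-valued scores.

```python
import math

def train_stump(xs, ys, w):
    """Return the (weighted error, threshold, polarity) of the best stump.

    A stump predicts `polarity` when x < threshold and `-polarity` otherwise.
    """
    best = None
    for thr in sorted(set(xs)):
        for pol in (1, -1):
            err = sum(wi for wi, x, y in zip(w, xs, ys)
                      if (pol if x < thr else -pol) != y)
            if best is None or err < best[0]:
                best = (err, thr, pol)
    return best

def adaboost(xs, ys, rounds=10):
    """Train an ensemble of weighted stumps by basic two-class AdaBoost."""
    n = len(xs)
    w = [1.0 / n] * n                     # start with uniform weights
    ensemble = []                         # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        err, thr, pol = train_stump(xs, ys, w)
        err = max(err, 1e-12)             # guard against division by zero
        if err >= 0.5:                    # weak learner no better than chance
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, thr, pol))
        # Reweight: misclassified points gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * y * (pol if x < thr else -pol))
             for wi, x, y in zip(w, xs, ys)]
        z = sum(w)                        # normalize back to a distribution
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    """Sign of the alpha-weighted vote of the stumps."""
    score = sum(a * (pol if x < thr else -pol) for a, thr, pol in ensemble)
    return 1 if score >= 0 else -1
```

On a dataset like `xs = [1,2,3,4,5,6,7,8]`, `ys = [1,1,-1,-1,-1,-1,1,1]`, no single stump can be perfect (the positives sit on both sides of the negatives), but a few boosting rounds combine stumps into a classifier with zero training error, illustrating why boosting can succeed where any individual weak hypothesis fails.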