
GML AdaBoost Matlab Toolbox


This project is devoted to creating an easy and convenient Matlab-based toolbox for investigating AdaBoost-based machine learning algorithms.

Download

Download GML AdaBoost Matlab Toolbox 0.4 (from Alexander Vezhnevets' homepage at ETH Zurich).

Download GML AdaBoost Matlab Toolbox 0.3

Download GML AdaBoost Matlab Toolbox 0.2

GML AdaBoost Matlab Toolbox is a set of Matlab functions and classes implementing a family of classification algorithms known as boosting.

Implemented algorithms

So far we have implemented three different boosting schemes: Real AdaBoost, Gentle AdaBoost and Modest AdaBoost. A short usage sketch is given after the list below.

  • Real AdaBoost (see [2] for a full description) is a generalization of the basic AdaBoost algorithm first introduced by Freund and Schapire [1]. Real AdaBoost can be treated as the basic “hardcore” boosting algorithm.
  • Gentle AdaBoost is a more robust and stable version of Real AdaBoost (see [3] for a full description). So far it has been the most practically efficient boosting algorithm, used, for example, in the Viola-Jones object detector [4]. Our experiments show that Gentle AdaBoost performs slightly better than Real AdaBoost on regular data, is considerably better on noisy data, and is much more resistant to outliers.
  • Modest AdaBoost (see [5] for a full description) is a regularized modification of AdaBoost, aimed mostly at better generalization and resistance to overfitting. Our experiments show that in terms of test error and overfitting this algorithm outperforms both Real and Gentle AdaBoost.
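As a rough illustration of how the three schemes are used together with the tree weak learner described in the next section, here is a minimal Matlab sketch. The names tree_node_w, RealAdaBoost, GentleAdaBoost, ModestAdaBoost and Classify follow the demo scripts shipped with the toolbox; treat the exact signatures as assumptions and check them against the demo code in the archive you download.

    % Minimal usage sketch; function names follow the toolbox demo scripts,
    % so verify them against the version you downloaded.
    % Data is a (features x samples) matrix, Labels is a row vector of +1/-1.
    file_data = load('Ionosphere.txt');        % any two-class dataset, e.g. from the UCI repository [6]
    Data      = file_data(:, 1:end-1)';
    Labels    = file_data(:, end)' * 2 - 1;    % map numeric {0,1} labels to {-1,+1}

    MaxIter      = 100;                        % number of boosting iterations
    weak_learner = tree_node_w(3);             % classification tree with up to 3 splits

    % Train the three boosting schemes on the same data.
    [RLearners, RWeights] = RealAdaBoost(weak_learner, Data, Labels, MaxIter);
    [GLearners, GWeights] = GentleAdaBoost(weak_learner, Data, Labels, MaxIter);
    [MLearners, MWeights] = ModestAdaBoost(weak_learner, Data, Labels, MaxIter);

    % Classify (shown on the training data for brevity; use a held-out set in practice).
    RResult = sign(Classify(RLearners, RWeights, Data));
    GResult = sign(Classify(GLearners, GWeights, Data));
    MResult = sign(Classify(MLearners, MWeights, Data));

    fprintf('Training error: Real %.3f, Gentle %.3f, Modest %.3f\n', ...
            mean(RResult ~= Labels), mean(GResult ~= Labels), mean(MResult ~= Labels));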

Available weak learners

We have implemented a classification tree as a weak learner.
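The number of splits is the main control over weak-learner complexity. Assuming the tree_node_w constructor used in the sketch above (an assumption to verify against your version), choosing it might look like this:

    % Illustration only, assuming the tree_node_w constructor from the sketch above;
    % the argument bounds the number of splits in each tree.
    stump      = tree_node_w(1);   % a decision stump: a single split
    small_tree = tree_node_w(3);   % a classification tree with up to 3 splits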

Additional functionalities

Alongside the three boosting algorithms, we also provide a class that gives you an easy way to run a cross-validation test.
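We do not reproduce the class interface here; as a generic sketch of the idea (not the toolbox's own class API), a k-fold loop around the training functions from the sketch above might look like this:

    % Generic k-fold cross-validation sketch; the toolbox's own cross-validation
    % class wraps this kind of loop, but its exact interface may differ.
    K     = 5;
    N     = size(Data, 2);
    folds = mod(randperm(N), K) + 1;           % randomly assign each sample to one of K folds
    err   = zeros(1, K);

    for k = 1:K
        test  = (folds == k);                  % logical mask of the held-out fold
        train = ~test;

        learner             = tree_node_w(3);
        [Learners, Weights] = GentleAdaBoost(learner, Data(:, train), Labels(train), MaxIter);
        prediction          = sign(Classify(Learners, Weights, Data(:, test)));
        err(k)              = mean(prediction ~= Labels(test));
    end

    fprintf('Mean %d-fold test error: %.3f\n', K, mean(err));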

Using trained classifiers in C++ applications

In version 0.3 of the toolbox you can save a constructed classifier to a file and load it in your C++ application. C++ code for loading and using the saved classifier is provided.

Authors

This toolbox was developed and implemented by Alexander Vezhnevets, who was at that time an undergraduate student at Lomonosov Moscow State University. If you have any questions or suggestions, please e-mail: alexander.vezhnevets@inf.ethz.ch

References

[1] Y. Freund and R. E. Schapire. Game theory, on-line prediction and boosting. In Proceedings of the Ninth Annual Conference on Computational Learning Theory, pages 325–332, 1996.

[2] R. E. Schapire and Y. Singer. Improved boosting algorithms using confidence-rated predictions. Machine Learning, 37(3):297–336, December 1999.

[3] J. Friedman, T. Hastie, and R. Tibshirani. Additive logistic regression: A statistical view of boosting. The Annals of Statistics, 28(2):337–374, April 2000.

[4] P. Viola and M. Jones. Robust Real-time Object Detection. In Proc. 2nd Int'l Workshop on Statistical and Computational Theories of Vision -- Modeling, Learning, Computing and Sampling, Vancouver, Canada, July 2001.

[5] A. Vezhnevets and V. Vezhnevets. 'Modest AdaBoost' - Teaching AdaBoost to Generalize Better. In Proceedings of Graphicon-2005, Novosibirsk Akademgorodok, Russia, 2005.
[6] D. J. Newman, S. Hettich, C. L. Blake, and C. J. Merz. UCI Repository of Machine Learning Databases [http://www.ics.uci.edu/~mlearn/MLRepository.html]. University of California, Irvine, Department of Information and Computer Science, 1998.
