

Poster

A Boosting-Type Convergence Result for AdaBoost.MH with Factorized Multi-Class Classifiers

Xin Zou · Zhengyu Zhou · Jingyuan Xu · Weiwei Liu

Wed 11 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract: AdaBoost is a well-known boosting algorithm. Schapire and Singer propose an extension of AdaBoost, named AdaBoost.MH, for multi-class classification problems. Kégl shows empirically that AdaBoost.MH works better when the classical one-against-all base classifiers are replaced by factorized base classifiers, each consisting of a binary classifier and a vote (or code) vector. However, the factorization makes it much more difficult to provide a convergence result for the factorized version of AdaBoost.MH. Kégl therefore raises an open problem at COLT 2014: to find a lower bound for $w_\Sigma^\prime$, which is essential for analyzing the convergence of the factorized AdaBoost.MH. In this work, we resolve this open problem by giving a lower bound for $w_\Sigma^\prime$ and present a convergence result for AdaBoost.MH with factorized multi-class classifiers.
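To make the factorized structure concrete, the sketch below illustrates (purely as an assumption-based example, not the authors' implementation) how a factorized base classifier pairs a single binary classifier with a per-class vote (code) vector, so that the score for class $\ell$ is $h(x,\ell) = v_\ell\,\varphi(x)$; the names `factorized_base_classifier`, `stump`, and the particular vote vector are hypothetical.

```python
import numpy as np

def factorized_base_classifier(phi, v):
    """Factorized multi-class base classifier (illustrative sketch).

    phi: binary classifier mapping an input x to {-1, +1}.
    v:   length-K vote (code) vector with entries in {-1, +1}.
    Returns h with h(x)[l] = v[l] * phi(x), one score per class.
    """
    def h(x):
        return v * phi(x)
    return h

# Hypothetical binary base learner: a decision stump on one feature.
def stump(x, feature=0, threshold=0.0):
    return 1.0 if x[feature] > threshold else -1.0

# Example on a 3-class problem.
v = np.array([+1.0, -1.0, -1.0])          # vote vector for 3 classes
h = factorized_base_classifier(stump, v)
print(h(np.array([0.5, -1.2])))           # -> [ 1. -1. -1.]
```

In contrast, a one-against-all base classifier would train K independent binary classifiers, one per class; the factorized form shares a single binary classifier across all classes and lets the vote vector carry the class-specific sign.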
