Poster
Margin-Based Generalization Lower Bounds for Boosted Classifiers
Allan Grønlund · Lior Kamma · Kasper Green Larsen · Alexander Mathiasen · Jelani Nelson
East Exhibition Hall B, C #8
Keywords: [ Learning Theory ] [ Theory ] [ Algorithms -> Boosting and Ensemble Methods ] [ Algorithms -> Classification ] [ Large Margin Methods ]
Abstract:
Boosting is one of the most successful ideas in machine learning.
The most widely accepted explanations for the low generalization error
of boosting algorithms such as AdaBoost stem from
margin theory. The study of margins in the context of boosting
algorithms was initiated by Schapire, Freund, Bartlett and Lee (1998),
and has inspired numerous boosting algorithms and generalization bounds.
To date, the strongest known generalization upper bound is the $k$th margin
bound of Gao and Zhou (2013).
Despite the numerous generalization upper bounds that have been proved
over the last two decades, nothing is known about the tightness of these bounds.
In this paper, we give the first margin-based lower bounds on the generalization
error of boosted classifiers. Our lower bounds nearly match the
$k$th margin bound and thus almost settle the generalization performance
of boosted classifiers in terms of margins.
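For context, the $k$th margin bound referenced above takes roughly the following shape. This is a hedged sketch, up to constants, of the statement in Gao and Zhou (2013); readers should consult that paper for the precise form. Here $S$ is a sample of $m$ labeled examples drawn i.i.d. from a distribution $D$, $\mathcal{H}$ is a finite base hypothesis class, $f$ is a voting classifier over $\mathcal{H}$, and $\theta > 0$ is a margin threshold:

$$
\Pr_{(x,y)\sim D}\big[yf(x)\le 0\big]
\;\le\;
\Pr_{(x,y)\sim S}\big[yf(x)\le \theta\big]
\;+\;
O\!\left(
\frac{\ln|\mathcal{H}|\,\ln m}{\theta^{2} m}
\;+\;
\sqrt{\Pr_{(x,y)\sim S}\big[yf(x)\le \theta\big]\cdot
\frac{\ln|\mathcal{H}|\,\ln m}{\theta^{2} m}}
\,\right).
$$

Taking $\theta$ to be the $k$th smallest margin in the sample makes the empirical term $k/m$, which is where the name "$k$th margin bound" comes from; a matching lower bound, as claimed above, must therefore exhibit distributions on which this trade-off between $k/m$ and the $\theta$-dependent terms is essentially unavoidable.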