Leo Breiman - Bagging Predictors (1996)

Created: December 9, 2017 / Updated: November 2, 2024 / Status: finished / 1 min read (~113 words)
Machine learning

  • Given a training set $D$ of size $n$, bagging generates $m$ new training sets $D_i$, each of size $n'$, by sampling from $D$ uniformly and with replacement (so the same example may appear multiple times in a $D_i$; when $n' = n$, each $D_i$ is expected to contain about $1 - 1/e \approx 63.2\%$ of the unique examples of $D$)
  • The $m$ models are fitted on the $m$ bootstrap samples and combined by averaging their outputs (for regression) or by majority voting (for classification)
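The two steps above can be sketched in plain Python. This is a minimal illustration, not Breiman's implementation: the mean-predicting base learner `fit_mean` is a hypothetical stand-in, and any real use would plug in an unstable learner such as a decision tree.

```python
import random
import statistics

def bootstrap_sample(data, n_prime=None):
    """Draw n' points from data uniformly with replacement."""
    n_prime = n_prime if n_prime is not None else len(data)
    return [random.choice(data) for _ in range(n_prime)]

def fit_mean(sample):
    """Hypothetical trivial base learner: always predicts the sample mean of y."""
    mean_y = statistics.mean(y for _, y in sample)
    return lambda x: mean_y

def bag(data, m, base_fit):
    """Fit m models on m bootstrap samples; predict by averaging (regression)."""
    models = [base_fit(bootstrap_sample(data)) for _ in range(m)]
    return lambda x: statistics.mean(model(x) for model in models)

random.seed(0)
data = [(i, 2 * i) for i in range(10)]  # toy regression set, y = 2x
predictor = bag(data, m=50, base_fit=fit_mean)
print(predictor(0))  # close to the overall mean of y, i.e. around 9
```

For classification, the averaging step would instead be replaced by a majority vote over the $m$ models' predicted labels.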

  • Bagging (bootstrap aggregating) can push a good but unstable procedure, one whose output changes substantially under small changes to the training set (e.g., decision trees), a significant step towards optimality
  • On the other hand, it can slightly degrade the performance of stable procedures such as $k$-nearest neighbors