Parallelizing AdaBoost by weights dynamics.
Merler, Stefano; Caprile, Bruno Giovanni; Furlanello, Cesare
2007-01-01
Abstract
AdaBoost is one of the most popular classification methods. In contrast to other ensemble methods (e.g., Bagging), AdaBoost is inherently sequential. In many data-intensive real-world situations this may limit the practical applicability of the method. P-AdaBoost is a novel scheme for the parallelization of AdaBoost, which builds upon earlier results concerning the dynamics of AdaBoost weights. P-AdaBoost yields approximations to the standard AdaBoost models that can be easily and efficiently distributed over a network of computing nodes. Properties of P-AdaBoost as a stochastic minimizer of the AdaBoost cost functional are discussed. Experiments are reported on both synthetic and benchmark data sets.
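The abstract's point about AdaBoost being inherently sequential comes from its weight-update loop: the example weights used in round t are produced by the classifier trained in round t-1, so rounds cannot run independently. As a hedged illustration only (a minimal sketch of standard discrete AdaBoost with decision stumps, not the paper's P-AdaBoost scheme; all function names here are illustrative):

```python
import numpy as np

def fit_stump(X, y, w):
    """Weighted decision stump: best (feature, threshold, polarity, error)."""
    best = (0, 0.0, 1, np.inf)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])  # weighted training error
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best

def stump_predict(stump, X):
    j, thr, pol, _ = stump
    return np.where(pol * (X[:, j] - thr) >= 0, 1, -1)

def adaboost(X, y, n_rounds=10):
    """Discrete AdaBoost for labels y in {-1, +1}.

    Each iteration uses the weights w produced by the previous one,
    which is exactly the sequential dependence the abstract refers to.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)  # uniform initial example weights
    model = []
    for _ in range(n_rounds):
        stump = fit_stump(X, y, w)          # depends on current w ...
        pred = stump_predict(stump, X)
        err = np.clip(stump[3], 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        # ... and exponential re-weighting updates w for the next round:
        # misclassified examples gain weight, correct ones lose it
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        model.append((alpha, stump))
    return model

def predict(model, X):
    # sign of the weighted vote F(x) = sum_t alpha_t * h_t(x)
    F = sum(a * stump_predict(s, X) for a, s in model)
    return np.sign(F)
```

P-AdaBoost, as described in the abstract, sidesteps this round-to-round dependence by approximating the evolution of the weights w rather than computing each round from the previous one.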
File | Size | Format
---|---|---
Parallelizing AdaBoost by weights dynamics.pdf (post-print document; license: DRM not defined) | 370.19 kB | Adobe PDF (not available; View/Open: request a copy)
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.