
A Combination of Methods for Building Ensemble of Classifiers / Nanni, L.; Brahnam, S.; Lumini, Alessandra. - ELECTRONIC. - (2012), pp. 1-7. (Paper presented at the 16th International Conference on Image Processing, Computer Vision, and Pattern Recognition, held in Las Vegas, Nevada, USA, 16-19 July 2012).

A Combination of Methods for Building Ensemble of Classifiers

LUMINI, ALESSANDRA
2012

Abstract

In this paper we present an extensive study of different methods for building ensembles of classifiers, examining variants of ensemble methods based on perturbing features and illustrating their power on a number of different problems. We find that the best-performing ensemble is obtained by combining a random subspace approach with a cluster-based input-decimated ensemble and the principal direction oracle. Compared with other state-of-the-art stand-alone classifiers and ensembles, this method performed consistently well across twelve diverse benchmark datasets. Another useful finding is that this approach does not require parameters to be carefully tuned for each dataset (in contrast to the fundamental importance of parameter tuning for SVMs and extreme learning machines), making our ensemble method well suited to practitioners, since there is less risk of over-training. A further interesting finding is that random subspace can be coupled with several other ensemble methods to improve performance.
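The random subspace method highlighted in the abstract can be sketched as follows: each base classifier is trained on a randomly chosen subset of the feature dimensions, and predictions are combined by majority vote. This is a minimal illustrative sketch, not the authors' implementation; the nearest-centroid base learner, function names, and parameter values are assumptions chosen for brevity, not taken from the paper.

```python
import random

def nearest_centroid_fit(X, y):
    """Compute a per-class centroid over the given feature columns."""
    centroids = {}
    for label in set(y):
        rows = [x for x, t in zip(X, y) if t == label]
        centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return centroids

def nearest_centroid_predict(centroids, x):
    """Predict the class whose centroid is closest in squared distance."""
    return min(centroids,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(centroids[c], x)))

def random_subspace_ensemble(X, y, n_estimators=10, subspace_size=2, seed=0):
    """Train each base learner on a random subset of feature indices."""
    rng = random.Random(seed)
    n_features = len(X[0])
    members = []
    for _ in range(n_estimators):
        idx = rng.sample(range(n_features), subspace_size)  # feature perturbation
        Xs = [[row[i] for i in idx] for row in X]
        members.append((idx, nearest_centroid_fit(Xs, y)))
    return members

def ensemble_predict(members, x):
    """Majority vote over the subspace-trained base learners."""
    votes = [nearest_centroid_predict(c, [x[i] for i in idx])
             for idx, c in members]
    return max(set(votes), key=votes.count)
```

Because each member sees only a slice of the feature space, the members make partly independent errors, which is what the vote exploits; the same training loop could wrap any base classifier in place of the nearest-centroid sketch.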
2012
Proceedings of the 16th International Conference on Image Processing, Computer Vision, and Pattern Recognition, pp. 1-7
Nanni, L.; Brahnam, S.; Lumini, Alessandra
Files in this product:
Any attachments are not displayed

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/133741
Warning! The displayed data have not been validated by the university.

Citations
  • PMC: not available
  • Scopus: 0
  • Web of Science: not available