Optimal resampling and classifier prototype selection in classifier ensembles using genetic algorithms

Publisher

Springer

Access Rights

info:eu-repo/semantics/closedAccess

Abstract

Ensembles of classifiers trained on different parts of the input space generally provide good results. As a popular boosting technique, AdaBoost is an iterative, gradient-based deterministic method that minimizes an exponential loss function for this purpose. Bagging is a random-search-based ensemble creation technique in which the training set of each classifier is selected at random. In this paper, a genetic algorithm based ensemble creation approach is proposed in which both the resampled training sets and the classifier prototypes evolve so as to maximize the combined accuracy. The objective-function-based random search of the resulting system, guided by both ensemble accuracy and diversity, can be considered to share the basic properties of bagging and boosting. Experimental results show that the proposed approach achieves better combined accuracies with fewer classifiers than AdaBoost.
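The core idea in the abstract can be illustrated with a minimal sketch (a hypothetical implementation, not the authors' code): a genetic algorithm evolves the resampled training set of each ensemble member so that the ensemble's majority-vote accuracy is maximized. For brevity the sketch evolves only the resampling; the paper additionally evolves the classifier prototypes. The toy data, nearest-prototype base classifier, and all parameter values below are assumptions made for illustration.

```python
import random

random.seed(0)

# Toy 1-D two-class data: class 0 clustered in [0, 1), class 1 in [1, 2).
DATA = [(x / 10.0, 0) for x in range(10)] + [(1.0 + x / 10.0, 1) for x in range(10)]

N_CLASSIFIERS = 3   # ensemble size (chosen small for the sketch)
SAMPLE_SIZE = 8     # size of each resampled training set
POP_SIZE = 12
GENERATIONS = 20

def train_prototype(sample):
    """Nearest-prototype base classifier: one prototype (class mean) per class."""
    protos = {}
    for c in (0, 1):
        xs = [x for x, y in sample if y == c]
        protos[c] = sum(xs) / len(xs) if xs else 0.5  # fallback if class absent
    return protos

def predict(protos, x):
    return min(protos, key=lambda c: abs(x - protos[c]))

def ensemble_accuracy(chrom):
    """Fitness: majority-vote accuracy of the ensemble encoded by the chromosome."""
    members = [train_prototype([DATA[i] for i in idxs]) for idxs in chrom]
    correct = 0
    for x, y in DATA:
        votes = [predict(m, x) for m in members]
        if max(set(votes), key=votes.count) == y:
            correct += 1
    return correct / len(DATA)

def random_chromosome():
    # A chromosome holds one resampled index set per ensemble member.
    return [[random.randrange(len(DATA)) for _ in range(SAMPLE_SIZE)]
            for _ in range(N_CLASSIFIERS)]

def mutate(chrom):
    # Replace one training index of one member with a random index.
    c = [list(idxs) for idxs in chrom]
    m = random.randrange(N_CLASSIFIERS)
    c[m][random.randrange(SAMPLE_SIZE)] = random.randrange(len(DATA))
    return c

def evolve():
    pop = [random_chromosome() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop.sort(key=ensemble_accuracy, reverse=True)
        elite = pop[:POP_SIZE // 2]  # truncation selection
        pop = elite + [mutate(random.choice(elite)) for _ in elite]
    return max(pop, key=ensemble_accuracy)

best = evolve()
print("best ensemble accuracy:", ensemble_accuracy(best))
```

Unlike bagging, where each training set is drawn once at random, here the resampling itself is the object of the search, and the fitness function plays the role that the loss function plays in boosting; the paper's method additionally rewards diversity among the evolved members.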

Keywords

classifier ensembles, optimal resampling, multiple prototype ensembles, diversity, boosting, bagging, genetic algorithms

Journal or Series

Pattern Analysis and Applications

Volume

7

Issue

3
