Balanced training of a hybrid ensemble method for imbalanced datasets: a case of emergency department readmission prediction
Author
dc.contributor.author
Artetxe, Arkaitz
Author
dc.contributor.author
Graña, Manuel
Author
dc.contributor.author
Beristain, Andoni
Author
dc.contributor.author
Ríos Pérez, Sebastián
Accession date
dc.date.accessioned
2020-06-02T19:25:43Z
Available date
dc.date.available
2020-06-02T19:25:43Z
Publication date
dc.date.issued
2020
Item citation
dc.identifier.citation
Neural Comput & Applic (2020) 32:5735–5744
Identifier
dc.identifier.other
10.1007/s00521-017-3242-y
Identifier
dc.identifier.uri
https://repositorio.uchile.cl/handle/2250/175142
Abstract
dc.description.abstract
Dealing with imbalanced datasets is a recurrent issue in healthcare data processing. Most of the literature works with small academic datasets, so results often fail to extrapolate to large real-life datasets or have little real-life validity. When generating minority-class samples by interpolation is meaningless, undersampling the majority class is the only recourse for reaching acceptable results. Ensembles of classifiers offer the advantage of member diversity, which can aid adaptation to the imbalanced class distribution. In this paper, we present a pipeline method combining random undersampling with bootstrap aggregation (bagging) to train a hybrid ensemble of extreme learning machines and decision trees, whose diversity improves adaptation to the imbalanced dataset. The approach is demonstrated on a realistic, highly imbalanced dataset of emergency department patients from a Chilean hospital, with the goal of predicting patient readmission. Computational experiments show that our approach outperforms other well-known classification algorithms.
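The pipeline described in the abstract (balanced bootstrap undersampling feeding a hybrid ensemble of extreme learning machines and decision trees, aggregated by voting) can be sketched roughly as below. This is a minimal illustration, not the authors' implementation: the ELM here is a bare-bones version (random hidden layer, pseudo-inverse output weights), the member counts, hidden-layer size, tree depth, and the alternation between learner types are all illustrative assumptions, and `fit_hybrid_ensemble`, `balanced_bootstrap`, and `SimpleELM` are hypothetical names introduced for this sketch.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier


class SimpleELM:
    """Minimal extreme learning machine: fixed random hidden layer,
    output weights fitted by least squares (pseudo-inverse)."""

    def __init__(self, n_hidden=30, rng=None):
        self.n_hidden = n_hidden
        self.rng = rng if rng is not None else np.random.default_rng(0)

    def fit(self, X, y):
        n_features = X.shape[1]
        # Random input-to-hidden weights are drawn once and never trained.
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)
        # Encode binary labels as {-1, +1} and solve for output weights.
        t = np.where(y == 1, 1.0, -1.0)
        self.beta = np.linalg.pinv(H) @ t
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return (H @ self.beta >= 0).astype(int)


def balanced_bootstrap(X, y, rng):
    """Bootstrap sample in which the majority class is undersampled
    down to the minority-class size, so each member trains on
    balanced data."""
    idx_min = np.flatnonzero(y == 1)
    idx_maj = np.flatnonzero(y == 0)
    if len(idx_min) > len(idx_maj):
        idx_min, idx_maj = idx_maj, idx_min
    take_min = rng.choice(idx_min, size=len(idx_min), replace=True)
    take_maj = rng.choice(idx_maj, size=len(idx_min), replace=True)
    idx = np.concatenate([take_min, take_maj])
    return X[idx], y[idx]


def fit_hybrid_ensemble(X, y, n_members=10, seed=0):
    """Train a hybrid ensemble, alternating decision trees and ELMs,
    each on its own balanced bootstrap sample."""
    rng = np.random.default_rng(seed)
    members = []
    for i in range(n_members):
        Xi, yi = balanced_bootstrap(X, y, rng)
        if i % 2 == 0:
            m = DecisionTreeClassifier(max_depth=5, random_state=seed + i)
            m.fit(Xi, yi)
        else:
            m = SimpleELM(n_hidden=30, rng=rng).fit(Xi, yi)
        members.append(m)
    return members


def predict_ensemble(members, X):
    """Aggregate member predictions by majority vote."""
    votes = np.stack([m.predict(X) for m in members])
    return (votes.mean(axis=0) >= 0.5).astype(int)
```

Because every member sees a balanced sample, the majority vote is not dominated by the majority class, while bagging plus two learner families supplies the ensemble diversity the abstract emphasizes.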