Show simple item record
Increasing diversity in Random Forest learning algorithm via imprecise probabilities
dc.contributor.author | Abellán Mulero, Joaquín | |
dc.contributor.author | Mantas Ruiz, Carlos Javier | |
dc.contributor.author | García Castellano, Francisco Javier | |
dc.contributor.author | Moral García, Serafín | |
dc.date.accessioned | 2024-02-07T10:05:50Z | |
dc.date.available | 2024-02-07T10:05:50Z | |
dc.date.issued | 2018-05-01 | |
dc.identifier.citation | Published version: Abellán, J., Mantas, C. J., Castellano, J. G., & Moral-García, S. (2018). Increasing diversity in Random Forest learning algorithm via imprecise probabilities. Expert Systems with Applications, 97, 228–243. doi: 10.1016/j.eswa.2017.12.029 | es_ES |
dc.identifier.uri | https://hdl.handle.net/10481/88516 | |
dc.description.abstract | The Random Forest (RF) learning algorithm is considered a reference classifier due to its excellent performance. Its success is based on the diversity of the rules generated from decision trees that are built via a procedure that randomizes instances and features. Finding additional procedures to increase the diversity of the trees is an interesting task. A new split criterion, based on imprecise probabilities and general uncertainty measures, has been considered; it clearly depends on a parameter and has been shown to be more successful than the classic criteria. Using that criterion in the RF scheme, joined with a random procedure to select the value of that parameter, increases both the diversity of the trees in the forest and the performance. This gives rise to a new classification algorithm, called Random Credal Random Forest (RCRF). The new method represents several improvements with respect to classic RF: the use of a more successful split criterion that is more robust to noise than the classic ones, and an increase in randomness that facilitates the diversity of the rules obtained. An experimental study shows that this new algorithm is a clear enhancement of RF, especially when applied to data sets with class noise, where standard RF deteriorates notably. The overfitting that appears when RF classifies data sets with class noise is solved with RCRF. This new algorithm can be considered a powerful alternative for data with or without class noise. | es_ES |
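The abstract refers to a split criterion built on imprecise probabilities: the class frequencies at a node induce a credal set whose maximum entropy replaces the precise entropy in the information gain. As a minimal illustration (not the authors' implementation), the sketch below computes that maximum entropy under the Imprecise Dirichlet Model (IDM) with parameter `s`, the parameter RCRF is described as selecting at random per tree; the water-filling scheme and function name are assumptions for this example.

```python
import math

def max_entropy_idm(counts, s=1.0):
    """Maximum entropy over the IDM credal set induced by class counts.

    The extra mass s is spread over the least-frequent classes (water-filling)
    so the resulting distribution is as uniform as the credal set allows;
    the Shannon entropy (in bits) of that distribution is returned.
    """
    c = sorted(float(x) for x in counts)
    n = sum(c)
    budget = s
    i, k = 0, len(c)
    while budget > 0 and i < k - 1:
        # raise the i+1 smallest counts to the level of the next one
        need = (c[i + 1] - c[i]) * (i + 1)
        if need >= budget:
            inc = budget / (i + 1)
            for j in range(i + 1):
                c[j] += inc
            budget = 0.0
        else:
            for j in range(i + 1):
                c[j] = c[i + 1]
            budget -= need
            i += 1
    if budget > 0:  # all counts equal: spread the remaining mass uniformly
        inc = budget / k
        c = [x + inc for x in c]
    total = n + s
    return -sum((x / total) * math.log2(x / total) for x in c if x > 0)
```

For counts `[3, 1]` with `s = 1`, the mass goes entirely to the minority class, giving the distribution `(0.4, 0.6)` and an entropy of about 0.971 bits; larger values of `s` push the node distribution toward uniformity, which is one way the parameter modulates the trees' diversity.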
dc.description.sponsorship | This work has been supported by the Spanish “Ministerio de Economía y Competitividad” and by “Fondo Europeo de Desarrollo Regional” (FEDER) under Project TEC2015-69496-R. | es_ES |
dc.language.iso | eng | es_ES |
dc.publisher | Elsevier | es_ES |
dc.subject | Classification | es_ES |
dc.subject | Ensemble schemes | es_ES |
dc.subject | Random forest | es_ES |
dc.subject | Imprecise probabilities | es_ES |
dc.subject | Uncertainty measures | es_ES |
dc.title | Increasing diversity in Random Forest learning algorithm via imprecise probabilities | es_ES |
dc.type | info:eu-repo/semantics/article | es_ES |
dc.rights.accessRights | info:eu-repo/semantics/openAccess | es_ES |
dc.identifier.doi | 10.1016/j.eswa.2017.12.029 | |
dc.type.hasVersion | info:eu-repo/semantics/submittedVersion | es_ES |