Robust optimal classification trees under noisy labels
Metadata
Publisher
Springer
Subject
Multiclass classification; Optimal classification trees; Support vector machines; Mixed integer non linear programming; Classification; Hyperplanes
Date
2021-10-05
Citation
Blanco, V., Japón, A. & Puerto, J. Robust optimal classification trees under noisy labels. Adv Data Anal Classif (2021). https://doi.org/10.1007/s11634-021-00467-2
Funding
Spanish Ministerio de Ciencia e Innovación, Agencia Estatal de Investigación/FEDER PID2020-114594GBC21; Junta de Andalucía P18-FR-1422, P18-FR-2369; NetmeetData - Ayudas Fundación BBVA a equipos de investigación científica 2019; IMAG-María de Maeztu CEX2020-001105-M/AEI/10.13039/501100011033; FEDERUS-1256951; BFQM-322-UGR20; CEI-3-FQM331
Abstract
In this paper we propose a novel methodology to construct Optimal Classification Trees that takes into account that noisy labels may occur in the training sample. The motivation for this new methodology is the superadditive effect of combining margin-based classifiers and outlier detection techniques. Our approach rests on two main elements: (1) the splitting rules of the classification tree are designed to maximize the separation margin between classes, following the SVM paradigm; and (2) some labels of the training sample are allowed to change during the construction of the tree, in order to detect label noise. Both features are integrated to design the resulting Optimal Classification Tree. We present a Mixed Integer Non-Linear Programming formulation for the problem that can be solved with off-the-shelf solvers. The model is analyzed and tested on a battery of standard datasets from the UCI Machine Learning Repository, showing the effectiveness of our approach. Our computational results show that, in most cases, the new methodology outperforms the OCT and OCT-H benchmarks in both accuracy and AUC.
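To make the two elements of the abstract concrete, the following is a minimal sketch of the kind of single-split subproblem such a method gives rise to; it is not the paper's exact formulation. A soft-margin SVM hyperplane is fitted while a binary variable xi_i may flip the label of training point i at a cost in the objective; the bilinear label-flip terms are linearized with a big-M constant, yielding a mixed-integer quadratic program. The function name robust_svm_split and the parameters C, flip_cost, and M are illustrative assumptions, not identifiers from the paper.

import cvxpy as cp
import numpy as np

def robust_svm_split(X, y, C=1.0, flip_cost=2.0, M=1e3):
    """Hedged sketch of one margin-based split with label flips.

    X: (n, d) feature matrix; y: (n,) labels in {-1, +1}.
    Returns the hyperplane (w, b) and the flip indicators xi.
    """
    n, d = X.shape
    w = cp.Variable(d)                 # hyperplane normal
    b = cp.Variable()                  # intercept
    e = cp.Variable(n, nonneg=True)    # margin violations (slacks)
    xi = cp.Variable(n, boolean=True)  # xi_i = 1 <=> label i is flipped
    margins = cp.multiply(y, X @ w + b)
    constraints = [
        # If xi_i = 0, the usual soft-margin constraint is enforced ...
        margins >= 1 - e - M * xi,
        # ... if xi_i = 1, the constraint with the flipped label holds instead.
        -margins >= 1 - e - M * (1 - xi),
    ]
    objective = cp.Minimize(
        0.5 * cp.sum_squares(w)        # maximize the separation margin
        + C * cp.sum(e)                # penalize margin violations
        + flip_cost * cp.sum(xi)       # penalize relabeling training points
    )
    prob = cp.Problem(objective, constraints)
    prob.solve(solver=cp.GUROBI)       # any MIQP-capable solver works here
    return w.value, b.value, xi.value

In the actual tree construction these node subproblems are coupled across all splits inside a single Mixed Integer Non-Linear Program; the sketch above only illustrates the interplay between margin maximization and relabeling at one node.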