dc.contributor.author: Schmidt, Arne
dc.contributor.author: Morales Álvarez, Pablo
dc.contributor.author: Cooper, L.A.
dc.contributor.author: Newberg, L.A.
dc.contributor.author: Enquobahrie, A.
dc.contributor.author: Molina Soriano, Rafael
dc.contributor.author: Katsaggelos, Aggelos
dc.date.accessioned: 2025-01-15T08:06:37Z
dc.date.available: 2025-01-15T08:06:37Z
dc.date.issued: 2024
dc.identifier.citation: Schmidt, A., Morales-Álvarez, P., Cooper, L.A., Newberg, L.A., Enquobahrie, A., Molina, R., Katsaggelos, A.K., Focused active learning for histopathological image classification, Medical Image Analysis, volume 95, 2024
dc.identifier.uri: https://hdl.handle.net/10481/99172
dc.description.abstract: Active Learning (AL) has the potential to solve a major problem of digital pathology: the efficient acquisition of labeled data for machine learning algorithms. However, existing AL methods often struggle in realistic settings with artifacts, ambiguities, and class imbalances, as commonly seen in the medical field. The lack of precise uncertainty estimations leads to the acquisition of images with a low informative value. To address these challenges, we propose Focused Active Learning (FocAL), which combines a Bayesian Neural Network with Out-of-Distribution detection to estimate different uncertainties for the acquisition function. Specifically, the weighted epistemic uncertainty accounts for the class imbalance, aleatoric uncertainty for ambiguous images, and an OoD score for artifacts. We perform extensive experiments to validate our method on MNIST and the real-world Panda dataset for the classification of prostate cancer. The results confirm that other AL methods are 'distracted' by ambiguities and artifacts which harm the performance. FocAL effectively focuses on the most informative images, avoiding ambiguities and artifacts during acquisition. For both experiments, FocAL outperforms existing AL approaches, reaching a Cohen's kappa of 0.764 with only 0.69% of the labeled Panda data.
dc.description.sponsorship: This work was supported by the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 860627 (CLARIFY Project), the US National Institutes of Health National Library of Medicine grant R01LM013523, and the project PID2022-140189OB-C22 funded by MCIN/AEI/10.13039/501100011033. PMA acknowledges grant C-EXP-153-UGR23 funded by Consejería de Universidad, Investigación e Innovación and by the ERDF Andalusia Program.
dc.language.iso: eng
dc.publisher: Elsevier
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Active Learning
dc.subject: Cancer Classification
dc.subject: Histological Images
dc.subject: Bayesian Deep Learning
dc.title: Focused active learning for histopathological image classification
dc.type: journal article
dc.rights.accessRights: open access
dc.identifier.doi: 10.1016/J.MEDIA.2024.103162
dc.type.hasVersion: AM
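The abstract above describes an acquisition function that combines a class-weighted epistemic uncertainty (to favor rare classes), an aleatoric term (to avoid ambiguous images), and an OoD score (to avoid artifacts). The following is a minimal illustrative sketch of that idea, not the authors' implementation: it assumes Monte Carlo samples from a Bayesian network, uses the standard entropy decomposition (epistemic = total entropy minus expected entropy), and the function name, `lambda_ood` weight, and linear combination are hypothetical.

```python
import numpy as np

def focal_acquisition_scores(probs, ood_scores, class_weights, lambda_ood=1.0):
    """Hypothetical FocAL-style acquisition score (sketch, not the paper's code).

    probs:         (T, N, C) softmax outputs from T stochastic forward passes
                   (e.g. MC dropout) of a Bayesian network over N images.
    ood_scores:    (N,) higher means more artifact/OoD-like.
    class_weights: (C,) upweights under-represented classes.
    Returns (N,) scores; higher = more informative to label next.
    """
    eps = 1e-12
    mean_p = probs.mean(axis=0)  # (N, C) predictive mean over samples
    # Total predictive entropy of the mean prediction
    total = -np.sum(mean_p * np.log(mean_p + eps), axis=1)
    # Aleatoric part: expected entropy of the individual samples
    aleatoric = -np.mean(np.sum(probs * np.log(probs + eps), axis=2), axis=0)
    # Epistemic part: mutual information between prediction and model weights
    epistemic = total - aleatoric
    # Weight epistemic uncertainty toward rare classes via the predictive mean
    w = mean_p @ class_weights  # (N,) per-image class weight
    # Reward epistemic uncertainty; penalize ambiguity and artifacts
    return w * epistemic - aleatoric - lambda_ood * ood_scores
```

Under this scoring, an image the model is *inconsistently* confident about (high epistemic) ranks above one that is confidently predicted or intrinsically ambiguous, and a high OoD score pushes artifact-like images to the bottom of the acquisition ranking.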


Files in this item

[PDF]

This item appears in the following collection(s)
