Show simple item record

dc.contributor.author         Arco Martín, Juan Eloy
dc.contributor.author         Ortiz, Andrés
dc.contributor.author         Gallego Molina
dc.contributor.author         Gorriz Sáez, Juan Manuel
dc.contributor.author         Ramírez Pérez De Inestrosa, Javier
dc.date.accessioned           2023-11-21T13:23:24Z
dc.date.available             2023-11-21T13:23:24Z
dc.date.issued                2022
dc.identifier.citation        Published version: Vol. 33, No. 04, 2350019 (2023) [10.1142/S0129065723500193]    es_ES
dc.identifier.uri             https://hdl.handle.net/10481/85816
dc.description.abstract       The combination of different sources of information is currently one of the most relevant aspects in the diagnostic process of several diseases. In the field of neurological disorders, different imaging modalities providing structural and functional information are frequently available. Those modalities are usually analyzed separately, although a joint analysis of the features extracted from both sources can improve the classification performance of Computer-Aided Diagnosis (CAD) tools. Previous studies have computed independent models from each individual modality and combined them in a subsequent stage, which is not an optimum solution. In this work, we propose a method based on the principles of siamese neural networks to fuse information from Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET). This framework quantifies the similarities between both modalities and relates them to the diagnostic label during the training process. The resulting latent space at the output of this network is then fed into an attention module in order to evaluate the relevance of each brain region at different stages of the development of Alzheimer's disease. The excellent results obtained and the high flexibility of the proposed method make it possible to fuse more than two modalities, leading to a scalable methodology that can be used in a wide range of contexts. (See the illustrative sketch after this record.)    es_ES
dc.description.sponsorship    Projects PGC2018-098813-B-C32 and RTI2018-098913-B100 (Spanish “Ministerio de Ciencia, Innovación y Universidades”)    es_ES
dc.description.sponsorship    UMA20-FEDERJA-086, A-TIC-080-UGR18 and P20 00525 (Consejería de Economía y Conocimiento, Junta de Andalucía)    es_ES
dc.description.sponsorship    European Regional Development Funds (ERDF)    es_ES
dc.description.sponsorship    Spanish “Ministerio de Universidades” through a Margarita Salas grant    es_ES
dc.language.iso               eng    es_ES
dc.publisher                  World Scientific Publishing Company    es_ES
dc.subject                    Multimodal combination    es_ES
dc.subject                    Deep learning    es_ES
dc.subject                    Medical imaging    es_ES
dc.subject                    Self-attention    es_ES
dc.subject                    Siamese neural network    es_ES
dc.title                      Enhancing multimodal patterns in neuroimaging by siamese neural networks with self-attention mechanism    es_ES
dc.type                       journal article    es_ES
dc.rights.accessRights        embargoed access    es_ES
dc.identifier.doi             10.1142/S0129065723500193
dc.type.hasVersion            AM    es_ES
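
The abstract above describes a weight-shared (siamese) encoder that maps MRI and PET features into a common latent space, a similarity term related to the diagnostic label, and a self-attention module over the fused representation. The sketch below is a minimal, hypothetical PyTorch illustration of that general idea only; the layer sizes, the contrastive-style loss, the two-token attention layout and all names (SiameseEncoder, MultimodalFusion, similarity_loss) are assumptions made for illustration, not the architecture or training objective published in the paper.

# Minimal, hypothetical sketch of the idea described in the abstract: a weight-shared
# (siamese) encoder for MRI and PET features, a label-driven similarity term, and
# self-attention over the fused latent space. All sizes, names and the loss are
# illustrative assumptions, not the published method.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SiameseEncoder(nn.Module):
    """One weight-shared branch applied to both the MRI and the PET feature vectors."""
    def __init__(self, n_regions: int, latent_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_regions, 128),
            nn.ReLU(),
            nn.Linear(128, latent_dim),
        )

    def forward(self, x):
        return self.net(x)


class MultimodalFusion(nn.Module):
    """Siamese encoding of two modalities followed by self-attention and a classifier."""
    def __init__(self, n_regions: int, latent_dim: int = 64, n_classes: int = 2):
        super().__init__()
        self.encoder = SiameseEncoder(n_regions, latent_dim)   # shared weights = siamese
        self.attention = nn.MultiheadAttention(latent_dim, num_heads=4, batch_first=True)
        self.classifier = nn.Linear(latent_dim, n_classes)

    def forward(self, mri, pet):
        z_mri = self.encoder(mri)                               # (batch, latent_dim)
        z_pet = self.encoder(pet)                               # (batch, latent_dim)
        # Treat the two modality embeddings as a 2-token sequence and let
        # self-attention weight the contribution of each modality.
        tokens = torch.stack([z_mri, z_pet], dim=1)             # (batch, 2, latent_dim)
        fused, attn_weights = self.attention(tokens, tokens, tokens)
        logits = self.classifier(fused.mean(dim=1))
        return logits, z_mri, z_pet, attn_weights


def similarity_loss(z_mri, z_pet, pull_together, margin: float = 1.0):
    """Contrastive-style term: pull the two modality embeddings together when the
    indicator is 1, push them at least `margin` apart when it is 0."""
    d = F.pairwise_distance(z_mri, z_pet)
    return torch.mean(pull_together * d.pow(2) +
                      (1.0 - pull_together) * F.relu(margin - d).pow(2))


if __name__ == "__main__":
    model = MultimodalFusion(n_regions=116)      # e.g. one feature per atlas region (assumed)
    mri = torch.randn(8, 116)
    pet = torch.randn(8, 116)
    labels = torch.randint(0, 2, (8,))
    logits, z_mri, z_pet, _ = model(mri, pet)
    # Toy rule relating the similarity term to the diagnostic label; the paper's actual
    # pairing strategy and objective are not reproduced here.
    loss = F.cross_entropy(logits, labels) + similarity_loss(z_mri, z_pet, (labels == 0).float())
    loss.backward()

Sharing a single encoder across both modalities is what makes the branches siamese, and the two-token attention step is one simple way to let the network weight each modality before classification; additional modalities could be appended as extra tokens, which is where the scalability mentioned in the abstract would come from.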


Files in this item

[PDF]
