Show simple item record

dc.contributor.author  Gargiulo, Francesco
dc.contributor.author  Fujita, Hamido
dc.date.accessioned  2022-09-06T11:22:38Z
dc.date.available  2022-09-06T11:22:38Z
dc.date.issued  2022-07-12
dc.identifier.citation  F. Gargiulo... [et al.], "An ELECTRA-Based Model for Neural Coreference Resolution," in IEEE Access, vol. 10, pp. 75144-75157, 2022, doi: 10.1109/ACCESS.2022.3189956  es_ES
dc.identifier.uri  http://hdl.handle.net/10481/76549
dc.description.abstract  In recent years, coreference resolution has received a considerable performance boost by exploiting different pre-trained neural language models, from BERT to SpanBERT to Longformer. This work aims to assess, for the first time, the impact of the ELECTRA model on this task, motivated by experimental evidence of an improved contextual representation and better performance on different downstream tasks. In particular, ELECTRA has been employed as the representation layer in an established neural coreference architecture able to determine entity mentions among spans of text and to best cluster them. The architecture itself has been optimized: i) by simplifying the representation of spans of text while still considering both the context in which they appear and their entire content; ii) by maximizing both the number and length of input textual segments to better exploit the improved contextual representation power of ELECTRA; iii) by maximizing the number of spans of text to be processed, since they potentially represent mentions, while preserving computational efficiency. Experimental results on the OntoNotes dataset have shown the effectiveness of this solution from both a quantitative and qualitative perspective, also with respect to other state-of-the-art models, thanks to a more proficient token and span representation. The results also hint at the possible use of this solution for low-resource languages, simply requiring a pre-trained version of ELECTRA instead of language-specific models trained to handle either spans of text or long documents.  es_ES
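The span-oriented steps the abstract lists can be pictured with a minimal sketch. This is an illustration of candidate-span enumeration and pruning in general (the kind of step point iii refers to), not code from the paper; the function names, the fixed span width, and the toy scoring rule are all hypothetical.

```python
# Illustrative sketch (not from the paper): enumerate all candidate text
# spans up to a maximum width, then keep only a top-scoring fraction so the
# number of spans considered as potential mentions stays computationally
# tractable. Names and the toy score are hypothetical.

def enumerate_spans(num_tokens, max_width):
    """All (start, end) token spans up to max_width tokens, end inclusive."""
    return [(i, j)
            for i in range(num_tokens)
            for j in range(i, min(i + max_width, num_tokens))]

def prune_spans(spans, scores, keep_ratio=0.4):
    """Keep the top-scoring fraction of spans, a common efficiency device."""
    k = max(1, int(len(spans) * keep_ratio))
    ranked = sorted(zip(spans, scores), key=lambda p: p[1], reverse=True)
    return [span for span, _ in ranked[:k]]

spans = enumerate_spans(num_tokens=10, max_width=3)
scores = [-(end - start) for (start, end) in spans]  # toy score: shorter spans rank higher
kept = prune_spans(spans, scores, keep_ratio=0.3)
```

In the actual architecture, the score would come from a learned mention scorer over ELECTRA-based span representations rather than a hand-written rule; the enumerate-then-prune shape is what keeps "maximizing the number of spans" compatible with "preserving computational efficiency".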
dc.language.iso  eng  es_ES
dc.publisher  IEEE  es_ES
dc.rights  Atribución 4.0 Internacional  *
dc.rights.uri  http://creativecommons.org/licenses/by/4.0/  *
dc.subject  Coreference resolution  es_ES
dc.subject  ELECTRA  es_ES
dc.subject  Neural language model  es_ES
dc.subject  OntoNotes  es_ES
dc.subject  Natural language processing  es_ES
dc.title  An ELECTRA-Based Model for Neural Coreference Resolution  es_ES
dc.type  info:eu-repo/semantics/article  es_ES
dc.rights.accessRights  info:eu-repo/semantics/openAccess  es_ES
dc.identifier.doi  10.1109/ACCESS.2022.3189956
dc.type.hasVersion  info:eu-repo/semantics/publishedVersion  es_ES


Files in this item

[PDF]

This item appears in the following collection(s)


Attribution 4.0 International
Except where otherwise noted, this item's license is described as Attribution 4.0 International