| dc.contributor.author | Balderas Ruíz, Luis | |
| dc.contributor.author | Lastra Leidinger, Miguel | |
| dc.contributor.author | Benítez Sánchez, José Manuel | |
| dc.date.accessioned | 2025-01-07T11:03:14Z | |
| dc.date.available | 2025-01-07T11:03:14Z | |
| dc.date.issued | 2025-01-03 | |
| dc.identifier.citation | Balderas Ruíz, L.; Lastra Leidinger, M.; Benítez Sánchez, J.M. Appl. Sci. 2025, 15, 390 [https://doi.org/10.3390/app15010390] | es_ES |
| dc.identifier.uri | https://hdl.handle.net/10481/98483 | |
| dc.description.abstract | Large Language Models (LLMs) like BERT have gained significant prominence
due to their remarkable performance in various natural language processing tasks. However,
they come with substantial computational and memory costs. Additionally, they are
essentially black-box models, making them challenging to explain and interpret. This article proposes
Persistent BERT Compression and Explainability (PBCE), a Green AI methodology
for pruning BERT models using persistent homology, which measures the importance of
each neuron by studying the topological characteristics of its outputs. As a result, PBCE
can compress BERT significantly, reducing the number of parameters to 47% of the original
for BERT Base and 42% for BERT Large. The proposed methodology has been
evaluated on the standard GLUE Benchmark, and its results compare favorably with those of state-of-the-art
techniques. Consequently, PBCE can simplify the BERT
model by providing explainability for its neurons and reducing the model’s size, making it
more suitable for deployment on resource-constrained devices. | es_ES |
| dc.description.sponsorship | Project with reference PID2020-118224RB-100,
funded by MICIU/AEI/10.13039/501100011033 | es_ES |
| dc.description.sponsorship | Project PID2023-151336OB-I00 granted by the Spanish
Ministerio de Ciencia, Innovación y Universidades | es_ES |
| dc.language.iso | eng | es_ES |
| dc.publisher | MDPI | es_ES |
| dc.rights | Attribution 4.0 International | * |
| dc.rights.uri | http://creativecommons.org/licenses/by/4.0/ | * |
| dc.subject | BERT compression | es_ES |
| dc.subject | Green AI | es_ES |
| dc.subject | persistent homology | es_ES |
| dc.title | A Green AI Methodology Based on Persistent Homology for Compressing BERT | es_ES |
| dc.type | journal article | es_ES |
| dc.rights.accessRights | open access | es_ES |
| dc.identifier.doi | 10.3390/app15010390 | |
| dc.type.hasVersion | VoR | es_ES |