Show simple item record

dc.contributor.author: Bolaños Martinez, Daniel
dc.contributor.author: Durán López, Alberto
dc.contributor.author: Garrido Bullejos, José Luis
dc.contributor.author: Delgado Márquez, Blanca Luisa
dc.contributor.author: Bermúdez Edo, María del Campo
dc.date.accessioned: 2025-01-29T06:49:58Z
dc.date.available: 2025-01-29T06:49:58Z
dc.date.issued: 2025-05-01
dc.identifier.citation: Bolaños-Martinez, D., Durán-López, A., Garrido, J. L., Delgado-Márquez, B., & Bermudez-Edo, M. (2025). SASD: Self-Attention for Small Datasets—A case study in smart villages. Expert Systems with Applications, 126245.
dc.identifier.uri: https://hdl.handle.net/10481/100835
dc.description.abstract: Understanding repeat visitation patterns in tourism is important for optimizing economic benefits, as loyal visitors significantly contribute to the stability and growth of destinations. However, this area remains underexplored, especially in smart villages, where data limitations challenge traditional machine learning (ML) approaches. Although neural networks (NN) have proven effective in various research fields, they struggle with small datasets. We propose an ML application for tracing repeat visitors using an NN suitable for small datasets. Specifically, we designed SASD (Self-Attention for Small Datasets), a deep learning architecture that incorporates self-attention layers to address data limitations. We applied SASD to predict tourists' visit intentionality in the next 12 months in a smart village region, using as training data information from License Plate Recognition sensors and questionnaires. We evaluated its performance against various ML algorithms: Decision Trees, Random Forests, K-NN, Logistic Regression, Gradient Boosting, Naive Bayes, SVM, MLP, RNN, LSTM, TabNet, and TabTransformer. Our results demonstrate that SASD achieves greater accuracy, recall, precision, and F1-score; specifically, it outperforms the other models by up to 3% on the weighted average F1-score. Our results also confirm that, in NNs, incorporating self-attention layers accelerates convergence and reduces processing time by 32%. The best results are achieved with two self-attention layers placed at the beginning and end of the NN. Our results provide insights for policymakers, business managers, local communities, and environmental organizations, enabling informed decisions and optimal resource allocation for tourism development.
dc.language.iso: eng
dc.publisher: Pergamon Elsevier
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Self-attention
dc.subject: Deep learning
dc.subject: Internet of Things
dc.subject: Tourism development
dc.subject: Repeat tourism
dc.subject: Sensors
dc.title: SASD: Self-Attention for Small Datasets—A case study in smart villages
dc.type: journal article
dc.rights.accessRights: embargoed access
dc.identifier.doi: https://doi.org/10.1016/j.eswa.2024.126245
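The abstract describes SASD's central design choice: self-attention layers placed at the beginning and end of a neural network trained on a small tabular dataset. The paper itself is under embargo, so the following is only a minimal, hypothetical sketch of that idea in PyTorch — each scalar feature is embedded as a token so attention can relate features to one another; all layer names, dimensions, and the intermediate MLP are invented for illustration and are not the authors' implementation.

```python
import torch
import torch.nn as nn


class SelfAttentionBlock(nn.Module):
    """Single-head self-attention over the feature 'tokens' of a tabular row."""

    def __init__(self, embed_dim: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim, num_heads=1, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)

    def forward(self, x):
        out, _ = self.attn(x, x, x)   # queries = keys = values = x
        return self.norm(x + out)     # residual connection + layer norm


class SASDSketch(nn.Module):
    """Hypothetical SASD-style net: attention at the start AND end of an MLP,
    mirroring the abstract's 'two self-attention layers placed at the
    beginning and end of the NN'."""

    def __init__(self, n_features: int, embed_dim: int = 16, n_classes: int = 2):
        super().__init__()
        # Embed each scalar feature into a vector so attention has tokens to mix.
        self.embed = nn.Linear(1, embed_dim)
        self.attn_in = SelfAttentionBlock(embed_dim)
        self.mlp = nn.Sequential(nn.Linear(embed_dim, embed_dim), nn.ReLU())
        self.attn_out = SelfAttentionBlock(embed_dim)
        self.head = nn.Linear(n_features * embed_dim, n_classes)

    def forward(self, x):                 # x: (batch, n_features)
        t = self.embed(x.unsqueeze(-1))   # (batch, n_features, embed_dim)
        t = self.attn_in(t)               # attention at the beginning
        t = self.mlp(t)
        t = self.attn_out(t)              # attention at the end
        return self.head(t.flatten(1))    # class logits


model = SASDSketch(n_features=8)
logits = model(torch.randn(4, 8))         # 4 visitors, 8 sensor/survey features
print(logits.shape)                       # torch.Size([4, 2])
```

The binary head matches the prediction task in the abstract (intention to revisit within 12 months or not); the actual feature set, depth, and training setup would come from the paper.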


File(s) in this item

[PDF]

This item appears in the following collection(s)


Attribution-NonCommercial-NoDerivatives 4.0 International
Except where otherwise noted, this item's license is described as Attribution-NonCommercial-NoDerivatives 4.0 International