Title: Decoding the Mind: Neural Differences and Semantic Representation in Perception and Imagination Across Modalities

Authors: Khanday, Owais Mujtaba; Ouellet, Marc; Pérez Córdoba, José Luis; Sbaih, Asma Hasan; Miccoli, Laura; González López, José Andrés

Funding: This work was supported by grant PID2022-141378OB-C22 funded by MICIU/AEI/10.13039/501100011033 and by ERDF/EU.

Abstract: This study analyzes neural signals associated with perception and imagination of concepts, with the aim of enhancing communication capabilities for individuals with speech impairments. The investigation uses publicly available electroencephalography (EEG) data acquired with a 124-channel ANT Neuro eego Mylab EEG system (ANT Neuro B.V., Hengelo, Netherlands). The dataset comprises 11,554 trials from 12 participants. The proposed convolutional neural network (CNN) model outperformed the alternatives in classifying EEG data as belonging to the perception or the imagined speech task condition, achieving a test accuracy of 77.89%. Traditional machine learning models, including Random Forest (RF), Support Vector Classifier (SVC), and XGBoost, tended to overfit, resulting in low accuracies. As for semantic decoding, the different models unfortunately performed at chance level.

Deposited: 2024-11-11T10:31:47Z
Issued: 2024-11-11
Type: conference output

Citation: Khanday, O.M., Ouellet, M., Pérez-Córdoba, J.L., Sbaih, A.H., Miccoli, L., Gonzalez-Lopez, J.A. (2024) Decoding the Mind: Neural Differences and Semantic Representation in Perception and Imagination Across Modalities. Proc. IberSPEECH 2024, 126-130, doi: 10.21437/IberSPEECH.2024-26

URI: https://hdl.handle.net/10481/96820
DOI: 10.21437/IberSPEECH.2024-26
Language: eng
License: Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License (http://creativecommons.org/licenses/by-nc-nd/3.0/)
Access: open access
Publisher: International Speech Communication Association (ISCA)
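The abstract describes a CNN that classifies 124-channel EEG trials into perception vs. imagined-speech conditions. The paper's actual architecture is not given in this record, so the following is only an illustrative sketch of how such a model's forward pass might be shaped: a temporal convolution across channels, ReLU, global average pooling, and a two-class dense layer. All layer sizes (number of filters, kernel length, samples per trial) are assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical forward pass of a small CNN for binary EEG classification
# (perception vs. imagined speech). Only the channel count (124) comes
# from the record; every other size below is an illustrative assumption.

rng = np.random.default_rng(0)

N_CHANNELS = 124   # matches the 124-channel ANT Neuro eego Mylab system
N_SAMPLES = 256    # assumed number of time samples per trial
N_FILTERS = 8      # assumed number of temporal convolution filters
KERNEL = 16        # assumed temporal kernel length


def temporal_conv(trial, weights):
    """Valid 1-D convolution: each filter spans all channels and KERNEL samples."""
    n_out = trial.shape[1] - KERNEL + 1
    out = np.zeros((N_FILTERS, n_out))
    for f in range(N_FILTERS):
        for t in range(n_out):
            out[f, t] = np.sum(trial[:, t:t + KERNEL] * weights[f])
    return np.maximum(out, 0.0)  # ReLU nonlinearity


def forward(trial, conv_w, fc_w):
    feats = temporal_conv(trial, conv_w)   # (N_FILTERS, time)
    pooled = feats.mean(axis=1)            # global average pooling over time
    return fc_w @ pooled                   # dense layer -> 2 class logits


# One synthetic trial with random weights, just to exercise the shapes.
trial = rng.standard_normal((N_CHANNELS, N_SAMPLES))
conv_w = rng.standard_normal((N_FILTERS, N_CHANNELS, KERNEL)) * 0.01
fc_w = rng.standard_normal((2, N_FILTERS))
logits = forward(trial, conv_w, fc_w)
print(logits.shape)  # one logit per class: perception vs. imagined speech
```

In practice such a model would be trained with a cross-entropy loss over the 11,554 labeled trials; this sketch only demonstrates the data flow from a (channels × samples) trial to two class logits.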