Embedded machine-readable molecular representation for resource-efficient deep learning applications
Metadata
Author
Nuñez Andrade, Emilio; Vidal-Daza, Isaac; Ryan, James W.; Gómez-Bombarelli, Rafael; Martin Martinez, Francisco J.
Publisher
Royal Society of Chemistry
Date
2025-02-03
Bibliographic reference
Nuñez Andrade, Emilio; Vidal-Daza, Isaac; Ryan, James W.; Gómez-Bombarelli, Rafael; Martin Martinez, Francisco J. Embedded machine-readable molecular representation for resource-efficient deep learning applications. Digital Discovery, 2025, Advance Article
Funding
Engineering and Physical Sciences Research Council (EPSRC) Program Grant EP/T028513/1 "Application Targeted and Integrated Photovoltaics"; Royal Society of Chemistry (RSC) Enablement Grant (E21-7051491439); RSC Enablement Grant (E21-8254227705); Google Cloud Research Credits program, award GCP19980904; EPSRC PhD scholarship ref. 2602452; Consejo Nacional de Humanidades, Ciencias y Tecnologías (CONAHCYT) PhD scholarship ref. 809702.
Abstract
The practical implementation of deep learning methods for chemistry applications relies on encoding chemical structures into machine-readable formats that can be efficiently processed by computational tools. To this end, One Hot Encoding (OHE) is an established representation of alphanumeric categorical data in expanded numerical matrices. We have developed an embedded alternative to OHE that encodes discrete alphanumeric tokens of an N-sized alphabet into a few real numbers, yielding a more compact matrix representation of chemical structures. Training machine learning models with this embedded One Hot Encoding (eOHE) achieves accuracy and robustness comparable to OHE while significantly reducing the use of computational resources. Our benchmarks across three molecular representations (SMILES, DeepSMILES, and SELFIES) and three molecular databases (ZINC, QM9, and GDB-13) for Variational Autoencoders (VAEs) and Recurrent Neural Networks (RNNs) show that eOHE reduces vRAM usage by up to 50% while increasing disk Memory Reduction Efficiency (MRE) to 80% on average. This encoding method opens up new avenues for data representation in embedded formats that promote energy efficiency and scalable computing on resource-constrained devices or in scenarios with limited computing resources. eOHE is applicable not only in chemistry but in any discipline that relies on OHE.
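To make the contrast in the abstract concrete, the sketch below compares standard OHE of a tokenized SMILES string with a simple compact encoding that maps each token to a single real number. This is an illustrative stand-in only: the toy alphabet, the `index`-based mapping, and the normalization to [0, 1] are assumptions for demonstration, not the paper's actual eOHE scheme.

```python
import numpy as np

# Toy SMILES-like alphabet (an assumption; real token sets are larger).
alphabet = ["C", "O", "N", "(", ")", "=", "1"]
N = len(alphabet)
index = {tok: i for i, tok in enumerate(alphabet)}

def one_hot_encode(tokens):
    """Standard OHE: each token becomes an N-wide row, giving a
    len(tokens) x N matrix that grows with the alphabet size."""
    mat = np.zeros((len(tokens), N))
    for row, tok in enumerate(tokens):
        mat[row, index[tok]] = 1.0
    return mat

def compact_encode(tokens):
    """Illustrative compact encoding: each token collapses to one real
    number in [0, 1], so the whole string is a length-len(tokens)
    vector. (A stand-in for eOHE; the paper's mapping may differ.)"""
    return np.array([index[tok] / (N - 1) for tok in tokens])

tokens = list("C(=O)N")           # a tokenized acetamide-like fragment
ohe = one_hot_encode(tokens)      # shape (6, 7): 6 tokens x 7 symbols
compact = compact_encode(tokens)  # shape (6,): one float per token
print(ohe.shape, compact.shape)
```

The memory argument follows directly: the one-hot matrix stores `len(tokens) * N` values, almost all zeros, while the compact form stores only `len(tokens)` values, which is the kind of reduction the abstract's vRAM and disk figures quantify.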