Study of Complex Dynamical Neural Networks and its application to Brain Development and Emergent Synchronization Phenomena
Metadata
Author
Millán Vidal, Ana Paula
Publisher
Universidad de Granada
Advisor
Torres Agudo, Joaquín J.
Department
Universidad de Granada. Programa de Doctorado en Física y Matemáticas
Subject
Neural networks; Theoretical physics; Neurosciences; Mathematics
Date
2019
Defense date
2019-09-16
Bibliographic reference
Millán Vidal, Ana Paula. Study of Complex Dynamical Neural Networks and its application to Brain Development and Emergent Synchronization Phenomena. Granada: Universidad de Granada, 2019. [http://hdl.handle.net/10481/57271]
Sponsorship
Thesis, Universidad de Granada; Spanish Ministry of Science and Technology and the “Agencia Española de Investigación (AEI)” under grant FIS2017-84256-P (FEDER funds); “Obra Social La Caixa” (ID 100010434 with code LCF/BQ/ES15/10360004)
Abstract
In the first part of the thesis, we study brain development and in particular the process of synaptic pruning. A fundamental question in neuroscience is why brain development proceeds via severe synaptic pruning – that is, an initial overgrowth of synapses followed by the atrophy of approximately half of them throughout infancy. It is clear that fewer synapses require less metabolic energy, but why not start with the optimal synaptic density? In this thesis we present an adaptive neural network model which shows that the memory performance of the system does indeed depend on whether it passed through a transient period of relatively high synaptic density. Furthermore, the model provides a simple demonstration of how network structure can be optimized by pruning with a rule that depends only on local information at each synapse – the intensity of the electrical current through it – a rule consistent with empirical results on synaptic growth and death. In this view, a neural network begins life as a more or less random structure with a synaptic density high enough to support memory performance. Throughout infancy, certain memories are learnt, and pruning gradually eliminates the synapses that experience less electrical activity. Eventually, a network architecture emerges which has a lower mean synaptic density but is still capable, thanks to its better-optimized structure, of retrieving memories. Moreover, the network structure is optimized for the specific patterns it stored. This seems consistent with the fact that young children can acquire memory patterns (such as languages or artistic skills) which remain with them indefinitely, yet as adults they struggle to learn new ones.
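The activity-dependent pruning idea described in the abstract can be illustrated with a small simulation. The following is a minimal sketch only, assuming a Hopfield-like associative memory with stochastic (Glauber) neuron dynamics in which the synapses carrying the least current are periodically removed; all parameter values, variable names, and the bookkeeping of the current are illustrative choices and do not reproduce the exact model of the thesis.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (hypothetical, not taken from the thesis)
N = 200              # number of neurons
P = 5                # number of stored patterns
T_steps = 5000       # asynchronous update steps
prune_every = 100    # prune after this many steps
target_degree = 20   # mean degree the network is pruned towards
temp = 0.1           # noise level of the stochastic dynamics

# Hebbian (Hopfield-like) couplings for P random binary patterns
patterns = rng.choice([-1, 1], size=(P, N))
w = patterns.T @ patterns / N
np.fill_diagonal(w, 0.0)

# Dense random initial wiring: the "overgrown" young network
adj = np.triu(rng.random((N, N)) < 0.5, 1)
adj = adj | adj.T

s = patterns[0].copy()       # start near one stored memory
current = np.zeros((N, N))   # accumulated |synaptic current|

for t in range(T_steps):
    # Glauber dynamics: update one randomly chosen neuron stochastically
    i = rng.integers(N)
    h_i = (w[i] * adj[i]) @ s
    s[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * h_i / temp)) else -1

    # Track how much current each existing synapse carries
    current += np.abs(w * adj * s[None, :])

    # Periodically prune the synapses that carried the least current
    if (t + 1) % prune_every == 0 and adj.sum() > target_degree * N:
        edges = np.argwhere(np.triu(adj, 1))
        scores = (current + current.T)[edges[:, 0], edges[:, 1]]
        n_remove = max(1, int(0.05 * len(edges)))   # prune 5% of synapses
        weakest = edges[np.argsort(scores)[:n_remove]]
        adj[weakest[:, 0], weakest[:, 1]] = False
        adj[weakest[:, 1], weakest[:, 0]] = False
        current[:] = 0.0

# Check that the sparser network still retrieves the stored memory
overlap = (s @ patterns[0]) / N
print(f"mean degree: {adj.sum() / N:.1f}, overlap with pattern 0: {overlap:.2f}")

Under these assumptions, the network starts with a high synaptic density, stores a few patterns, and ends with a much sparser wiring while retaining a high overlap with the stored memory, which is the qualitative behaviour the abstract describes.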