
dc.contributor.author: Cuadrado, Javier
dc.contributor.author: Rançon, Ulysse
dc.contributor.author: Cottereau, Benoit R.
dc.contributor.author: Barranco Expósito, Francisco
dc.contributor.author: Masquelier, Timothée
dc.date.accessioned: 2024-10-28T09:49:44Z
dc.date.available: 2024-10-28T09:49:44Z
dc.date.issued: 2023-05-11
dc.identifier.citation: Cuadrado J, Rançon U, Cottereau BR, Barranco F and Masquelier T (2023) Optical flow estimation from event-based cameras and spiking neural networks. Front. Neurosci. 17:1160034. doi: 10.3389/fnins.2023.1160034
dc.identifier.uri: https://hdl.handle.net/10481/96382
dc.description: This research was supported in part by the Agence Nationale de la Recherche under Grant ANR-20-CE23-0004-04 DeepSee, by the Spanish National Grant PID2019-109434RA-I00/SRA (State Research Agency, 10.13039/501100011033), by FLAG-ERA funding (Joint Transnational Call 2019, project DOMINO), and by the Program DesCartes, funded by the National Research Foundation, Prime Minister's Office, Singapore, under its Campus for Research Excellence and Technological Enterprise (CREATE) Program.
dc.description: The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fnins.2023.1160034/full#supplementary-material
dc.description.abstract: Event-based cameras are raising interest within the computer vision community. These sensors operate with asynchronous pixels, emitting events, or “spikes”, when the luminance change at a given pixel since the last event surpasses a certain threshold. Thanks to their inherent qualities, such as their low power consumption, low latency, and high dynamic range, they seem particularly tailored to applications with challenging temporal constraints and safety requirements. Event-based sensors are an excellent fit for Spiking Neural Networks (SNNs), since the coupling of an asynchronous sensor with neuromorphic hardware can yield real-time systems with minimal power requirements. In this work, we seek to develop one such system, using both event sensor data from the DSEC dataset and spiking neural networks to estimate optical flow for driving scenarios. We propose a U-Net-like SNN which, after supervised training, is able to make dense optical flow estimations. To do so, we encourage both a minimal norm for the error vector and a minimal angle between ground-truth and predicted flow, training our model with back-propagation using a surrogate gradient. In addition, the use of 3D convolutions allows us to capture the dynamic nature of the data by increasing the temporal receptive fields. Upsampling after each decoding stage ensures that each decoder’s output contributes to the final estimation. Thanks to separable convolutions, we have been able to develop a lightweight model (when compared to competitors) that can nonetheless yield reasonably accurate optical flow estimates.
dc.description.sponsorship: Agence Nationale de la Recherche ANR-20-CE23-0004-04 DeepSee
dc.description.sponsorship: Spanish National Grant PID2019-109434RA-I00/SRA
dc.description.sponsorship: FLAG-ERA project DOMINO
dc.description.sponsorship: Program DesCartes
dc.description.sponsorship: National Research Foundation, Prime Minister’s Office, Singapore
dc.language.iso: eng
dc.publisher: Frontiers
dc.rights: Attribution 4.0 International (CC BY 4.0)
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: Optical flow
dc.subject: Event vision
dc.subject: Spiking neural network
dc.subject: Neuromorphic computing
dc.subject: Edge AI
dc.title: Optical flow estimation from event-based cameras and spiking neural networks
dc.type: journal article
dc.rights.accessRights: open access
dc.identifier.doi: 10.3389/fnins.2023.1160034
dc.type.hasVersion: VoR
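The abstract above describes a training loss that combines the norm of the flow error vector with the angle between predicted and ground-truth flow, and a lightweight architecture built on separable 3D convolutions. The following is a minimal sketch of how such components might look in PyTorch; the names `SeparableConv3d` and `flow_loss`, and the `lambda_angle` weighting, are illustrative assumptions, not the paper's released code or its exact loss formulation.

```python
import torch
import torch.nn as nn


class SeparableConv3d(nn.Module):
    """Depthwise-separable 3D convolution: a per-channel spatio-temporal
    convolution followed by a 1x1x1 pointwise convolution that mixes channels.
    This factorisation is what keeps the parameter count low."""

    def __init__(self, in_ch, out_ch, kernel_size=3, padding=1):
        super().__init__()
        self.depthwise = nn.Conv3d(in_ch, in_ch, kernel_size,
                                   padding=padding, groups=in_ch, bias=False)
        self.pointwise = nn.Conv3d(in_ch, out_ch, kernel_size=1, bias=False)

    def forward(self, x):
        # x: (batch, channels, time, height, width)
        return self.pointwise(self.depthwise(x))


def flow_loss(pred, target, lambda_angle=1.0, eps=1e-6):
    """Combine an endpoint-error term (norm of the error vector) with an
    angular term penalising misalignment between predicted and ground-truth
    flow vectors. pred, target: (batch, 2, height, width) flow maps."""
    # Per-pixel norm of the error vector, averaged over the batch and image.
    epe = torch.norm(pred - target, p=2, dim=1).mean()

    # Cosine of the angle between predicted and ground-truth vectors.
    dot = (pred * target).sum(dim=1)
    cos = dot / (torch.norm(pred, dim=1) * torch.norm(target, dim=1) + eps)
    angle = (1.0 - cos).mean()  # zero when the vectors are perfectly aligned

    return epe + lambda_angle * angle
```

Expressing the angular term as 1 - cos keeps it bounded and differentiable; the paper may weight or formulate the two terms differently.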

