COGSTIM: Online Computational Modulation of Visual Perception.
COGSTIM project information
Project duration: 29 months
Start date: 2022-08-12
End date: 2025-01-31
Participation deadline
No participation deadline.
Project description
Computational models of vision often address problems that have a single, definite end-point, such as visual recognition: an example might be finding a ripe banana in a complex scene. However, not all computation takes this form. Visual information is processed continuously in sensory areas, and the nervous system can alter or halt an ongoing behavioral response when incoming information changes. We can therefore react flexibly to updated sensory input or to changed requirements for motor output. At the same time, these same neuronal mechanisms must also support perceptual stability, so that noisy signals do not derail a crucial goal.
In project COGSTIM, I will investigate the functional neuronal networks that support the balance between perceptual flexibility and stability within primate visual areas. I will use a highly innovative approach, combining dense electrophysiological recording with online (real-time) decoding of neuronal correlates of the subject's perceptual choice, based on adaptive machine-learning algorithms. To control visual perception effectively and predictably, closed-loop electrical stimulation, with dynamically adjusted feedback, will be applied to identified neuronal circuits that causally modulate the associated percepts. Crucially, this novel approach of joint decoding and stimulation in real time will allow me to dynamically target visual percepts, representing a significant advance in our understanding of the ongoing, continuous computations of the primate brain. Such developments offer a promising basis for the future development of rehabilitative therapeutic protocols, as well as innovative brain-machine interfaces suitable for real-world use.
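The closed-loop architecture described above can be illustrated with a minimal sketch: an incrementally trained decoder reads a stream of multi-channel spike counts, estimates the current perceptual choice, and triggers a stimulation command whenever the decoded confidence crosses a threshold. Everything here is hypothetical scaffolding, not the project's actual pipeline: the `OnlineDecoder` class, the `stimulate` placeholder, the Poisson-simulated activity, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class OnlineDecoder:
    """Hypothetical adaptive decoder: logistic regression updated
    one trial at a time by stochastic gradient descent."""

    def __init__(self, n_channels, lr=0.01):
        self.w = np.zeros(n_channels)
        self.b = 0.0
        self.lr = lr

    def predict_proba(self, x):
        # Decoded probability that the current percept is "choice A".
        z = np.clip(self.w @ x + self.b, -30.0, 30.0)
        return 1.0 / (1.0 + np.exp(-z))

    def update(self, x, y):
        # Single SGD step on the logistic loss (adaptive, real-time).
        err = self.predict_proba(x) - y
        self.w -= self.lr * err * x
        self.b -= self.lr * err

def stimulate(amplitude):
    # Placeholder for a closed-loop microstimulation command; a real
    # rig would address the stimulator hardware here.
    pass

# Simulated session: two percept classes with different mean firing
# rates on half of the recorded channels (illustrative numbers).
n_channels, n_trials = 32, 500
decoder = OnlineDecoder(n_channels)
threshold = 0.8  # decoded confidence that triggers stimulation

correct = 0
for t in range(n_trials):
    y = int(rng.integers(0, 2))          # true percept on this trial
    rates = np.full(n_channels, 5.0)
    rates[: n_channels // 2] += 2.0 * y  # class-dependent activity
    x = rng.poisson(rates).astype(float)

    p = decoder.predict_proba(x)
    if max(p, 1.0 - p) > threshold:
        stimulate(amplitude=abs(p - 0.5))  # closed-loop intervention
    correct += int((p > 0.5) == bool(y))
    decoder.update(x, y)                   # adapt decoder online

accuracy = correct / n_trials
```

The key design point this sketch captures is the loop order: decode first, act on the decoded percept, and only then update the decoder, so the stimulation decision is always based on the model available in real time.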