COGSTIM: Online Computational Modulation of Visual Perception.
Related projects
Reference | Title | Budget | Status
PID2019-111629GB-I00 | DESCIFRANDO LOS MECANISMOS DE LA INFERENCIA PERCEPTIVA EN EL... | 109K€ | Closed
FIS2015-67876-P | PROCESAMIENTO DE INCERTIDUMBRE EN EL CEREBRO: TOMA DE DECISI... | 65K€ | Closed
PID2020-112838RB-I00 | ESTRATEGIAS FLEXIBLES PARA LA INTEGRACION DE LA EVIDENCIA SE... | 194K€ | Closed
moreSense | The Motor Representation of Sensory Experience | 1M€ | Closed
BFU2017-86026-R | DINAMICA DE LOS CIRCUITOS NEURONALES DISTRIBUIDOS EN LA TOMA... | 157K€ | Closed
PSI2017-88136-P | BASES NEURALES DE LA CONSCIENCIA PERCEPTUAL Y LA CONSCIENCIA... | 86K€ | Closed
COGSTIM project information
Project duration: 29 months
Start date: 2022-08-12
End date: 2025-01-31
Project description
Computational models of vision often address problems that have a single, definite end-point, such as visual recognition: for example, finding a ripe banana in a complex scene. However, not all computation is of this form. Visual information is processed continuously in sensory areas, and the nervous system has the capacity to alter or halt an ongoing behavioral response when incoming information changes. We can therefore react flexibly to updated sensory input or to changed requirements for motor output. On the other hand, these same neuronal mechanisms must also support perceptual stability, so that noisy signals do not cause a crucial goal to be lost.
In project COGSTIM, I will investigate the functional neuronal networks that support the balance between perceptual flexibility and stability within primate visual areas. I will use a highly innovative approach, combining dense electrophysiological recording with online (real-time) decoding of neuronal correlates of the subject's perceptual choice, based on adaptive machine-learning algorithms. To control visual perception effectively and predictably, closed-loop electrical stimulation will be applied, under dynamically adjusted feedback, to identified neuronal circuits that causally modulate the associated percepts. Crucially, this novel approach of joint decoding and stimulation in real time will allow me to dynamically target visual percepts, representing a significant advance in our understanding of the ongoing, continuous computations of the primate brain. Such developments offer a promising basis for the future development of rehabilitative therapeutic protocols, as well as innovative brain-machine interfaces suitable for real-world use.
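As a purely illustrative sketch of how such a decode-and-stimulate loop could be organised in software, the Python snippet below pairs an adaptive decoder (a logistic-regression classifier updated online with partial_fit) with a confidence-gated stimulation call. The acquisition and stimulation functions (stream_spike_counts, deliver_microstimulation), the 64-channel count, the 50 ms bin width and the 0.8 confidence threshold are all assumptions made for illustration; the project description does not specify the decoding algorithm or the control law, and the real system would interface with recording and stimulation hardware rather than a simulation.

```python
"""Illustrative sketch of a closed-loop decode-and-stimulate cycle.

Hypothetical stand-ins (not from the project description): stream_spike_counts,
deliver_microstimulation, the logistic-regression decoder, the 64 channels,
the 50 ms bin width and the 0.8 confidence threshold.
"""
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
N_CHANNELS = 64          # electrodes on the recording array (assumed)
CONF_THRESHOLD = 0.8     # decoder confidence required to trigger stimulation (assumed)


def stream_spike_counts(true_choice):
    """Stand-in for the acquisition system: one 50 ms bin of spike counts
    whose mean rate depends weakly on the subject's upcoming choice."""
    rates = 5.0 + true_choice * np.linspace(0.0, 2.0, N_CHANNELS)
    return rng.poisson(rates).astype(float)


def deliver_microstimulation(choice, confidence):
    """Stand-in for the stimulator: in the real system this would send a
    current pattern to the circuit associated with the decoded percept."""
    print(f"stimulate percept {choice} (p = {confidence:.2f})")


# Adaptive linear decoder, updated online with partial_fit.
decoder = SGDClassifier(loss="log_loss", alpha=1e-3)
decoder.partial_fit(np.zeros((2, N_CHANNELS)), [0, 1], classes=[0, 1])

for trial in range(200):
    true_choice = int(rng.integers(0, 2))      # the subject's eventual report (simulated)
    x = stream_spike_counts(true_choice).reshape(1, -1)

    # 1. Decode the perceptual choice from the current bin of activity.
    proba = decoder.predict_proba(x)[0]
    decoded = int(np.argmax(proba))
    confidence = proba[decoded]

    # 2. Close the loop: stimulate only when the decoder is confident.
    if confidence >= CONF_THRESHOLD:
        deliver_microstimulation(decoded, confidence)

    # 3. When the behavioral report arrives, update the decoder online.
    decoder.partial_fit(x, [true_choice])
```

In this sketch the same loop body handles decoding, feedback-gated stimulation and online learning, which is one simple way to keep the decoder adaptive while the stimulation policy remains interpretable; the actual project could of course use a different decoder, gating rule or update schedule.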