Closing the loop in dynamic vision – from single photons to behaviour in extreme light environments
Project lead: UNIVERSITAT KONSTANZ (UKON)
Project budget: €2M
TRL: 4–5
Participation deadline: none specified.
Funding: granted. HORIZON EUROPE notified the award of the project on 2023-11-06.
DynamicVision project information
Project duration: 69 months
Start date: 2023-11-06
End date: 2029-08-31
Project description
Driving along a tree-lined avenue, we have all experienced how the rapid succession of light and shade disrupts our vision. Such conditions push even synthetic sensors to their limits, but many animals master these challenges on a daily, and nightly, basis. Indeed, a high dynamic range of sensory information is a hallmark of natural environments. Explaining how sensory information is processed within the limited bandwidth of neural circuits is key to a central goal of neuroscience: understanding the neural control of behaviour in natural contexts. This question extends beyond the processing of dynamic input by nervous systems to the closed-loop nature of animal behaviour itself: as senses guide an animal's movements, the movements in turn shape the sensory input. This necessitates a paradigm shift to a holistic approach that considers dynamic inputs, neural processing and behavioural strategies in concert. I propose visually guided flight in nocturnal moths as uniquely suited to approaching this challenge. Probing the system in dim light, when vision operates at its limits, offers straightforward performance readouts for all stages of the control loop. To do so, we will design a novel imaging system to quantify the dynamics of natural visual environments from a flying insect's perspective. We will then measure how dynamic tuning adjusts peripheral neurons to compensate for these spatiotemporal light variations, and how these signals are integrated with movement predictions in motion neurons to guide flight behaviour. Using a one-of-a-kind facility for large-scale animal tracking, we will record the moths' flight behaviour at unprecedented precision to reveal the strategies that optimise sensory acquisition in these challenging light conditions. Combining all stages, this project will provide a coherent framework for studying the neural basis of natural behaviour in dynamic light environments, using a unique, ecologically impactful model to close the loop from sensing to acting.