EveNt DrivEn Active Vision for Object peRception (ENDEAVOR)
"Computer vision, leveraging deep learning in the last decade, has achieved unprecedented progress. However, it is largely relying on datasets of still images, thus using ""passive vision"". On the contrary, biological vision is a...
ver más
¿Tienes un proyecto y buscas un partner? Gracias a nuestro motor inteligente podemos recomendarte los mejores socios y ponerte en contacto con ellos. Te lo explicamos en este video
Related projects
eMorph: Event Driven Morphological Computation for Embodied Systems (2M€, closed)
PID2019-105556GB-C33: PERCEPCION Y COGNICION NEUROMORFICA PARA ACTUACION ROBOTICA... (263K€, closed)
Rubedo CVM: FEASIBILITY STUDY FOR THE RUBEDO CVM INNOVATIVE OPTICAL CO... (71K€, closed)
TEC2009-10639-C04-04: VISION ULTRA-RAPIDA POR EVENTOS Y SIN FOTOGRAMAS. APLICACION... (31K€, closed)
PID2019-108398GB-I00: INTEGRACION DE MODELOS Y DATOS PARA SLAM ACTIVO ROBUSTO EN E... (108K€, closed)
DynAI: Omni-Supervised Learning for Dynamic Scene Understanding (2M€, closed)
ENDEAVOR project information
Project duration: 25 months
Start date: 2024-04-12
End date: 2026-05-31
Participation deadline: none.
Project description
"Computer vision, leveraging deep learning in the last decade, has achieved unprecedented progress. However, it is largely relying on datasets of still images, thus using ""passive vision"". On the contrary, biological vision is a fundamentally active process of exploration to disambiguate objects, and yet, the potential of active vision for robotics remains underexplored.
The ENDEAVOR project seeks to redefine traditional static image analysis for fast, online robotic applications.
The project integrates the computational models of Sensorimotor Contingency Theory (O'Regan and Noë, 2001) with event-driven perception and neuromorphic computing. A sensorimotor contingency is the dynamic relationship between an agent's motor actions and the sensory inputs those actions produce in the environment.
Active generation of sensory data aligns naturally with event-driven perception: moving objects are tracked via the events that the agent's own motion generates, while neuromorphic computing minimises latency and energy use.
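As a rough, self-contained illustration of what such agent-generated event data could look like, the sketch below accumulates recent events into a per-pixel activity map and reads off the centroid of that activity as a crude tracking cue. The 128x128 resolution, the event layout (pixel coordinates, timestamp, polarity), and the 10 ms accumulation window are assumptions made for illustration, not ENDEAVOR's actual sensor or data format.

```python
# Minimal, illustrative sketch of event-driven tracking. The event layout,
# the 128x128 resolution, and the 10 ms window are assumptions, not the
# project's actual interface.
from collections import namedtuple
import numpy as np

Event = namedtuple("Event", ["x", "y", "t", "polarity"])  # t in microseconds

WIDTH, HEIGHT = 128, 128          # assumed sensor resolution
WINDOW_US = 10_000                # accumulate events over a 10 ms window


def accumulate(events, t_now):
    """Count, per pixel, the events that fell inside the last window."""
    surface = np.zeros((HEIGHT, WIDTH), dtype=np.int32)
    for ev in events:
        if t_now - ev.t <= WINDOW_US:
            surface[ev.y, ev.x] += 1
    return surface


def activity_centroid(surface):
    """Return the (x, y) centre of event activity, a crude tracking cue."""
    total = surface.sum()
    if total == 0:
        return None
    ys, xs = np.indices(surface.shape)
    return (float((xs * surface).sum()) / total,
            float((ys * surface).sum()) / total)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake a small cluster of events around pixel (40, 80), as if an object
    # edge were moving through the field of view.
    events = [Event(int(np.clip(40 + rng.normal(0, 2), 0, WIDTH - 1)),
                    int(np.clip(80 + rng.normal(0, 2), 0, HEIGHT - 1)),
                    t=int(rng.integers(0, WINDOW_US)),
                    polarity=1)
              for _ in range(200)]
    surface = accumulate(events, t_now=WINDOW_US)
    print("estimated object position (x, y):", activity_centroid(surface))
```

A real event-driven pipeline would process events asynchronously rather than in batched windows; the windowed accumulation here is only the simplest way to show the idea.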
The humanoid robot iCub will hold objects and examine them from various perspectives through eye and wrist movements. The project capitalises on bioinspired hardware and software solutions, ultimately aiming to reduce computational demands, power consumption, and latency in intelligent systems.
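One way the viewpoint-selection loop implied above could be organised is sketched below as a Bayesian active-perception loop: keep a belief over object hypotheses, pick the eye/wrist configuration whose observation is expected to reduce uncertainty the most, observe, and update. The small discrete set of candidate views, the made-up likelihood table standing in for a recogniser, and the confidence threshold are purely illustrative assumptions, not the project's controller.

```python
# Illustrative active-vision loop: choose the next viewpoint expected to
# reduce uncertainty about the held object the most. Candidate views,
# the likelihood table, and the threshold are made-up assumptions.
import numpy as np

rng = np.random.default_rng(1)

N_OBJECTS, N_VIEWS = 4, 6
# p(cue observed | object, viewpoint): a toy table standing in for whatever
# per-glance recogniser scores a real system would produce.
likelihood = rng.dirichlet(np.ones(N_OBJECTS), size=N_VIEWS)  # (views, objects)


def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())


def expected_entropy_after(belief, view):
    """Expected posterior entropy after looking from `view` (binary cue model)."""
    p_cue = float(likelihood[view] @ belief)           # p(cue present)
    post_yes = likelihood[view] * belief
    post_no = (1.0 - likelihood[view]) * belief
    post_yes /= post_yes.sum()
    post_no /= post_no.sum()
    return p_cue * entropy(post_yes) + (1.0 - p_cue) * entropy(post_no)


true_object = 2
belief = np.full(N_OBJECTS, 1.0 / N_OBJECTS)           # uniform prior

for step in range(10):
    # Pick the eye/wrist configuration that minimises expected uncertainty.
    view = int(np.argmin([expected_entropy_after(belief, v) for v in range(N_VIEWS)]))
    cue = rng.random() < likelihood[view, true_object]  # simulated observation
    update = likelihood[view] if cue else (1.0 - likelihood[view])
    belief = belief * update
    belief /= belief.sum()
    print(f"step {step}: view {view}, belief {np.round(belief, 2)}")
    if belief.max() > 0.95:                             # assumed confidence threshold
        break
```

Selecting the view by expected posterior entropy is one common formalisation of "acting to disambiguate"; other criteria, such as information gain over object pose, would fit the same loop.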
ENDEAVOR offers three significant contributions to computer vision and robotics: (1) It introduces active vision strategies that enhance object perception. (2) It integrates event-based visual sensing with rapid and efficient parallel computation, leveraging neuromorphic computing principles. (3) The project establishes a benchmark that allows for both qualitative and quantitative evaluations, fostering comparisons among various approaches, including frame-based, event-based, and spiking-based systems.
The importance of this approach lies in reducing the storage of massive amounts of data while targeting milliwatt-level power consumption.
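To put a rough number on that argument, here is a back-of-envelope comparison with assumed figures (VGA greyscale frames at 30 fps versus an event stream of 300,000 events per second at 8 bytes per event); actual rates depend heavily on the sensor and on how much the scene, or the robot, is moving.

```python
# Back-of-envelope data-rate comparison; all numbers are assumptions chosen
# only to illustrate the order-of-magnitude argument, not measured figures.
FRAME_W, FRAME_H = 640, 480      # assumed frame-based resolution
FPS = 30                         # assumed frame rate
BYTES_PER_PIXEL = 1              # 8-bit greyscale

EVENTS_PER_SEC = 300_000         # assumed event rate for a moderately active scene
BYTES_PER_EVENT = 8              # x, y, timestamp, polarity packed into 8 bytes

frame_rate_bytes = FRAME_W * FRAME_H * BYTES_PER_PIXEL * FPS
event_rate_bytes = EVENTS_PER_SEC * BYTES_PER_EVENT

print(f"frame-based stream: {frame_rate_bytes / 1e6:.1f} MB/s")   # ~9.2 MB/s
print(f"event-based stream: {event_rate_bytes / 1e6:.1f} MB/s")   # ~2.4 MB/s
print(f"reduction factor:   {frame_rate_bytes / event_rate_bytes:.1f}x")
```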