wHiSPER project information
Project duration: 73 months
Start date: 2018-11-06
End date: 2024-12-31
Participation deadline
No participation deadline.
Project description
Perception is a complex process in which prior knowledge is incorporated into the current percept to help the brain cope with sensory uncertainty. A crucial question is how this mechanism changes during interaction, when the brain faces two conflicting goals: optimizing individual perception by using internal priors, or maximizing perceptual alignment with the partner by limiting reliance on individual priors. wHiSPER proposes to study for the first time how visual perception of space and time is modified during interaction, by moving the investigation to an interactive shared context where two agents dynamically influence each other. To allow scrupulous and systematic control during interaction, wHiSPER will use a humanoid robot as a controllable interactive agent. The research will be articulated along five main objectives: i) determine how being involved in an interactive context influences perceptual inference; ii) assess how perceptual priors generalize to the observation of others' actions; iii) understand whether and how individual perception aligns with others' priors; iv) assess how shared perception with a robot can be enabled; and v) determine whether perceptual inference during interaction is modified with aging, when lowered sensory acuity could increase the relevance of priors. To these ends wHiSPER will exploit rigorous psychophysical methods, Bayesian modeling and human-robot interaction, adapting well-established paradigms from the study of visual perception to a novel interactive context. In several experiments the humanoid robot and the participants will be shown simple temporal or spatial perceptual stimuli that they will have to perceive either to reproduce them or to perform a coordinated joint action (such as passing an object). The measures of the reproduced intervals and of the kinematics of the actions will allow the researchers to quantify, through Bayesian modeling, how social interaction influences visual perception.
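The Bayesian modeling framework referenced above can be illustrated with a minimal sketch. The following is an assumption-laden toy example (not the project's actual model): a Gaussian prior over a stimulus magnitude is fused with a noisy sensory measurement, weighting each by its precision. A tighter (lower-variance) prior pulls the estimate more strongly toward the prior mean, which is the kind of prior reliance the project aims to quantify.

```python
def fuse_gaussian(prior_mean, prior_sd, meas, meas_sd):
    """Posterior mean and sd when a Gaussian prior is combined with a
    Gaussian likelihood (standard precision-weighted Bayesian fusion)."""
    w_prior = 1.0 / prior_sd**2        # precision of the prior
    w_meas = 1.0 / meas_sd**2          # precision of the measurement
    post_var = 1.0 / (w_prior + w_meas)
    post_mean = post_var * (w_prior * prior_mean + w_meas * meas)
    return post_mean, post_var**0.5

# Hypothetical numbers: a 600 ms prior (sd 50 ms) and a noisy 800 ms
# measurement (sd 100 ms) of a temporal interval.
mean, sd = fuse_gaussian(600.0, 50.0, 800.0, 100.0)
# The posterior mean lies between prior and measurement, closer to the
# more reliable (lower-variance) source, and the posterior sd is smaller
# than either input sd.
```

Here the posterior mean works out to 640 ms: the estimate is biased toward the prior because the prior is more precise than the measurement, which is exactly the regression-toward-the-mean signature that psychophysical reproduction tasks use to infer prior strength.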