A Comparative Study of Voice Perception in Primates
Interesting projects
LEARN 2 HEAR & SEE: Perceptual Contextual and Cross modal Learning in Hearing... (162K€, closed)
PSI2013-43107-P: INFLUENCIA DE LA INFORMACION FACIAL EN LA COMPRENSION DEL LE... (48K€, closed)
CCINB: Cortical Contributions to Innate Vocalizations (252K€, closed)
PSI2011-26850: APRENDIZAJE PERCEPTIVO: TRANSFERENCIA A ESTIMULOS NOVEDOSOS. (22K€, closed)
RATLAND: Understanding Auditory Information Processing in Naturalisti... (2M€, closed)
PSI2009-08607: INFLUENCIA DEL CONTENIDO EMOCIONAL SOBRE LA COMPRENSION DEL... (32K€, closed)
COVOPRIM project information
Project duration: 89 months
Start date: 2018-07-19
End date: 2025-12-31
Participation deadline: none.
Project description
With COVOPRIM I propose to reconstruct the recent evolutionary history of an often overlooked component of speech and language: voice perception. Perceptual and neural mechanisms of voice perception will be compared between humans, macaques and marmosets (two highly vocal and extensively studied monkey species) to quantify cross-species differences and infer mechanisms potentially inherited from a common ancestor. Two key building blocks of vocal communication detailed in my past research in humans will be compared across species: (1) the sensitivity to conspecific vocalizations, and (2) the processing of speaker/caller identity.
COVOPRIM is organized in three work packages (WPs). WP1 will use large-scale behavioural testing based on ad libitum access of monkeys to automated test systems (following the highly successful model developed locally with baboons). Two main behavioural experiments will establish psychometric response functions for robust cross-species comparison. WP2 will use functional magnetic resonance imaging (fMRI) to measure cerebral activity during auditory stimulation in the three species. I will compare across brains the organization of what I hypothesize constitutes a voice patch system, similar to the face patch system of visual cortex and broadly conserved in primates. I will also take advantage of the monkey models and use long-term, subject-specific enrichment of the auditory stimulation to probe the experience dependence of neural coding in the voice patch system, an outstanding issue in human voice perception. WP3 will use fMRI-guided microstimulation in monkeys and transcranial magnetic stimulation in humans to establish the effective connectivity within the voice patch system and test the causal relation between voice patch neuronal activity and voice perception behaviour.
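As background for the psychometric response functions mentioned for WP1: these are sigmoid curves relating stimulus strength to the probability of a given response, and comparing their fitted thresholds and slopes is what makes the cross-species comparison quantitative. The sketch below is purely illustrative (made-up data and hypothetical parameter values, not COVOPRIM's actual analysis pipeline); it fits a logistic psychometric function with SciPy to show the kind of threshold and slope estimates such an experiment yields.

# Illustrative only: fit a logistic psychometric function to hypothetical
# voice-detection data (proportion of "voice" responses per stimulus level).
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, threshold, slope, guess=0.5, lapse=0.02):
    """Logistic psychometric function with fixed guess and lapse rates."""
    return guess + (1.0 - guess - lapse) / (1.0 + np.exp(-slope * (x - threshold)))

# Hypothetical morph continuum from non-voice (0) to clear conspecific voice (1).
morph_level = np.linspace(0.0, 1.0, 9)
p_voice = np.array([0.48, 0.50, 0.55, 0.62, 0.74, 0.86, 0.93, 0.96, 0.97])

# Estimate the threshold (inflection point) and slope for one subject or species.
params, _ = curve_fit(lambda x, t, s: psychometric(x, t, s),
                      morph_level, p_voice, p0=[0.5, 10.0])
print(f"threshold = {params[0]:.2f}, slope = {params[1]:.1f}")

Comparing such fitted thresholds and slopes across humans, macaques and marmosets, each tested with species-appropriate stimuli, is the kind of quantitative comparison WP1 describes.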
COVOPRIM is expected to generate considerable advances in our understanding of the recent evolution of the perceptual and neural mechanisms of voice perception in primates.