Sustained Emotionally coloured Machine-human Interaction using Nonverbal Expression
Participation deadline
No participation deadline.
Project description
The aim of the SEMAINE project is to draw together the current research on non-verbal signs and to produce a system that capitalises on them to achieve genuinely sustained, emotionally coloured interactions between a person and a machine. To do that, it develops an idea which was introduced in the FP5 project ERMIS and pursued in HUMAINE: building a 'Sensitive Artificial Listener', or SAL for short, which interacts with human beings in a shallow but emotion-rich way, using the kinds of interaction observed in chat shows and at parties as a model. The verbal capabilities of such a system are very limited and superficial, much like those of 'chatbots' from ELIZA onwards. Like them, it uses rules to respond to what the user says without understanding the content in any depth. Unlike chatbots, however, a SAL system has sophisticated, robust, real-time multi-modal interaction capabilities: in an ongoing interaction it detects the emotional colouring communicated by the user and adapts its responses accordingly. In SAL, the system's responses are themselves emotionally loaded, and are chosen to propel the user towards an emotional state that the system currently 'wants' the user to be in.
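The description above amounts to a simple loop: estimate the emotional colouring of the user's multimodal signals, then select an emotionally loaded response intended to pull the user towards the state the current SAL character 'wants', with no deep language understanding. The sketch below illustrates only that response-selection idea; it is not SEMAINE code, and every name in it (EmotionEstimate, estimate_emotion, choose_response, the valence/arousal fields and the canned response lists) is a hypothetical placeholder.

# Hypothetical sketch of a SAL-style, emotion-driven response selection loop.
# Not SEMAINE code: all names, fields and responses here are illustrative only.
from dataclasses import dataclass

@dataclass
class EmotionEstimate:
    valence: float  # -1.0 (negative) .. +1.0 (positive)
    arousal: float  #  0.0 (calm)     ..  1.0 (excited)

# Each SAL character tries to pull the user towards its own emotional state,
# so responses are grouped by the state they are meant to induce.
RESPONSES = {
    "cheerful": ["That sounds lovely, tell me more!", "Go on, this is wonderful!"],
    "gloomy":   ["Oh dear, that does sound tiresome.", "Things rarely work out, do they?"],
}

def estimate_emotion(features: dict) -> EmotionEstimate:
    """Stand-in for the real-time face/voice analysers: here we simply read
    pre-computed valence and arousal values from a feature dictionary."""
    return EmotionEstimate(
        valence=features.get("valence", 0.0),
        arousal=features.get("arousal", 0.5),
    )

def choose_response(user_state: EmotionEstimate, target: str) -> str:
    """Rule-based choice: no understanding of what was said, only the detected
    emotional colouring and the state the character wants to induce."""
    pool = RESPONSES[target]
    # A calm user gets the gentler opening line; an animated user the stronger one.
    return pool[0] if user_state.arousal < 0.5 else pool[-1]

if __name__ == "__main__":
    features = {"valence": -0.3, "arousal": 0.7}  # e.g. from face and voice trackers
    state = estimate_emotion(features)
    print(choose_response(state, target="cheerful"))

In the project itself such a loop would presumably be fed by live multimodal analysers and a recorded repertoire of character utterances rather than a hard-coded dictionary; the sketch only shows how emotionally loaded responses can be selected without analysing the content of what the user says.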