ANTICIPATE project information
Project duration: 60 months
Start date: 2019-01-30
End date: 2024-01-31
Project leader
UNIVERSITY OF STUTTGART
TRL
4-5
Project budget
1M€
Participation deadline
No participation deadline.
Project description
Even after three decades of research on human-computer interaction (HCI), current general-purpose user interfaces (UI) still lack the ability to attribute mental states to their users, i.e. they fail to understand users' intentions and needs and to anticipate their actions. This drastically restricts their interactive capabilities.
ANTICIPATE aims to establish the scientific foundations for a new generation of user interfaces that pro-actively adapt to users' future input actions by monitoring their attention and predicting their interaction intentions, thereby significantly improving the naturalness, efficiency, and user experience of the interactions. Realising this vision of anticipatory human-computer interaction requires groundbreaking advances in everyday sensing of user attention from eye and brain activity. We will further pioneer methods to predict entangled user intentions and forecast interactive behaviour with fine temporal granularity during interactions in everyday stationary and mobile settings. Finally, we will develop fundamental interaction paradigms that enable anticipatory UIs to pro-actively adapt to users' attention and intentions in a mindful way. The new capabilities will be demonstrated in four challenging cases: 1) mobile information retrieval, 2) intelligent notification management, 3) autism diagnosis and monitoring, and 4) computer-based training.
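The anticipatory loop described above (monitor attention, predict the interaction intention, adapt the interface before input arrives) can be illustrated with a minimal sketch. This is not the project's actual method: the gaze representation, the dwell-based predictor, the region of interest, and the 0.6 threshold are all hypothetical simplifications chosen for illustration.

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class GazeSample:
    x: float  # normalised horizontal gaze position (0..1)
    y: float  # normalised vertical gaze position (0..1)
    t: float  # timestamp in seconds


def predict_intention(window):
    """Toy intention predictor: if gaze dwells in the top-right
    region (where a hypothetical search control sits), predict an
    upcoming search action; otherwise predict no action."""
    if not window:
        return "none"
    dwell = sum(1 for s in window if s.x > 0.8 and s.y < 0.2)
    return "search" if dwell / len(window) > 0.6 else "none"


class AnticipatoryUI:
    """Minimal anticipatory loop: buffer recent gaze samples,
    predict the next user action, and pre-adapt the interface
    (here, flagging that the search box should be pre-focused)
    before any explicit input arrives."""

    def __init__(self, window_size=10):
        self.window = deque(maxlen=window_size)
        self.prefetched = False

    def on_gaze(self, sample):
        self.window.append(sample)
        if predict_intention(self.window) == "search":
            self.prefetched = True  # proactive adaptation


# Usage: a short fixation on the top-right region triggers adaptation.
ui = AnticipatoryUI()
for i in range(10):
    ui.on_gaze(GazeSample(x=0.9, y=0.1, t=i * 0.05))
print(ui.prefetched)  # True
```

A real system would replace the dwell heuristic with learned models over eye and brain activity, but the control flow (sense attention, predict intention, adapt proactively) is the same.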
Anticipatory human-computer interaction offers a strong complement to existing UI paradigms that only react to user input post hoc. If successful, ANTICIPATE will deliver the first important building blocks for implementing Theory of Mind in general-purpose UIs. As such, the project has the potential to drastically improve the billions of interactions we perform with computers every day, to trigger a wide range of follow-up research in HCI as well as adjacent areas within and outside computer science, and to act as a key technical enabler for new applications, e.g. in healthcare and education.