Automated Improvement of Continuous User Interfaces
Participation deadline
No participation deadline.
Project description
We propose to develop two tools for creating, in a systematic way, better user interfaces based on continuous, non-symbolic actions, such as swipes on a touch screen, 3-D motions with a hand-held device, or breath patterns in a user interface for otherwise paralyzed patients. The tools are based on two experimental/computational techniques developed in the ABACUS project: iterated learning and social coordination.
In iterated learning, sets of signals produced by one user are learned and reproduced by another user, and those reproductions are in turn learned by the next user. The ABACUS project has shown that this process yields more learnable sets of signals. We propose to show how it can be applied to creating learnable and usable signals in a systematic way when designing a user interface for a device that allows continuous actions.
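The iterated-learning loop can be sketched in a small simulation. The `reproduce` and `complexity` functions below are illustrative assumptions, not ABACUS code: each simulated user learns a continuous signal imperfectly, smoothing it toward a simpler shape, and the signal's complexity drops over generations.

```python
import random

def reproduce(signal, memory_noise=0.1):
    """Simulate a user learning and reproducing a continuous signal
    (hypothetical model): each sample is smoothed toward its neighbours
    (imperfect memory favours simpler shapes) plus a little noise."""
    out = []
    for i, x in enumerate(signal):
        left = signal[i - 1] if i > 0 else x
        right = signal[i + 1] if i < len(signal) - 1 else x
        smoothed = 0.5 * x + 0.25 * (left + right)
        out.append(smoothed + random.gauss(0, memory_noise))
    return out

def complexity(signal):
    """Total variation: lower means a smoother, easier-to-learn signal."""
    return sum(abs(b - a) for a, b in zip(signal, signal[1:]))

random.seed(1)
signal = [random.uniform(-1, 1) for _ in range(50)]  # a jagged initial swipe profile
history = [complexity(signal)]
for generation in range(10):                          # a chain of simulated users
    signal = reproduce(signal)
    history.append(complexity(signal))

print(f"complexity: {history[0]:.2f} -> {history[-1]:.2f}")
```

Over the chain, the smoothing bias dominates the noise, so the signal set drifts toward forms that are easier to learn and reproduce — the effect the tool would exploit when generating candidate interface actions.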
In social coordination, signals have been shown to become simpler and more abstract as people communicate over an extended period of time. The ABACUS project has developed techniques to detect and quantify this effect. We propose to show how these can be used to build a user interface that adapts to its user: novice users can use extended, and therefore more learnable, versions of actions, while the system adapts as users become more adept and reduce their actions. Because the system is adaptive, users are not constrained in how they do this.
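A minimal sketch of such an adaptive interface, under the assumption that gestures are resampled to fixed-length profiles (the `AdaptiveRecognizer` class and its swipe templates are hypothetical): after each match, the matched template is pulled toward the user's latest production, so the recognizer follows the user as gestures shrink.

```python
def ema_update(template, observed, rate=0.3):
    """Pull a template toward an observed gesture (exponential moving average)."""
    return [(1 - rate) * t + rate * o for t, o in zip(template, observed)]

class AdaptiveRecognizer:
    """Hypothetical nearest-template recognizer that adapts to its user."""

    def __init__(self, templates):
        self.templates = dict(templates)

    def recognize(self, gesture):
        dist = lambda t: sum((a - b) ** 2 for a, b in zip(t, gesture))
        name = min(self.templates, key=lambda n: dist(self.templates[n]))
        # Adaptation step: the matched template drifts toward what the
        # user actually produced, so abbreviated gestures stay recognizable.
        self.templates[name] = ema_update(self.templates[name], gesture)
        return name

# Two swipe classes as 8-sample velocity profiles (illustrative values).
rec = AdaptiveRecognizer({
    "swipe_right": [0, 1, 2, 3, 3, 2, 1, 0],
    "swipe_left":  [0, -1, -2, -3, -3, -2, -1, 0],
})

full = [0, 1, 2, 3, 3, 2, 1, 0]
for step in range(20):
    scale = max(0.4, 1 - 0.05 * step)      # the user gradually abbreviates
    gesture = [scale * x for x in full]
    assert rec.recognize(gesture) == "swipe_right"

print("adapted template:", [round(v, 2) for v in rec.templates["swipe_right"]])
```

Because adaptation is continuous, the user never has to switch explicitly between a "novice" and an "expert" gesture set; the template simply tracks whatever reduced form the user settles on.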
Concretely, we propose to implement these two tools, investigate how they can be used optimally, and advertise them to interested companies, starting with those with which we already have contact and extending our network at the start of the project through business case development. To disseminate the results, we propose to involve a user committee and organize one or more workshops.