Project description
Thanks to developments in computing, sensing and communication technology, great progress has been made in the control of robots affixed to the factory floor, mobile robots in controlled environments, and self-driving cars in medium-complexity environments. However, mobile robots are not yet capable of seamless interaction in human-populated environments, primarily due to the high complexity of modeling and reasoning over the effect that robot actions have on other robots and humans, and due to the inherent uncertainty of such settings. Humans rely on intuition, namely the ability to understand interaction instinctively, without the need for conscious reasoning. For control systems, however, intuition is notoriously difficult to model, particularly with safety guarantees with respect to physical constraints. My goal is to provide mobile robots with the ability to reason about their coordination with other agents and the associated risks, take appropriate actions and continuously reassess them.
I will reach this goal by developing an innovative approach that cuts across the boundaries of Motion Planning, Multi-Robot Task Assignment and Machine Learning. In INTERACT I propose a holistic view of the interaction between mobile robots and humans, considering multiple spatio-temporal granularities, ranging from individual interactions to the interaction of a robot fleet with humans in a city, and from short-term (local) to long-term (global) effects of the interaction. Robots will use past experience to learn local and global intuition models of their interaction with the environment. These intuition models will be integrated into novel uncertainty-aware optimization methods to compute safe, interaction-aware trajectories, task assignments and routes for mobile robots.
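To make the idea concrete, below is a minimal sketch (in Python, with NumPy) of how a learned intuition model with uncertainty could feed into risk-bounded motion planning. It is an illustration under simplifying assumptions, not the INTERACT method itself: the hand-written predictor, the isotropic Gaussian uncertainty, the straight-line candidate trajectories and the 10% risk bound are all hypothetical choices made for the example.

```python
# Minimal sketch: risk-bounded trajectory selection under an uncertain
# prediction of a human's motion. All models and numbers are illustrative
# assumptions, not part of the INTERACT proposal.
import numpy as np

def predict_human(t):
    """Stand-in for a learned intuition model: predicted mean position of a
    human at time t, with isotropic covariance that grows with the horizon
    (uncertainty increases the further we predict into the future)."""
    mean = np.array([2.5, 3.5 - 1.0 * t])   # human crossing the robot's path
    cov = (0.05 + 0.02 * t) * np.eye(2)
    return mean, cov

def collision_risk(p_robot, mean, cov, radius=0.5):
    """Upper bound on the probability that the human is within `radius` of
    the robot. For an isotropic 2D Gaussian, P(||x - mean|| >= a) equals
    exp(-a^2 / (2 sigma^2)), and the collision disc lies entirely at
    distance >= d - radius from the mean."""
    d = np.linalg.norm(p_robot - mean)
    sigma = np.sqrt(cov[0, 0])
    z = max(d - radius, 0.0) / sigma
    return float(np.exp(-0.5 * z ** 2))

def plan(horizon=5.0, dt=0.5, risk_bound=0.1):
    """Return the fastest straight-line candidate whose collision risk stays
    below the bound at every time step (None if no candidate is safe)."""
    times = np.arange(dt, horizon + dt, dt)
    for speed in np.linspace(1.5, 0.0, 16):          # fastest first
        traj = [np.array([speed * t, 0.0]) for t in times]
        risks = [collision_risk(p, *predict_human(t))
                 for p, t in zip(traj, times)]
        if max(risks) <= risk_bound:
            return speed, traj
    return None

if __name__ == "__main__":
    result = plan()
    if result is None:
        print("no candidate satisfies the risk bound")
    else:
        speed, traj = result
        print(f"chosen speed: {speed:.2f} m/s, risk bound respected")
```

The same pattern, with the hand-written predictor replaced by a model learned from past interactions and the candidate set replaced by a full trajectory optimizer, could in principle be lifted to the task-assignment and routing granularities, where the decision variables are assignments and routes rather than speeds.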
INTERACT will lay the foundation for intuitive multi-robot interaction, make it possible for teams of mobile robots to safely interact in human-centric environments and enable a new level of automation in factories and cities.