Project description
Large-scale quantum experiments do not work in isolation. Substantial classical computing power is required to control the experiment and process the results. This necessarily creates information-transmission bottlenecks at the interface between the quantum and classical realms. These bottlenecks raise scalability issues that prevent us from using existing architectures to their full potential and may even impair our ability to further scale up system sizes.
In this project, we adopt a unifying framework that accounts for all computing resources, quantum and classical alike. We develop quantum-to-classical converters to overcome information-transmission bottlenecks. Dubbed shadows, these converters leverage randomization as well as quantum-enhanced readout strategies to obtain a succinct classical description of an underlying quantum system, which can then be used to efficiently predict many features at once. The shadow paradigm is compatible with near-term quantum hardware and exploits genuine quantum effects that have no classical counterpart. Building on these ideas, we also establish rigorous synergies between quantum experiments and classical machine learning: shadow learning protocols use shadows to succinctly represent training data obtained from actual quantum experiments, and a classical training stage then enables data-driven learning of genuine quantum phenomena. Finally, we develop new tools to ensure reliable execution on current quantum hardware, thus bridging the gap between theory and experiment.
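To make the shadow idea concrete, the sketch below simulates the simplest randomized-measurement variant: each snapshot measures every qubit in a uniformly random Pauli basis, and the inverse of the measurement channel turns the recorded outcomes into an unbiased estimator of the state, from which product observables can be predicted. This is a minimal illustration under simplifying assumptions (single-qubit Pauli bases, a small simulated state), not the full protocol developed in the project; the function names classical_shadow and estimate_observable are illustrative choices.

```python
import numpy as np

# Single-qubit basis-change unitaries: U maps the chosen Pauli eigenbasis
# to the computational (Z) basis before a standard measurement.
I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
Sdg = np.diag([1, -1j])
BASIS = {"X": H, "Y": H @ Sdg, "Z": I2}

rng = np.random.default_rng(0)

def classical_shadow(state, n_qubits, n_snapshots):
    """Collect (bases, outcome bits) from random single-qubit Pauli measurements."""
    snapshots = []
    for _ in range(n_snapshots):
        bases = rng.choice(["X", "Y", "Z"], size=n_qubits)
        U = BASIS[bases[0]]
        for b in bases[1:]:
            U = np.kron(U, BASIS[b])
        rotated = U @ state
        probs = np.abs(rotated) ** 2
        outcome = rng.choice(2 ** n_qubits, p=probs / probs.sum())
        # Big-endian bit order matches the kron ordering above.
        bits = [(outcome >> (n_qubits - 1 - j)) & 1 for j in range(n_qubits)]
        snapshots.append((bases, bits))
    return snapshots

def estimate_observable(snapshots, obs_factors):
    """Estimate Tr(O rho) for a product observable O = O_1 x ... x O_n."""
    estimates = []
    for bases, bits in snapshots:
        val = 1.0
        for basis, bit, O_j in zip(bases, bits, obs_factors):
            U = BASIS[basis]
            ket = np.zeros(2)
            ket[bit] = 1.0
            # Inverse of the Pauli measurement channel, applied per qubit:
            # rho_j = 3 U^dag |b><b| U - I.
            rho_j = 3 * U.conj().T @ np.outer(ket, ket) @ U - I2
            val *= np.real(np.trace(O_j @ rho_j))
        estimates.append(val)
    return np.mean(estimates)

# Example: Bell state, estimating <Z x Z> (exact value: +1).
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
Z = np.diag([1.0, -1.0])
shots = classical_shadow(bell, n_qubits=2, n_snapshots=20000)
print(estimate_observable(shots, [Z, Z]))  # approx. 1.0 up to sampling noise
```

Each snapshot is only a list of measurement bases and outcome bits, so the classical description stays succinct, and the same snapshots can be reused to predict many different observables at once.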
My interdisciplinary skill set combines methods from modern computer science with quantum information and has already led to numerous high-impact contributions (e.g. a Nature Physics paper with more than 350 citations and two Science publications). These insights form the basis for this larger project, in which we lay the foundation for scalable and practical quantum data processing and learning that can keep pace with, and grow alongside, future improvements in quantum technology.