Quantum AI: From near term to fault tolerance
Junyu Liu, IBM/University of Chicago
Quantum machine learning, that is, running machine learning algorithms on quantum devices, has been considered a flagship application of quantum computing. In this talk, I will describe two perspectives on quantum machine learning: near-term algorithms and fault-tolerant algorithms. For near-term realizations, I will discuss applications of variational quantum circuits to machine learning problems, and how the theory of the quantum neural tangent kernel can serve as an analytic principle for optimizing quantum neural networks. For fault-tolerant realizations with quantum error correction, I will briefly discuss ongoing work on end-to-end applications of the HHL algorithm, which provides a provable, generic quantum advantage for a class of machine learning problems. Our work shows that fundamental physics research, on topics such as chaos and dissipation, quantum field theory, and quantum gravity, can inform important and timely problems in (quantum) machine learning algorithm design.
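To make the near-term ingredients concrete, below is a minimal, self-contained sketch, not taken from the talk: a toy two-qubit variational circuit simulated directly as a state vector in NumPy, trained by gradient descent with the parameter-shift rule, with a scalar quantum neural tangent kernel computed as the Gram of the parameter-shift gradient. The ansatz, cost, and hyperparameters are illustrative assumptions, not the speaker's actual constructions.

```python
# Illustrative sketch (assumptions, not the talk's method): a two-qubit
# variational circuit, optimized by parameter-shift gradient descent.
import numpy as np

I2 = np.eye(2, dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
# CNOT with qubit 0 as control, in the |00>,|01>,|10>,|11> basis.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def ry(theta):
    """Single-qubit Y-rotation gate RY(theta) = exp(-i theta Y / 2)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def circuit_state(params):
    """Toy ansatz: RY layer, CNOT entangler, RY layer, acting on |00>."""
    state = np.zeros(4, dtype=complex)
    state[0] = 1.0
    state = np.kron(ry(params[0]), ry(params[1])) @ state
    state = CNOT @ state
    state = np.kron(ry(params[2]), ry(params[3])) @ state
    return state

def cost(params):
    """Expectation of Z on qubit 0; training drives it toward -1."""
    psi = circuit_state(params)
    return float(np.real(psi.conj() @ np.kron(Z, I2) @ psi))

def parameter_shift_grad(params):
    """Exact gradient via the parameter-shift rule (shifts of +-pi/2)."""
    grad = np.zeros_like(params)
    for k in range(len(params)):
        shift = np.zeros_like(params)
        shift[k] = np.pi / 2
        grad[k] = 0.5 * (cost(params + shift) - cost(params - shift))
    return grad

params = 0.1 * np.random.default_rng(0).standard_normal(4)
for step in range(100):
    params -= 0.2 * parameter_shift_grad(params)
print("final <Z0> =", cost(params))  # approaches -1

# Scalar quantum neural tangent kernel for this one-output model,
# defined (as in classical NTK theory) as the Gram of the Jacobian.
g = parameter_shift_grad(params)
print("QNTK K(x,x) =", float(g @ g))
```

In the NTK analogy, the printed K is the Gram matrix of the output's Jacobian with respect to the variational parameters; tracking such quantities along training is the kind of analytic handle the quantum neural tangent kernel theory aims to provide.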