Room: Conference Room 5


9:00-9:25 From Statistical Physics to Machine Learning and Back - Giulio Biroli, ENS Paris

9:25-9:50 Training a quantum system to represent classical data - Bert Kappen, Radboud University Nijmegen

9:50-10:15 Exploiting quantum fluctuations for robust and efficient neural network training - Carlo Baldassi, Bocconi University Milan, and Riccardo Zecchina, Bocconi University Milan

10:15-10:40 A Quantum Algorithm for Linear Programming - Miguel Angel Delgado, Universidad Complutense Madrid

10:40-11:10 Coffee Break

11:10-11:35 Reinforcement learning for quantum technologies - Florian Marquardt, Max Planck Institute for the Science of Light, Erlangen

11:35-12:00 High-dimensional random landscapes: critical points statistics and geometrical transitions - Valentina Ros, Paris Saclay

12:00-12:25 TBA - Remi Monasson, ENS Paris

12:25-14:00 Lunch

14:00-14:25 Machine Learning for Molecules and Materials - Matthias Rupp, Fritz Haber Institute, Berlin

14:25-14:50 Optimal Quantum Control with Digitized Quantum Annealing - Giuseppe Santoro, SISSA Trieste

14:50-15:20 Tea Break

15:20-15:45 Quantum machine learning: a classical perspective - Andrea Rocchetto, Oxford University

15:45-16:10 TBA - Florent Krzakala, ENS Paris

16:30-18:30 Internal discussion of the ELLIS program on Quantum and Physics-based Machine Learning



Abstracts


From Statistical Physics to Machine Learning and Back

Abstract

In this short talk I would like to address and discuss two different research themes. (1) Dynamics and landscapes in Machine Learning: understanding the interplay between the geometry of the risk (loss, energy) landscape and the dynamics taking place on it is a central question in several problems of high-dimensional statistics as well as for deep neural networks. I will outline questions, directions and some ongoing work on these topics. (2) Machine Learning and Quantum Many-Body Physics: Deep Neural Networks are a promising tool to study strongly interacting quantum particle systems. A main challenge that I would like to address is to understand the “complexity” of physical many-body wave-functions and the expressive power of DNNs for representing them, and how this relates to the nature of quantum phases, e.g. to many-body localisation, integrability, ground states vs. high-energy states, etc.



Training a quantum system to represent classical data

Abstract

Machine learning is all about representing data in high-dimensional probability models. A key computational bottleneck is statistical inference, i.e. computing statistics in these models, which is often done by time-consuming Monte Carlo sampling. In principle, quantum systems could provide an alternative for these computations. If one can implement a probability distribution in a quantum state, the statistics can be obtained by repeated measurement, which accelerates the inference computation. This could potentially be realized by a form of Noisy Intermediate-Scale Quantum (NISQ) technology. In this talk we show how the quantum Boltzmann machine (QBM) can represent a classical data distribution as the ground state of a quantum Hamiltonian system. The QBM can learn many more supervised and unsupervised problems than the classical Boltzmann Machine. In addition to computational efficiency, the quantum implementation may also yield novel functionality. The quantum state represents quantum statistics that result from entanglement, signal non-local events that violate the Bell inequality, and increase the mutual information between subsystems. At the same time, these statistics are fully consistent with the learned classical data distribution. We propose to investigate how these quantum features can be exploited in machine learning applications.
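
As background, one schematic way to write down such a representation (the notation below is assumed for illustration, not taken from the talk): a parametrized Hamiltonian H_theta has a ground state whose computational-basis probabilities play the role of the model distribution, and training maximizes the log-likelihood of the classical data distribution q(s):

```latex
% Schematic quantum Boltzmann machine setup (notation assumed, not from the talk):
% H_\theta is a parametrized quantum Hamiltonian with ground state |\psi_0(\theta)\rangle.
H_\theta\,|\psi_0(\theta)\rangle = E_0(\theta)\,|\psi_0(\theta)\rangle,
\qquad
p_\theta(s) = \big|\langle s|\psi_0(\theta)\rangle\big|^{2},
\qquad
\mathcal{L}(\theta) = \sum_{s} q(s)\,\log p_\theta(s).
```

Here q(s) is the empirical classical data distribution; once the parameters are learned, sampling from p_theta(s) only requires preparing the state and measuring it repeatedly in the computational basis, which is the inference speed-up alluded to above.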



Exploiting quantum fluctuations for robust and efficient neural network training

Abstract

Training a neural network on a classification task typically amounts to optimizing a highly non-convex objective function. The analysis of simple (but phenomenologically rich) models shows that efficient learning is made possible by the existence of rare but very dense regions of optimal configurations, which have appealing robustness and generalization properties. Learning algorithms can be explicitly designed to target such regions. In a recent work we showed that Quantum Annealing is also attracted to these states and can therefore be used as an efficient tool for training neural networks (leading to an exponential speed-up compared with classical Simulated Annealing). More generally, our theory on how to efficiently target these regions could provide efficient algorithmic solutions for non-standard architectures based on custom hardware.
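
For orientation, the classical baseline that the abstract compares against is simulated annealing over a non-convex training cost. Below is a minimal sketch of that baseline on a toy binary perceptron; all sizes and schedules are illustrative assumptions, and neither the dense-region-targeting algorithms nor the quantum annealing of the talk are reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy teacher-student setup with binary weights (sizes are illustrative only).
N, P = 101, 150
teacher = rng.choice([-1, 1], size=N)
X = rng.choice([-1, 1], size=(P, N))
y = np.sign(X @ teacher)

def n_errors(w):
    # Number of misclassified patterns: the non-convex cost being annealed.
    return int(np.sum(np.sign(X @ w) != y))

# Classical simulated annealing with single-weight flips (thermal fluctuations);
# quantum annealing would instead introduce quantum fluctuations, e.g. via a
# transverse field, over the same cost landscape.
w = rng.choice([-1, 1], size=N)
E = n_errors(w)
steps = 50_000
for t in range(steps):
    beta = 0.5 + 4.0 * t / steps        # inverse-temperature schedule
    i = rng.integers(N)
    w[i] *= -1                          # propose a single weight flip
    E_new = n_errors(w)
    if E_new <= E or rng.random() < np.exp(-beta * (E_new - E)):
        E = E_new                       # accept the move
    else:
        w[i] *= -1                      # reject: undo the flip
print("remaining training errors:", E)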



Reinforcement learning for quantum technologies

Abstract

Modern tools of reinforcement learning can be used to have neural networks discover quantum feedback strategies from scratch. This makes it possible to find optimized, hardware-tailored schemes. I will illustrate this for the case of quantum error correction, where a classical network agent learns to apply unitary gates and measurements to a noisy quantum memory, trying to prolong the memory's coherence time.
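
To make the agent-environment picture concrete, here is a heavily simplified policy-gradient (REINFORCE) sketch. The environment is a made-up stand-in, not the physics model of the talk: each discrete action keeps a "memory" alive with a fixed probability, and the episode return is the number of steps survived, a crude proxy for coherence time.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in environment: per-action survival probabilities are made up.
N_ACTIONS = 4                                   # e.g. idle, measure, correct, ...
SURVIVAL = np.array([0.90, 0.95, 0.97, 0.85])

theta = np.zeros(N_ACTIONS)                     # logits of a stateless softmax policy

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def run_episode():
    p, actions, steps = softmax(theta), [], 0
    while steps < 200:
        a = rng.choice(N_ACTIONS, p=p)
        actions.append(a)
        steps += 1
        if rng.random() > SURVIVAL[a]:          # the memory decohered
            break
    return actions, steps

# REINFORCE: increase the log-probability of actions taken in long-lived episodes.
baseline, lr = 0.0, 0.05
for episode in range(3000):
    actions, ret = run_episode()
    baseline += 0.05 * (ret - baseline)         # running-average baseline
    p = softmax(theta)
    for a in actions:
        grad = -p.copy()
        grad[a] += 1.0                          # gradient of log pi(a) w.r.t. theta
        theta += lr * (ret - baseline) * grad / len(actions)

print("learned action probabilities:", np.round(softmax(theta), 3))
```

The real setting replaces this toy with a neural-network policy acting on measurement records of a simulated noisy quantum memory.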



High-dimensional random landscapes: critical points statistics and geometrical transitions

Abstract

Understanding the statistical properties of complex, high-dimensional functionals (or landscapes) is a central problem in many contexts, including computer science and machine learning. In this short talk I will focus on the statistical distribution of the stationary points of random landscapes, which play a crucial role in determining the evolution of local dynamics within the landscape. I will mention analytical methods to compute these statistics and discuss an application to a “signal vs noise” problem relevant in the context of high-dimensional statistical inference.
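
For context, a standard starting point for such computations is the Kac-Rice formula for the expected number of stationary points of a random landscape f on a domain D (stated here as general background, not as the specific results of the talk):

```latex
% Kac-Rice formula for the expected number of stationary points of f on a domain D:
\mathbb{E}[\mathcal{N}]
  \;=\; \int_{D} \mathbb{E}\Big[\, \big|\det \nabla^{2} f(x)\big| \;\Big|\; \nabla f(x)=0 \Big]\,
        p_{\nabla f(x)}(0)\, dx ,
```

where p_{\nabla f(x)}(0) is the density of the gradient evaluated at zero; in high dimension the conditional expectation of the Hessian determinant is typically handled with random-matrix techniques.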



Machine Learning for Molecules and Materials

Abstract

The computational and experimental study of atomistic systems is at the heart of physics, chemistry and materials science, but the necessary measurements are often laborious and costly to obtain. We are interested in machine learning for the prediction and analysis of such data, in order to increase the number of accessible systems and to enable the accelerated study and design of atomistic systems at scale and at reduced cost. An example is accurate interpolation between quantum-mechanical numerical simulations, enabling, for example, the screening of large molecular or materials spaces for novel compounds with tailored properties. I will provide a brief overview of our contributions in this area, emphasizing distinctive traits of this setting and the role of domain knowledge, as well as two specific examples of our work: the many-body tensor representation, an exact representation of finite and periodic atomistic systems, and a rigorous quantitative assessment of prediction errors and predictive uncertainties of machine learning models for molecules and materials.
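
As a minimal illustration of the kind of interpolation described above, here is a kernel-ridge-regression sketch on synthetic stand-in data. The descriptors, targets and hyperparameters are placeholders; the talk's actual representations and uncertainty assessments are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in data: each row is a fixed-length descriptor of a molecule or material
# (e.g. a vectorized many-body tensor), the target is a computed property such as
# an atomization energy. Real data would come from quantum-mechanical simulations.
X_train = rng.normal(size=(200, 50))
y_train = X_train[:, :5].sum(axis=1) + 0.01 * rng.normal(size=200)
X_test = rng.normal(size=(20, 50))

def gaussian_kernel(A, B, sigma):
    # Gaussian (RBF) kernel matrix from pairwise squared Euclidean distances.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

sigma, lam = 5.0, 1e-8                                        # width, regularizer
K = gaussian_kernel(X_train, X_train, sigma)
alpha = np.linalg.solve(K + lam * np.eye(len(K)), y_train)    # KRR coefficients
y_pred = gaussian_kernel(X_test, X_train, sigma) @ alpha      # interpolated property
print("first predictions:", np.round(y_pred[:3], 3))
```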



Optimal Quantum Control with Digitized Quantum Annealing

Abstract

Various optimization problems that arise naturally in science are frequently solved by heuristic algorithms. One recent quantum-based algorithm, potentially promising for implementation on near-term devices, is the Quantum Approximate Optimization Algorithm (QAOA) introduced by Farhi et al. (arXiv:1411.4028). I will give an overview of the algorithm, discussing its connection with Quantum Annealing and Quantum Control. In particular, I will show how a digitized version of QA can be made optimal, realizing the best possible solution allowed by quantum mechanics in the shortest time, and without spectral information. I will illustrate this on a simple benchmark problem, MaxCut on a 2-regular graph, equivalent to an unfrustrated antiferromagnetic Ising chain, with the standard transverse field used to introduce quantum fluctuations.
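
For reference, the plain QAOA ansatz of Farhi et al. takes the form below (standard notation; the digitized-QA optimal-control results of the talk go beyond this bare form):

```latex
% Plain QAOA ansatz: p alternating layers of cost and transverse-field unitaries.
|\gamma,\beta\rangle
  \;=\; e^{-i\beta_p H_x}\, e^{-i\gamma_p H_z} \cdots e^{-i\beta_1 H_x}\, e^{-i\gamma_1 H_z}\,
        |+\rangle^{\otimes n},
\qquad
H_x = \sum_{i} \sigma^{x}_{i},
\qquad
H_z = \sum_{i} \sigma^{z}_{i}\,\sigma^{z}_{i+1}.
```

For MaxCut on a 2-regular graph, minimizing H_z (an antiferromagnetic Ising coupling along the ring) is equivalent, up to an additive constant, to maximizing the cut, and the 2p angles (gamma, beta) are chosen to minimize the expectation value of H_z in the state above.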



Quantum machine learning: a classical perspective

Abstract

Can quantum computers speed up machine learning algorithms? During this talk I would like to address this question from the perspective of classical learning theory. First, I will introduce the quantum PAC model, a mathematical framework for rigorously formulating quantum learning problems, and present a case where quantum resources can give a quasi-exponential speedup. Second, I will discuss how randomised numerical linear algebra techniques that have been developed for machine learning tasks, such as the Nyström method, can be used to efficiently approximate quantum Hamiltonian evolutions.
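
As background on the classical side, here is a minimal sketch of the Nyström approximation mentioned above, on synthetic data. The kernel, sizes and landmark count are illustrative assumptions; the application to quantum Hamiltonian evolutions discussed in the talk is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in feature vectors; in practice these would be training points whose full
# kernel matrix is too large to form and manipulate exactly.
X = rng.normal(size=(1000, 20))

def rbf(A, B, sigma=3.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Nystrom method: pick m landmark points and approximate the full n x n kernel
# matrix by K_nm K_mm^+ K_nm^T, which needs only n x m kernel evaluations.
m = 100
landmarks = rng.choice(len(X), size=m, replace=False)
K_nm = rbf(X, X[landmarks])
K_mm = rbf(X[landmarks], X[landmarks])
K_approx = K_nm @ np.linalg.pinv(K_mm) @ K_nm.T

# Sanity check against the exact kernel on a small block.
K_exact = rbf(X[:50], X[:50])
print("max abs error on a 50x50 block:",
      float(np.abs(K_exact - K_approx[:50, :50]).max()))
```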