Joint Structures and Common Foundations of Statistical Physics, Information Geometry and Inference for Learning (SP+IG'20)

Date: 26th July to 31st July 2020
Location: Ecole de Physique des Houches
https://houches.univ-grenoble-alpes.fr/
149 Chemin de la Côte, F-74310 Les Houches, France
(+33/0) 4 57 04 10 40
Download the poster in PDF.

To submit a short paper or poster, please use the EasyChair conference system:
https://easychair.org/conferences/?conf=spig20

Scientific rationale:
In the middle of the last century, Léon Brillouin in "Science and Information Theory" and André Blanc-Lapierre in "Statistical Mechanics" forged, as precursors, the first links between Information Theory and Statistical Physics. In the context of Artificial Intelligence, machine learning algorithms draw on more and more methodological tools coming from Physics and Statistical Mechanics. The laws and principles that underpin this physics can shed new light on the conceptual basis of Artificial Intelligence. Thus, the principles of Maximum Entropy and Minimum Free Energy, the Gibbs-Duhem thermodynamic potentials, and the generalization of François Massieu's notion of characteristic functions enrich the variational formalism of machine learning. Conversely, the pitfalls encountered by Artificial Intelligence in extending its application domains question the foundations of Statistical Physics: the construction of stochastic gradients in high dimension, or the generalization of the notion of Gibbs densities to more elaborate representation spaces, such as data on homogeneous differential or symplectic manifolds, Lie groups, graphs, tensors, ...

Sophisticated statistical models were introduced very early to deal with unsupervised learning tasks related to the Ising-Potts models of Statistical Physics (the Ising-Potts model describes the interaction of spins arranged on a graph) and, more generally, to Markov random fields. Ising models are associated with mean-field theory: the study of systems with complex interactions through simplified models in which the action of the complete network on one actor is summarized by a single average interaction. The porosity between the two disciplines has been established since the birth of Artificial Intelligence, with the use of Boltzmann machines and the problem of robust methods for computing the partition function.

More recently, gradient algorithms for neural network learning have used large-scale, robust extensions of the natural gradient of Fisher-based Information Geometry (to ensure reparameterization invariance) and stochastic gradients based on the Langevin equation (to provide regularization), or their coupling, called "Natural Langevin Dynamics" (a minimal sketch is given after this rationale).

Concomitantly, during the last fifty years, Statistical Physics has been the object of new geometric formalizations (contact geometry, symplectic geometry, ...) that attempt to give a covariant formulation to the thermodynamics of dynamical systems. We can mention the extension of the symplectic models of Geometric Mechanics to Statistical Mechanics, as well as other developments such as Random Mechanics, Geometric Mechanics in its stochastic version, Lie Groups Thermodynamics, and the geometric modeling of phase transition phenomena.

Finally, we refer to Computational Statistical Physics, which develops efficient numerical methods for sampling high-dimensional and multimodal probability measures (sampling of Boltzmann-Gibbs measures, free-energy calculations, metastable dynamics and rare events, ...) and studies geometric integrators (Hamiltonian dynamics, symplectic integrators, ...) with good covariance and stability properties (use of symmetries, preservation of invariants, ...). Machine learning inference processes are only just beginning to adapt these integration schemes, and their remarkable stability properties, to increasingly abstract data representation spaces (a sketch of a symplectic integrator is also given below).
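To make the connection concrete, here is a minimal sketch (in Python) of a natural-gradient Langevin update of the kind referred to above as "Natural Langevin Dynamics": a gradient step preconditioned by an approximate inverse Fisher information matrix, plus matched Langevin noise. The toy quadratic loss, the fixed Fisher estimate, and all names (toy_loss_grad, fisher_estimate, natural_langevin_step) are illustrative assumptions, not material from the conference.

import numpy as np

rng = np.random.default_rng(0)

def toy_loss_grad(theta):
    # Gradient of an assumed toy quadratic loss L(theta) = 0.5 * theta^T A theta.
    A = np.array([[3.0, 0.5],
                  [0.5, 1.0]])
    return A @ theta

def fisher_estimate(theta):
    # Stand-in for an empirical Fisher information estimate; a fixed
    # symmetric positive-definite matrix is used purely for illustration.
    return np.array([[2.0, 0.2],
                     [0.2, 0.5]])

def natural_langevin_step(theta, eps):
    G_inv = np.linalg.inv(fisher_estimate(theta))
    # Natural-gradient drift: -eps * G^{-1} grad L(theta); preconditioning
    # by the inverse Fisher matrix gives reparameterization invariance.
    drift = -eps * G_inv @ toy_loss_grad(theta)
    # Langevin diffusion: sqrt(2*eps) * G^{-1/2} xi with xi ~ N(0, I),
    # so the noise covariance matches the preconditioner.
    noise = np.sqrt(2.0 * eps) * np.linalg.cholesky(G_inv) @ rng.standard_normal(2)
    return theta + drift + noise

theta = np.array([2.0, -1.0])
for _ in range(1000):
    theta = natural_langevin_step(theta, eps=1e-2)
print(theta)  # iterates fluctuate around the loss minimum at the origin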
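Likewise, the following is a minimal sketch of a symplectic integrator, the classical leapfrog (Störmer-Verlet) scheme for a separable Hamiltonian H(q, p) = p^2/(2m) + V(q), illustrating the invariant-preserving integration schemes mentioned in the last paragraph; the harmonic potential in the usage example is an arbitrary illustrative choice.

def leapfrog(q, p, grad_V, eps, n_steps, m=1.0):
    # Stormer-Verlet ("leapfrog") steps for H(q, p) = p^2/(2m) + V(q).
    # The scheme is symplectic: it preserves phase-space volume exactly
    # and the energy approximately, with no secular drift over long runs.
    for _ in range(n_steps):
        p = p - 0.5 * eps * grad_V(q)  # half kick
        q = q + eps * p / m            # full drift
        p = p - 0.5 * eps * grad_V(q)  # half kick
    return q, p

# Harmonic oscillator, V(q) = q^2 / 2 (an illustrative choice): the energy
# 0.5 * p**2 + 0.5 * q**2 remains close to its initial value 0.5.
q, p = leapfrog(q=1.0, p=0.0, grad_V=lambda q: q, eps=0.1, n_steps=1000)
print(q, p, 0.5 * p**2 + 0.5 * q**2)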
Artificial Intelligence currently uses only a very limited part of the conceptual and methodological tools of Statistical Physics. The purpose of this conference is to encourage constructive dialogue around a common foundation, allowing new principles and laws governing the two disciplines to be established in a unified approach. But it is also about exploring new paths off the beaten track.

Organizers:
 Frédéric Barbaresco, THALES, KTD PCC, Palaiseau, France
 Silvère Bonnabel, Mines ParisTech, CAOR, Paris, France
 François Gay-Balmaz, Ecole Normale Supérieure Ulm, CNRS & LMD, Paris, France
 Patrick Iglesias-Zemmour, Université de Marseille, I2M, Marseille, France
 Bernhard Maschke, Université Claude Bernard, LAGEPP, Lyon, France
 Eric Moulines, Ecole Polytechnique, CMAP, Palaiseau, France
 Frank Nielsen, Sony Computer Science Laboratories, Tokyo, Japan and Ecole Polytechnique, France
 Géry de Saxcé, Université de Lille, LAM3, Lille, France