Antonia M Frassino (Frankfurt Institute for Advanced Studies, Germany) : Deep learning and the universal quantum simulator, as envisioned by Feynman, are machines able to build representations and can be used as efficient generative models. The representational power of deep neural networks has been verified experimentally, while the theoretical reasons behind it remain unclear. The generative power of universal quantum simulators has been proved rigorously, while their actual construction is a topic of active research in physics.
Quantum states and the relative entropy (KL divergence) unveil the geometrical structure of the two computational models and show how information shapes their representation spaces.
Here we draw a parallel between the two systems, addressing the problems and the open questions.
In particular, we focus on the landscape of the empirical risk in the training of deep neural networks and the role of entanglement entropy in quantum computation.
The purpose of the analysis is to study how information geometry models their links with Nature.
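The relative entropy invoked above has a simple discrete form; a minimal numerical sketch (an illustration only, not part of the talk), assuming two classical probability vectors:

```python
import numpy as np

def kl_divergence(p, q):
    """Relative entropy D(p || q) = sum_i p_i * log(p_i / q_i)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute zero by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])
print(kl_divergence(p, p))      # 0.0: a distribution has zero divergence from itself
print(kl_divergence(p, q) >= 0) # True: Gibbs' inequality
```

The asymmetry of D(p || q) is what gives the divergence its dualistic, rather than metric, geometry.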
Anton Mallasto (University of Copenhagen, Denmark) : We introduce a novel framework for statistical analysis of populations of non-degenerate Gaussian processes (GPs), which are natural representations of uncertain curves. This allows inherent variation or uncertainty in function-valued data to be properly incorporated in the population analysis. Using the 2-Wasserstein metric, we geometrize the space of GPs with L^2 mean and covariance functions over compact index spaces. We present results on existence and uniqueness of the barycenter of a population of GPs, as well as convergence of the metric and the barycenter of their finite-dimensional counterparts. This justifies practical computations. Finally, we demonstrate our framework through experimental validation on GP datasets representing brain connectivity and climate development.
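The 2-Wasserstein metric and its barycenter have a well-known closed form in the simplest finite-dimensional case, univariate Gaussians; a sketch of that special case (an illustration of the metric, not the authors' GP framework):

```python
import math

def gaussian_w2(m1, s1, m2, s2):
    """2-Wasserstein distance between N(m1, s1^2) and N(m2, s2^2):
    W2^2 = (m1 - m2)^2 + (s1 - s2)^2 in the univariate Gaussian case."""
    return math.sqrt((m1 - m2) ** 2 + (s1 - s2) ** 2)

def w2_barycenter(params):
    """W2 barycenter of univariate Gaussians (uniform weights): the Gaussian
    whose mean and standard deviation are the averages of the inputs."""
    ms = [m for m, _ in params]
    ss = [s for _, s in params]
    return sum(ms) / len(ms), sum(ss) / len(ss)

print(gaussian_w2(0.0, 1.0, 3.0, 5.0))           # 5.0
print(w2_barycenter([(0.0, 1.0), (2.0, 3.0)]))   # (1.0, 2.0)
```

In higher dimensions the squared distance replaces (s1 - s2)^2 with a trace term involving matrix square roots of the covariances, and the infinite-dimensional GP setting of the talk extends this to covariance operators.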
Hiroshi Matsuzoe (Nagoya Institute of Technology, Japan) : A survey on infinite dimensional affine differential geometry and information geometry. The main object of affine differential geometry is to study hypersurfaces or immersions that are affinely congruent in an affine space. It is known that dual affine connections and statistical manifold structures naturally arise in this framework. For example, a statistical manifold structure of an exponential family is realized by an affine hypersurface immersion, and the Kullback-Leibler divergence coincides with the affine support function. The Legendre transformation can be discussed by a codimension-two affine immersion of a special kind. Therefore, the geometry of dually flat spaces can be generalized by affine differential geometry. In this presentation, we give a short survey of the relations between the infinite dimensional framework of affine differential geometry and information geometry. In particular, we discuss the infinite dimensional affine differential geometry of the maximal exponential families and the alpha-families.
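The finite-dimensional version of the exponential-family statement above is the standard dually flat computation (recalled here as background, not as part of the survey): for a family with natural parameter $\theta$, sufficient statistic $F$, and cumulant function $\psi$,

```latex
p_\theta(x) = \exp\bigl(\langle \theta, F(x)\rangle - \psi(\theta)\bigr),
\qquad
D_{\mathrm{KL}}(p_\theta \,\|\, p_{\theta'})
 = \psi(\theta') - \psi(\theta) - \langle \theta' - \theta,\, \nabla\psi(\theta)\rangle,
```

i.e. the KL divergence is the Bregman divergence of $\psi$, with the dual (expectation) coordinates $\eta = \nabla\psi(\theta)$ given by the Legendre transformation. The affine support function of the corresponding hypersurface immersion reproduces exactly this quantity.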