Groups

Author: Dominique Spehner
Institution: Université Joseph Fourier, Grenoble, Institut Fourier, France.
Website: https://www-fourier.ujf-grenoble.fr/~spehner/
Video: http://www.youtube.com/watch?v=5Nj5afyivI8
Slides: https://drive.google.com/open?id=0B0QKxsVtOaTiR3lWZ1dUSVlhakE
Presentation: https://www.see.asso.fr/node/14277
Creative Commons Attribution-ShareAlike 4.0 International
Abstract:
I will show that the set of states of a quantum system with a finite-dimensional Hilbert space can be equipped with various Riemannian distances having nice properties from a quantum information viewpoint, namely they are contractive under all physically allowed operations on the system. The corresponding metrics are quantum analogs of the Fisher metric and have been classified by D. Petz. Two distances are particularly relevant physically: the Bogoliubov-Kubo-Mori distance studied by R. Balian, Y. Alhassid and H. Reinhardt, and the Bures distance studied by A. Uhlmann and by S.L. Braunstein and C.M. Caves. The latter gives the quantum Fisher information, which plays an important role in quantum metrology. A way to measure the amount of quantum correlations (entanglement or quantum discord) in bipartite systems (that is, systems composed of two parties) with the help of these distances will also be discussed.
References:
 D. Petz, Monotone Metrics on Matrix Spaces, Lin. Alg. and its Appl. 244, 81-96 (1996)
 R. Balian, Y. Alhassid, and H. Reinhardt, Dissipation in many-body systems: a geometric approach based on information theory, Phys. Rep. 131, 1 (1986)
 R. Balian, The entropy-based quantum metric, Entropy 16(7), 3878-3888 (2014)
 A. Uhlmann, The "transition probability" in the state space of a *-algebra, Rep. Math. Phys. 9, 273-279 (1976)
 S.L. Braunstein and C.M. Caves, Statistical Distance and the Geometry of Quantum States, Phys. Rev. Lett. 72, 3439-3443 (1994)
 D. Spehner, Quantum correlations and distinguishability of quantum states, J. Math. Phys. 55, 075211 (2014)
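As an illustration of the Bures distance discussed in the abstract, here is a minimal numerical sketch (not from the talk) computing the Uhlmann fidelity and the Bures distance for qubit density matrices with plain NumPy:

```python
import numpy as np

def sqrtm_psd(a):
    # Matrix square root of a Hermitian positive semidefinite matrix.
    w, v = np.linalg.eigh(a)
    w = np.clip(w, 0.0, None)
    return (v * np.sqrt(w)) @ v.conj().T

def fidelity(rho, sigma):
    # Uhlmann fidelity F(rho, sigma) = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))^2.
    s = sqrtm_psd(rho)
    return np.real(np.trace(sqrtm_psd(s @ sigma @ s))) ** 2

def bures_distance(rho, sigma):
    # Bures distance D_B = sqrt(2 (1 - sqrt(F))).
    return np.sqrt(max(2.0 * (1.0 - np.sqrt(fidelity(rho, sigma))), 0.0))

# Two qubit states: the pure state |0><0| and the maximally mixed state I/2.
rho = np.array([[1.0, 0.0], [0.0, 0.0]])
sigma = 0.5 * np.eye(2)
```

For this pair the fidelity is 1/2, so the Bures distance is sqrt(2 - sqrt(2)).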
Bio:
 Diplôme d'Études Approfondies (DEA) in Theoretical Physics at the École Normale Supérieure de Lyon, 1994
 Civil Service (Service National de la Coopération), Technion Institute of Technology, Haifa, Israel, 1995-1996
 PhD in Theoretical Physics, Université Paul Sabatier, Toulouse, France, 1996-2000
 Postdoctoral fellow, Pontificia Universidad Católica, Santiago, Chile, 2000-2001
 Research Associate, University of Duisburg-Essen, Germany, 2001-2005
 Maître de Conférences, Université Joseph Fourier, Grenoble, France, 2005-present
 Habilitation à diriger des Recherches (HDR), Université Grenoble Alpes, 2015
 Member of the Institut Fourier (since 2005) and the Laboratoire de Physique et Modélisation des Milieux Condensés (since 2013) of Université Grenoble Alpes, France

Author(s): Marc Arnaudon
Institution: Institut de Mathématiques de Bordeaux (IMB), CNRS : UMR 5251, Université de Bordeaux, France
Website: http://www.math.u-bordeaux1.fr/~marnaudo/
Video: http://www.youtube.com/watch?v=1mKs_akkEuw
Slides: Arnaudon_Stochastic EulerPoincare reduction.pdf
Presentation: https://www.see.asso.fr/node/13650
Creative Commons Attribution-ShareAlike 4.0 International
Abstract:
We will prove an Euler-Poincaré reduction theorem for stochastic processes taking values in a Lie group, which is a generalization of the Lagrangian version of reduction and its associated variational principles. We will also show examples of its application to the rigid body and to the group of diffeomorphisms, which includes the Navier-Stokes equation on a bounded domain and the Camassa-Holm equation.
References:
 M. Arnaudon, A.B. Cruzeiro and X. Chen, "Stochastic Euler-Poincaré Reduction", Journal of Mathematical Physics, to appear
 V.I. Arnold and B. Khesin, "Topological methods in hydrodynamics", Applied Math. Series 125, Springer (1998)
 J.M. Bismut, "Mécanique aléatoire", Lecture Notes in Mathematics 866, Springer (1981)
 D.G. Ebin and J.E. Marsden, "Groups of diffeomorphisms and the motion of an incompressible fluid", Ann. of Math. 92, 102-163 (1970)
 J.E. Marsden and T.S. Ratiu, "Introduction to Mechanics and Symmetry: a basic exposition of classical mechanical systems", Springer, Texts in Applied Math. (2003)
Bio:
Marc Arnaudon was born in France in 1965. He graduated from the École Normale Supérieure de Paris, France, in 1991. He received the PhD degree in mathematics and the Habilitation à diriger des Recherches degree from Strasbourg University, France, in January 1994 and January 1998 respectively. After postdoctoral research and teaching at Strasbourg, in September 1999 he took up a full professor position in the Department of Mathematics at Poitiers University, France, where he was the head of the Probability Research Group. In January 2013 he left Poitiers and joined the Department of Mathematics of Bordeaux University, France, where he is a full professor in mathematics.
Prof. Arnaudon is an expert in stochastic differential geometry and stochastic calculus on manifolds; he has published over 50 articles in mathematical and physical journals.
Author(s): Anna-Lena Kißlinger, Wolfgang Stummer
DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_74
Video: http://www.youtube.com/watch?v=lmmIXF0ziCk
Slides: https://drive.google.com/open?id=0B0QKxsVtOaTib0V6RTNodGlKbUk
Presentation: https://www.see.asso.fr/node/14296
Creative Commons Attribution-ShareAlike 4.0 International
Abstract:
Scaled Bregman distances (SBD) have turned out to be useful tools for simultaneous estimation and goodness-of-fit testing in parametric models of random data (streams, clouds). We show how SBD can additionally be used for model preselection (structure detection), i.e. for finding appropriate candidates of model (sub)classes in order to support a desired decision under uncertainty. For this, we exemplarily concentrate on the context of nonlinear recursive models with additional exogenous inputs; as special cases we include nonlinear regressions, linear autoregressive models (e.g. AR, ARIMA, SARIMA time series), and nonlinear autoregressive models with exogenous inputs (NARX). In particular, we outline a corresponding information-geometric 3D computer-graphical selection procedure. Some sample-size asymptotics is given as well.
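As background for readers unfamiliar with SBD, here is a minimal sketch of one common form of scaled Bregman distance on discrete distributions; the generator phi and the scaling M = Q are illustrative choices, not the paper's setup:

```python
import numpy as np

def scaled_bregman(p, q, m, phi, dphi):
    # B_phi(P, Q | M) = sum_x m [ phi(p/m) - phi(q/m) - phi'(q/m) (p/m - q/m) ]
    u, v = p / m, q / m
    return float(np.sum(m * (phi(u) - phi(v) - dphi(v) * (u - v))))

phi = lambda t: t * np.log(t)     # generator of Kullback-Leibler type
dphi = lambda t: np.log(t) + 1.0  # its derivative

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
m = q.copy()                      # scaling M = Q recovers the plain KL divergence

kl = float(np.sum(p * np.log(p / q)))
```

With this scaling the SBD collapses to KL(P||Q); other choices of M interpolate between different classical divergences.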
Author(s): Alain Sarlette
DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_73
Video: http://www.youtube.com/watch?v=tbUBxuawaMg
Slides: Sarlette_operational viewpoint.pdf
Presentation: https://www.see.asso.fr/node/14295
Creative Commons Attribution-ShareAlike 4.0 International
Abstract:
This paper highlights some more examples of maps that follow a recently introduced "symmetrization" structure behind the average consensus algorithm. We review, among others, some generalized consensus settings and coordinate descent optimization.
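For readers new to the topic, here is a minimal sketch of the average consensus iteration that the symmetrization structure generalizes; the cycle graph and the particular weights are illustrative choices:

```python
import numpy as np

# Average consensus on a cycle of n agents: each agent repeatedly replaces
# its value by a convex combination of its own and its two neighbors' values.
n = 5
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25      # doubly stochastic mixing matrix

x = np.array([1.0, 5.0, 2.0, 8.0, 4.0])
avg = x.mean()
for _ in range(200):
    x = W @ x                     # the average is invariant at every step
```

Because W is doubly stochastic with spectral gap, the iterates converge to the common average while preserving it exactly at each step.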
Author(s): Frank Nielsen, Gautier Marti, Philippe Donnat, Philippe Very
DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_72
Video: http://www.youtube.com/watch?v=wgb6olnmO50
Slides: https://drive.google.com/open?id=0B0QKxsVtOaTiYUpPWkc2WkdRaDA
Presentation: https://www.see.asso.fr/node/14294
Creative Commons Attribution-ShareAlike 4.0 International
Abstract:
We present in this paper a novel nonparametric approach for clustering independent identically distributed stochastic processes. We introduce a preprocessing step consisting in mapping multivariate independent and identically distributed samples from random variables to a generic nonparametric representation which factorizes the dependence structure and the marginal distributions apart without losing any information. An associated metric is defined in which the balance between dependence and marginal-distribution information is controlled by a single parameter. This mixing parameter can be learned or set by a practitioner; this use is illustrated on the case of clustering financial time series. Experiments, implementation and results obtained on public financial time series are available online at http://www.datagrapple.com.
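A rough sketch of the kind of representation the abstract describes, splitting a sample into a rank (dependence) part and a sorted-value (marginal) part with a mixing parameter; the function names and the specific component distances are illustrative assumptions, not the authors' exact construction:

```python
import numpy as np

def representation(x):
    # Split a sample into a dependence part (normalized ranks, i.e. the
    # empirical copula transform) and a marginal part (sorted values);
    # the pair retains all the information in the original sample.
    n = len(x)
    ranks = np.argsort(np.argsort(x)) / (n - 1.0)
    return ranks, np.sort(x)

def mixed_distance(x, y, lam):
    # Blend a dependence distance (on ranks) with a marginal distance
    # (on empirical quantiles), balanced by a single parameter lam in [0, 1].
    rx, sx = representation(x)
    ry, sy = representation(y)
    d_dep = np.sqrt(np.mean((rx - ry) ** 2))
    d_marg = np.sqrt(np.mean((sx - sy) ** 2))
    return lam * d_dep + (1.0 - lam) * d_marg
```

Setting lam = 1 compares only co-movement (ranks), lam = 0 compares only the marginal distributions, and intermediate values trade the two off.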
Author(s): Garvesh Raskutti, Sayan Mukherjee
DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_39
Video: http://www.youtube.com/watch?v=PKujdGuu5Bc
Slides: Monod_Information geomerty mirror descent.pdf
Presentation: https://www.see.asso.fr/node/14293
Creative Commons Attribution-ShareAlike 4.0 International
Abstract:
We prove the equivalence of two online learning algorithms, mirror descent and natural gradient descent. Both mirror descent and natural gradient descent are generalizations of online gradient descent for the case where the parameter of interest lies on a non-Euclidean manifold. Natural gradient descent selects the steepest descent direction along a Riemannian manifold by multiplying the standard gradient by the inverse of the metric tensor. Mirror descent induces non-Euclidean structure by solving iterative optimization problems using different proximity functions. In this paper, we prove that mirror descent induced by a Bregman divergence proximity function is equivalent to the natural gradient descent algorithm on the Riemannian manifold in the dual coordinate system. We use techniques from convex analysis and connections between Riemannian manifolds, Bregman divergences and convexity to prove this result. This equivalence between natural gradient descent and mirror descent implies that (1) mirror descent is the steepest descent direction along the Riemannian manifold corresponding to the choice of Bregman divergence and (2) mirror descent with log-likelihood loss applied to parameter estimation in exponential families asymptotically achieves the classical Cramér-Rao lower bound.
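The claimed equivalence can be checked numerically in the simplest exponential family. The following sketch (an illustration, not the paper's proof) runs mirror descent on the Bernoulli natural parameter and natural gradient descent in the expectation parameter, and observes that the iterates coincide:

```python
import numpy as np

# Bernoulli exponential family: natural parameter theta, log-partition
# psi(theta) = log(1 + e^theta), expectation parameter eta = sigmoid(theta) = psi'(theta).
def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

xbar = 0.7     # sample mean of the observed binary data
alpha = 0.1    # step size

# Mirror descent with the Bregman divergence of psi: the update
# eta <- psi'(theta) - alpha * dL/dtheta takes place in the dual coordinate.
eta_md = 0.2
theta = np.log(eta_md / (1.0 - eta_md))
for _ in range(50):
    grad_theta = sigmoid(theta) - xbar            # gradient of the negative log-likelihood in theta
    eta_md = sigmoid(theta) - alpha * grad_theta
    theta = np.log(eta_md / (1.0 - eta_md))

# Natural gradient descent directly in eta: the Fisher information in the
# eta coordinate is 1/(eta(1-eta)), and the Euclidean gradient of the
# negative log-likelihood in eta is (eta - xbar)/(eta(1-eta)).
eta_ng = 0.2
for _ in range(50):
    grad_eta = (eta_ng - xbar) / (eta_ng * (1.0 - eta_ng))
    eta_ng = eta_ng - alpha * (eta_ng * (1.0 - eta_ng)) * grad_eta  # multiply by inverse Fisher
```

Both updates reduce to the same map eta <- eta - alpha (eta - xbar), illustrating the equivalence in the dual coordinate system.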
Author(s): Giovanni Pistone, Luigi Malagò
DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_38
Video: http://www.youtube.com/watch?v=L_pYet3UoPs
Slides: Pistone_secondorder optimization multivariate Gaussian.pdf
Presentation: https://www.see.asso.fr/node/14292
Creative Commons Attribution-ShareAlike 4.0 International
Abstract:
We discuss the optimization of the stochastic relaxation of a real-valued function, i.e., we introduce a new search space given by a statistical model and we optimize the expected value of the original function with respect to a distribution in the model. From the point of view of Information Geometry, statistical models are Riemannian manifolds of distributions endowed with the Fisher information metric, thus the stochastic relaxation can be seen as a continuous optimization problem defined over a differentiable manifold. In this paper we explore the second-order geometry of the exponential family, with applications to the multivariate Gaussian distributions, to generalize second-order optimization methods. Besides the Riemannian Hessian, we introduce the exponential and the mixture Hessians, which come from the dually flat structure of an exponential family. This allows us to obtain different Taylor formulæ according to the choice of the Hessian and of the geodesic used, and thus different approaches to the design of second-order methods, such as the Newton method.
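A minimal sketch of the stochastic-relaxation idea with a first-order natural gradient step (the paper's contribution is the second-order theory; this toy example, with f(x) = x² and closed-form expectations, is only meant to fix the setting):

```python
# Stochastic relaxation of f(x) = x^2 over Gaussians N(mu, sigma^2):
# the expected value F(mu, sigma) = mu^2 + sigma^2 is available in closed
# form here, so the gradients below are exact rather than estimated by sampling.
mu, sigma = 1.0, 1.0
alpha = 0.1
for _ in range(100):
    grad_mu, grad_sigma = 2.0 * mu, 2.0 * sigma   # Euclidean gradient of F
    # The Fisher information of N(mu, sigma^2) in (mu, sigma) coordinates is
    # diag(1/sigma^2, 2/sigma^2); the natural gradient multiplies by its inverse.
    mu -= alpha * sigma**2 * grad_mu
    sigma -= alpha * (sigma**2 / 2.0) * grad_sigma
```

The distribution concentrates around the minimizer x = 0: the mean drifts to zero while the variance shrinks.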
Author(s): Christophe Saint-Jean, Frank Nielsen
DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_37
Video: http://www.youtube.com/watch?v=W4H5ccCNqck
Slides: SaintJean_Online kMLE.pdf
Presentation: https://www.see.asso.fr/node/14291
Creative Commons Attribution-ShareAlike 4.0 International
Abstract:
This paper addresses the problem of online learning of finite statistical mixtures of exponential families. A short review of the Expectation-Maximization (EM) algorithm and its online extensions is given. Building on these extensions and the description of the k-Maximum Likelihood Estimator (k-MLE), three online extensions of the latter are proposed. To illustrate them, we consider the case of mixtures of Wishart distributions, giving details and providing some experiments.
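As a toy illustration of online learning in an exponential family (a single Gaussian component rather than a mixture, so far simpler than the paper's k-MLE extensions), one can recursively average the sufficient statistics:

```python
import numpy as np

# Online maximum-likelihood estimation for a univariate Gaussian, an
# exponential family with sufficient statistic s(x) = (x, x^2): keep a
# running average of s and read off (mean, variance) at any time.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=1000)

stat = np.zeros(2)                                # running average of (x, x^2)
for t, x in enumerate(data, start=1):
    stat += (np.array([x, x * x]) - stat) / t     # recursive update, step size 1/t

mean_hat = stat[0]
var_hat = stat[1] - stat[0] ** 2                  # MLE variance from the moments
```

With step size 1/t the running average equals the batch average of the sufficient statistics, so the online estimator matches the batch MLE exactly.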
Author(s): James Tao, Jun Zhang
DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_36
Video: http://www.youtube.com/watch?v=ebmReSVXZ1E
Slides: Tao_Transformation coupling relations.pdf
Presentation: https://www.see.asso.fr/node/14290
Creative Commons Attribution-ShareAlike 4.0 International
Abstract:
The statistical structure on a manifold M is predicated upon a special kind of coupling between the Riemannian metric g and a torsion-free affine connection ∇ on TM, such that ∇g is totally symmetric, forming, by definition, a "Codazzi pair" {∇, g}. In this paper, we first investigate various transformations of affine connections, including additive translation (by an arbitrary (1,2)-tensor K), multiplicative perturbation (through an arbitrary invertible operator L on TM), and conjugation (through a non-degenerate two-form h). We then study the Codazzi coupling of ∇ with h and its coupling with L, and the link between these two couplings. We introduce, as special cases of K-translations, various transformations that generalize traditional projective and dual-projective transformations, and study their commutativity with L-perturbation and h-conjugation transformations. Our derivations allow affine connections to carry torsion, and we investigate conditions under which torsions are preserved by the various transformations mentioned above. Our systematic approach establishes a general setting for the study of Information Geometry based on transformations and coupling relations of affine connections; in particular, we provide a generalization of the conformal-projective transformation.
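For reference, the Codazzi coupling condition underlying the notion of a Codazzi pair {∇, g} can be written as:

```latex
% Codazzi coupling of a torsion-free connection \nabla and a metric g:
% the (0,3)-tensor \nabla g is totally symmetric, i.e. for all vector
% fields X, Y, Z on M,
(\nabla_X g)(Y,Z) = (\nabla_Y g)(X,Z),
\quad\text{where}\quad
(\nabla_X g)(Y,Z) = X\, g(Y,Z) - g(\nabla_X Y, Z) - g(Y, \nabla_X Z).
```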
Author(s): Nihat Ay, Shun-ichi Amari
DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_35
Video: http://www.youtube.com/watch?v=Pu8M9Fu7_fM
Slides: Amari_standard divergence.pdf
Presentation: https://www.see.asso.fr/node/14289
Creative Commons Attribution-ShareAlike 4.0 International
Abstract:
A divergence function defines a Riemannian metric G and dually coupled affine connections (∇, ∇∗) with respect to it in a manifold M. When M is dually flat, a canonical divergence is known, which is uniquely determined from {G, ∇, ∇∗}. We search for a standard divergence for a general non-flat M. It is introduced by the magnitude of the inverse exponential map, where the α = -(1/3) connection plays a fundamental role. The standard divergence is different from the canonical divergence.
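For reference, on a dually flat M the canonical divergence mentioned in the abstract takes the familiar Bregman form:

```latex
% Canonical divergence on a dually flat manifold, written with the convex
% potential \psi(\theta), its Legendre transform \varphi(\eta), and the
% pair of dual affine coordinates (\theta, \eta):
D(p \,\|\, q) = \psi(\theta_p) + \varphi(\eta_q) - \langle \theta_p , \eta_q \rangle,
\qquad \eta = \nabla \psi(\theta), \quad \theta = \nabla \varphi(\eta).
```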