• cschreiner

    Author: Dominique Spehner
    Institution: Université Joseph Fourier, Grenoble, Institut Fourier, France.
    Website: https://www-fourier.ujf-grenoble.fr/~spehner/
    Video: http://www.youtube.com/watch?v=5Nj5afyivI8
    Slides: https://drive.google.com/open?id=0B0QKxsVtOaTiR3lWZ1dUSVlhakE
    Presentation: https://www.see.asso.fr/node/14277
    Creative Commons Attribution-ShareAlike 4.0 International

    Abstract:
    I will show that the set of states of a quantum system with a finite-dimensional Hilbert space can be equipped with various Riemannian distances having nice properties from a quantum information viewpoint, namely they are contractive under all physically allowed operations on the system. The corresponding metrics are quantum analogs of the Fisher metric and have been classified by D. Petz. Two distances are particularly relevant physically: the Bogoliubov-Kubo-Mori distance studied by R. Balian, Y. Alhassid and H. Reinhardt, and the Bures distance studied by A. Uhlmann and by S.L. Braunstein and C.M. Caves. The latter gives the quantum Fisher information, which plays an important role in quantum metrology. A way to measure the amount of quantum correlations (entanglement or quantum discord) in bipartite systems (that is, systems composed of two parties) with the help of these distances will also be discussed.
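
    As background for readers, here are the standard formulas behind the Bures distance mentioned above (textbook definitions, not taken from the slides). The Bures distance is built from the Uhlmann fidelity,

    \[
      F(\rho,\sigma) = \Big(\operatorname{Tr}\sqrt{\sqrt{\rho}\,\sigma\sqrt{\rho}}\Big)^{2},
      \qquad
      d_{\mathrm{B}}(\rho,\sigma) = \sqrt{2\big(1-\sqrt{F(\rho,\sigma)}\big)},
    \]

    and for a smooth family of states \(\rho_\theta\) its infinitesimal form recovers the quantum Fisher information of quantum metrology via \(d_{\mathrm{B}}(\rho_\theta,\rho_{\theta+d\theta})^2 = \tfrac{1}{4} F_Q(\theta)\, d\theta^2\).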

    References:

    • D. Petz, Monotone Metrics on Matrix Spaces, Lin. Alg. and its Appl. 244, 81-96 (1996)
    • R. Balian, Y. Alhassid, and H. Reinhardt, Dissipation in many-body systems: a geometric approach based on information theory, Phys. Rep. 131, 1 (1986)
    • R. Balian, The entropy-based quantum metric, Entropy 16(7), 3878-3888 (2014)
    • A. Uhlmann, The "transition probability" in the state space of a *-algebra, Rep. Math. Phys. 9, 273-279 (1976)
    • S.L. Braunstein and C.M. Caves, Statistical Distance and the Geometry of Quantum States, Phys. Rev. Lett. 72, 3439-3443 (1994)
    • D. Spehner, Quantum correlations and distinguishability of quantum states, J. Math. Phys. 55, 075211 (2014)

    Bio:

    • Diplôme d'Études Approfondies (DEA) in Theoretical Physics at the École Normale Supérieure de Lyon, 1994
    • Civil Service (Service National de la Coopération), Technion – Israel Institute of Technology, Haifa, Israel, 1995-1996
    • PhD in Theoretical Physics, Université Paul Sabatier, Toulouse, France, 1996-2000.
    • Postdoctoral fellow, Pontificia Universidad Católica, Santiago, Chile, 2000-2001
    • Research Associate, University of Duisburg-Essen, Germany, 2001-2005
    • Maître de Conférences, Université Joseph Fourier, Grenoble, France, 2005-present
    • Habilitation à diriger des Recherches (HDR), Université Grenoble Alpes, 2015
    • Member of the Institut Fourier (since 2005) and the Laboratoire de Physique et Modélisation des Milieux Condensés (since 2013) of Université Grenoble Alpes, France

    Spehner.jpg

    posted in Short Courses - Geometry on the set of quantum states and quantum correlations
  • cschreiner

    Author(s): Marc Arnaudon
    Institution: Institut de Mathématiques de Bordeaux (IMB), CNRS : UMR 5251, Université de Bordeaux, France
    Website: http://www.math.u-bordeaux1.fr/~marnaudo/
    Video: http://www.youtube.com/watch?v=1mKs_akkEuw
    Slides: Arnaudon_Stochastic EulerPoincare reduction.pdf
    Presentation: https://www.see.asso.fr/node/13650
    Creative Commons Attribution-ShareAlike 4.0 International

    Abstract:
    We will prove an Euler-Poincaré reduction theorem for stochastic processes taking values in a Lie group, which is a generalization of the Lagrangian version of reduction and its associated variational principles. We will also show examples of its application to the rigid body and to the group of diffeomorphisms, which includes the Navier-Stokes equation on a bounded domain and the Camassa-Holm equation.
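
    As a reminder of the deterministic statement being generalized (standard material, see e.g. the Marsden-Ratiu reference below), the Euler-Poincaré equations for a left-invariant Lagrangian \(\ell\) on a Lie group \(G\) with Lie algebra \(\mathfrak{g}\) read

    \[
      \frac{d}{dt}\,\frac{\delta \ell}{\delta \xi} = \operatorname{ad}^{*}_{\xi}\,\frac{\delta \ell}{\delta \xi},
      \qquad \xi(t) = g(t)^{-1}\dot g(t) \in \mathfrak{g}
    \]

    (the sign of the right-hand side flips in the right-invariant case relevant to fluids). The talk's theorem extends this variational reduction to group-valued stochastic processes.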

    References:

    • M. Arnaudon, A.B. Cruzeiro and X. Chen, "Stochastic Euler-Poincaré Reduction", Journal of Mathematical Physics, to appear
    • V. I. Arnold and B. Khesin, "Topological methods in hydrodynamics", Applied Math. Series 125, Springer (1998).
    • J. M. Bismut, "Mécanique aléatoire", Lecture Notes in Mathematics, 866, Springer (1981).
    • D.G. Ebin and J.E. Marsden, "Groups of diffeomorphisms and the motion of an incompressible fluid", Ann. of Math. 92 (1970), 102-163.
    • J. E. Marsden and T. S. Ratiu, "Introduction to Mechanics and Symmetry: a basic exposition of classical mechanical systems", Springer, Texts in Applied Math. (2003).

    Bio:
    Marc Arnaudon was born in France in 1965. He graduated from the École Normale Supérieure de Paris, France, in 1991. He received the PhD degree in mathematics and the Habilitation à diriger des Recherches degree from Strasbourg University, France, in January 1994 and January 1998, respectively. After postdoctoral research and teaching at Strasbourg, he took up a full professor position in September 1999 in the Department of Mathematics at Poitiers University, France, where he headed the Probability Research Group. In January 2013 he left Poitiers and joined the Department of Mathematics of Bordeaux University, France, where he is a full professor of mathematics.
    Prof. Arnaudon is an expert in stochastic differential geometry and stochastic calculus on manifolds; he has published over 50 articles in mathematics and physics journals.

    Arnaudon.jpg

    posted in Keynotes Presentations
  • cschreiner

    Author(s): Anna-Lena Kißlinger, Wolfgang Stummer
    DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_74
    Video: http://www.youtube.com/watch?v=lmmIXF0ziCk
    Slides: https://drive.google.com/open?id=0B0QKxsVtOaTib0V6RTNodGlKbUk
    Presentation: https://www.see.asso.fr/node/14296
    Creative Commons Attribution-ShareAlike 4.0 International

    Abstract:
    Scaled Bregman distances (SBD) have turned out to be useful tools for simultaneous estimation and goodness-of-fit testing in parametric models of random data (streams, clouds). We show how SBD can additionally be used for model preselection (structure detection), i.e. for finding appropriate candidates of model (sub)classes in order to support a desired decision under uncertainty. For this, we concentrate, by way of example, on the context of nonlinear recursive models with additional exogenous inputs; as special cases we include nonlinear regressions, linear autoregressive models (e.g. AR, ARIMA, SARIMA time series), and nonlinear autoregressive models with exogenous inputs (NARX). In particular, we outline a corresponding information-geometric 3D computer-graphical selection procedure. Some sample-size asymptotics are given as well.
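
    For orientation, one common form of the scaled Bregman distance (following Stummer and Vajda; the paper may use a more general variant) between probability measures P and Q, scaled by a measure M and generated by a convex function \(\phi\), is

    \[
      B_{\phi}(P,Q \mid M) = \int \Big[\, \phi\Big(\tfrac{dP}{dM}\Big) - \phi\Big(\tfrac{dQ}{dM}\Big)
        - \phi'\Big(\tfrac{dQ}{dM}\Big)\Big(\tfrac{dP}{dM}-\tfrac{dQ}{dM}\Big) \Big]\, dM .
    \]

    Choosing M as the Lebesgue measure recovers ordinary Bregman divergences between densities, while M = Q essentially recovers the Csiszár φ-divergences.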

    posted in Geometry of Time Series and Linear Dynamical systems
  • cschreiner

    Author(s): Alain Sarlette
    DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_73
    Video: http://www.youtube.com/watch?v=tbUBxuawaMg
    Slides: Sarlette_operational viewpoint.pdf
    Presentation: https://www.see.asso.fr/node/14295
    Creative Commons Attribution-ShareAlike 4.0 International

    Abstract:
    This paper highlights further examples of maps that follow a recently introduced “symmetrization” structure behind the average consensus algorithm. We review, among others, some generalized consensus settings and coordinate descent optimization.
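
    To make the starting point concrete, here is a minimal Python sketch of the plain average consensus iteration x ← Wx that the “symmetrization” structure abstracts; the weight matrix and values below are illustrative only and not taken from the paper.

    import numpy as np

    def consensus(x0, W, n_steps=100):
        """Iterate x <- W x; for a symmetric doubly stochastic W of a connected
        graph (with positive diagonal), every entry converges to the average
        of the initial values."""
        x = np.array(x0, dtype=float)
        for _ in range(n_steps):
            x = W @ x              # each node averages with its neighbours
        return x

    # 4 agents on a ring with Metropolis-like weights (symmetric, rows sum to 1)
    W = np.array([[0.5 , 0.25, 0.0 , 0.25],
                  [0.25, 0.5 , 0.25, 0.0 ],
                  [0.0 , 0.25, 0.5 , 0.25],
                  [0.25, 0.0 , 0.25, 0.5 ]])
    print(consensus([1.0, 3.0, 5.0, 7.0], W))   # -> approximately [4. 4. 4. 4.]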

    posted in Geometry of Time Series and Linear Dynamical systems
  • cschreiner

    Author(s): Frank Nielsen, Gautier Marti, Philippe Donnat, Philippe Very
    DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_72
    Video: http://www.youtube.com/watch?v=wgb6olnmO50
    Slides: https://drive.google.com/open?id=0B0QKxsVtOaTiYUpPWkc2WkdRaDA
    Presentation: https://www.see.asso.fr/node/14294
    Creative Commons Attribution-ShareAlike 4.0 International

    Abstract:
    We present in this paper a novel non-parametric approach useful for clustering independent identically distributed stochastic processes. We introduce a pre-processing step consisting in mapping multivariate independent and identically distributed samples from random variables to a generic non-parametric representation which factorizes the dependency and the marginal distributions apart without losing any information. An associated metric is defined in which the balance between dependency and marginal-distribution information is controlled by a single parameter. This mixing parameter can be learned or tuned by a practitioner; such use is illustrated on the case of clustering financial time series. Experiments, implementation and results obtained on public financial time series are available online at http://www.datagrapple.com.
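
    As an illustration of the general idea (separating dependence from marginals and mixing the two with a single parameter), here is a simplified Python sketch; the function mixed_distance and its two ingredients are stand-ins chosen by us, not the exact metric defined in the paper.

    import numpy as np
    from scipy.stats import spearmanr

    def mixed_distance(x, y, theta=0.5):
        """Toy distance between two equal-length samples x, y: theta weighs a
        dependence part (rank/copula based) against a marginal part
        (distance between empirical quantile functions)."""
        rho, _ = spearmanr(x, y)                # rank correlation: copula information only
        d_dep = 0.5 * (1.0 - rho)               # 0 if comonotone, 1 if counter-monotone
        qx, qy = np.sort(x), np.sort(y)         # empirical quantiles: marginal information only
        d_marg = np.mean((qx - qy) ** 2)
        return theta * d_dep + (1.0 - theta) * d_marg

    rng = np.random.default_rng(0)
    a = rng.normal(size=500)
    b = 0.8 * a + 0.2 * rng.normal(size=500)    # strongly dependent on a, different marginal
    print(mixed_distance(a, b, theta=1.0))      # dependence view: small
    print(mixed_distance(a, b, theta=0.0))      # marginal view: reflects the scale mismatch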

    posted in Geometry of Time Series and Linear Dynamical systems
  • cschreiner

    Author(s): Garvesh Raskutti, Sayan Mukherjee
    DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_39
    Video: http://www.youtube.com/watch?v=PKujdGuu5Bc
    Slides: Monod_Information geomerty mirror descent.pdf
    Presentation: https://www.see.asso.fr/node/14293
    Creative Commons Attribution-ShareAlike 4.0 International

    Abstract:
    We prove the equivalence of two online learning algorithms, mirror descent and natural gradient descent. Both mirror descent and natural gradient descent are generalizations of online gradient descent when the parameter of interest lies on a non-Euclidean manifold. Natural gradient descent selects the steepest descent direction along a Riemannian manifold by multiplying the standard gradient by the inverse of the metric tensor. Mirror descent induces non-Euclidean structure by solving iterative optimization problems using different proximity functions. In this paper, we prove that mirror descent induced by a Bregman divergence proximity function is equivalent to the natural gradient descent algorithm on the Riemannian manifold in the dual coordinate system. We use techniques from convex analysis and connections between Riemannian manifolds, Bregman divergences and convexity to prove this result. This equivalence between natural gradient descent and mirror descent implies that (1) mirror descent is the steepest descent direction along the Riemannian manifold corresponding to the choice of Bregman divergence and (2) mirror descent with log-likelihood loss applied to parameter estimation in exponential families asymptotically achieves the classical Cramér-Rao lower bound.
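
    Schematically, the two updates being compared are (with \(\psi\) the strictly convex function generating the Bregman divergence, \(\eta_t\) a step size, and \(G\) the Riemannian metric tensor):

    \[
      \text{mirror descent:}\quad \nabla\psi(\theta_{t+1}) = \nabla\psi(\theta_t) - \eta_t\,\nabla f(\theta_t),
      \qquad
      \text{natural gradient:}\quad \mu_{t+1} = \mu_t - \eta_t\, G(\mu_t)^{-1}\nabla f(\mu_t).
    \]

    Roughly, the paper's equivalence identifies the mirror descent iterates, rewritten in the dual coordinates \(\mu = \nabla\psi(\theta)\), with the natural gradient iteration for the Hessian metric of the conjugate function \(\psi^{*}\).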

    posted in Information Geometry Optimization
  • cschreiner

    Author(s): Giovanni Pistone, Luigi Malagò
    DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_38
    Video: http://www.youtube.com/watch?v=L_pYet3UoPs
    Slides: Pistone_second-order optimization multivariate Gaussian.pdf
    Presentation: https://www.see.asso.fr/node/14292
    Creative Commons Attribution-ShareAlike 4.0 International

    Abstract:
    We discuss the optimization of the stochastic relaxation of a real-valued function, i.e., we introduce a new search space given by a statistical model and we optimize the expected value of the original function with respect to a distribution in the model. From the point of view of Information Geometry, statistical models are Riemannian manifolds of distributions endowed with the Fisher information metric, thus the stochastic relaxation can be seen as a continuous optimization problem defined over a differentiable manifold. In this paper we explore the second-order geometry of the exponential family, with applications to the multivariate Gaussian distributions, to generalize second-order optimization methods. Besides the Riemannian Hessian, we introduce the exponential and the mixture Hessians, which come from the dually flat structure of an exponential family. This allows us to obtain different Taylor formulæ according to the choice of the Hessian and of the geodesic used, and thus different approaches to the design of second-order methods, such as the Newton method.
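
    For readers new to stochastic relaxation, the following minimal Python sketch shows the idea in its simplest first-order form (score-function gradient on the mean of an isotropic Gaussian); it only illustrates the search space the paper works in, not the second-order methods developed there, and all names and constants are ours.

    import numpy as np

    def f(x):                          # toy objective: shifted sphere function
        return np.sum((x - 3.0) ** 2, axis=-1)

    rng = np.random.default_rng(1)
    dim, s, lr, n_samples = 5, 0.5, 0.05, 64
    m = np.zeros(dim)                  # mean of the search distribution N(m, s^2 I)

    for t in range(300):
        z = rng.normal(size=(n_samples, dim))
        x = m + s * z                  # samples from the relaxed model
        fx = f(x)
        # grad_m E[f] = E[f(x) (x - m) / s^2]; subtracting the mean of f reduces variance
        grad_m = np.mean((fx - fx.mean())[:, None] * z, axis=0) / s
        m -= lr * grad_m               # plain gradient step on the expected value

    print(np.round(m, 2))              # approaches the minimizer (3, ..., 3)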

    posted in Information Geometry Optimization
  • cschreiner

    Author(s): Christophe Saint-Jean, Frank Nielsen
    DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_37
    Video: http://www.youtube.com/watch?v=W4H5ccCNqck
    Slides: Saint-Jean_Online kMLE.pdf
    Presentation: https://www.see.asso.fr/node/14291
    Creative Commons Attribution-ShareAlike 4.0 International

    Abstract:
    This paper addresses the problem of online learning of finite statistical mixtures of exponential families. A short review of the Expectation-Maximization (EM) algorithm and its online extensions is given. Building on these extensions and on the description of the k-Maximum Likelihood Estimator (k-MLE), three online extensions of the latter are proposed. To illustrate them, we consider the case of mixtures of Wishart distributions, giving details and providing some experiments.
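
    To convey the flavour of an online mixture update (one E-step plus one parameter refresh per incoming sample), here is a generic online-EM sketch in Python for a univariate Gaussian mixture; it is not the k-MLE algorithm of the paper, and the data stream and step-size schedule are chosen for illustration only.

    import numpy as np

    rng = np.random.default_rng(2)
    K = 2
    w   = np.full(K, 1.0 / K)             # mixture weights
    mu  = np.array([-1.0, 1.0])           # component means
    var = np.ones(K)                      # component variances
    S0, S1, S2 = w.copy(), w * mu, w * (var + mu**2)   # running sufficient statistics

    def sample():                         # data stream: 0.3*N(-2, 0.5^2) + 0.7*N(3, 1)
        return rng.normal(-2.0, 0.5) if rng.random() < 0.3 else rng.normal(3.0, 1.0)

    for n in range(1, 20001):
        x = sample()
        # E-step: responsibilities under the current parameters
        p = w * np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = p / p.sum()
        # stochastic-approximation update of the sufficient statistics
        g = n ** -0.6
        S0 = (1 - g) * S0 + g * r
        S1 = (1 - g) * S1 + g * r * x
        S2 = (1 - g) * S2 + g * r * x * x
        # M-step: parameters recomputed from the current statistics
        w, mu, var = S0, S1 / S0, np.maximum(S2 / S0 - (S1 / S0) ** 2, 1e-6)

    print(np.round(w, 2), np.round(mu, 2), np.round(np.sqrt(var), 2))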

    posted in Information Geometry Optimization
  • cschreiner

    Author(s): James Tao, Jun Zhang
    DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_36
    Video: http://www.youtube.com/watch?v=ebmReSVXZ1E
    Slides: Tao_Transformation coupling relations.pdf
    Presentation: https://www.see.asso.fr/node/14290
    Creative Commons Attribution-ShareAlike 4.0 International

    Abstract:
    The statistical structure on a manifold M is predicated upon a special kind of coupling between the Riemannian metric g and a torsion-free affine connection ∇ on the tangent bundle TM, such that ∇g is totally symmetric, forming, by definition, a “Codazzi pair” {∇, g}. In this paper, we first investigate various transformations of affine connections, including additive translation (by an arbitrary (1,2)-tensor K), multiplicative perturbation (through an arbitrary invertible operator L on TM), and conjugation (through a non-degenerate two-form h). We then study the Codazzi coupling of ∇ with h and its coupling with L, and the link between these two couplings. We introduce, as special cases of K-translations, various transformations that generalize traditional projective and dual-projective transformations, and study their commutativity with L-perturbation and h-conjugation transformations. Our derivations allow affine connections to carry torsion, and we investigate conditions under which torsions are preserved by the various transformations mentioned above. Our systematic approach establishes a general setting for the study of Information Geometry based on transformations and coupling relations of affine connections; in particular, we provide a generalization of the conformal-projective transformation.
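
    For reference, the two standard definitions underlying these couplings are the Codazzi condition and the conjugate (dual) connection:

    \[
      (\nabla_X g)(Y,Z) = (\nabla_Y g)(X,Z) \quad\text{for all } X, Y, Z
      \qquad\text{(i.e. } \nabla g \text{ is totally symmetric),}
    \]
    \[
      X\, g(Y,Z) = g(\nabla_X Y,\, Z) + g(Y,\, \nabla^{*}_X Z)
      \qquad\text{(defining the conjugate connection } \nabla^{*}\text{).}
    \]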

    posted in Information Geometry Optimization
  • cschreiner

    Author(s): Nihat Ay, Shun-Ichi Amari
    DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_35
    Video: http://www.youtube.com/watch?v=Pu8M9Fu7_fM
    Slides: Amari_standard divergence.pdf
    Presentation: https://www.see.asso.fr/node/14289
    Creative Commons Attribution-ShareAlike 4.0 International

    Abstract:
    A divergence function defines a Riemannian metric G and dually coupled affine connections (∇, ∇*) with respect to it in a manifold M. When M is dually flat, a canonical divergence is known, which is uniquely determined from {G, ∇, ∇*}. We search for a standard divergence for a general non-flat M. It is introduced by means of the magnitude of the inverse exponential map, where the α = -1/3 connection plays a fundamental role. The standard divergence is different from the canonical divergence.
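
    As background (standard facts from information geometry, not specific to this paper's construction): a divergence D induces the metric, and in the dually flat case the canonical divergence is given in terms of the dual potentials \(\psi, \varphi\) and dual affine coordinates \((\theta, \eta)\) by

    \[
      g_{ij}(\theta) = \left.\frac{\partial^{2}}{\partial\theta^{i}\,\partial\theta^{j}}\, D(\theta \,\|\, \theta')\right|_{\theta'=\theta},
      \qquad
      D_{\mathrm{can}}(p \,\|\, q) = \psi(\theta_p) + \varphi(\eta_q) - \theta_p\cdot\eta_q .
    \]

    The paper's "standard divergence" is constructed so as to extend the latter to manifolds that are not dually flat and, as stated above, it does not coincide with the canonical divergence.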

    posted in Information Geometry Optimization
