Second-order Optimization over the Multivariate Gaussian Distribution
Author(s): Giovanni Pistone, Luigi Malagò
DOI URL: http://dx.doi.org/10.1007/978-3-319-25040-3_38
Video: http://www.youtube.com/watch?v=L_pYet3UoPs
Slides: Pistone_second-order optimization multivariate Gaussian.pdf
Presentation: https://www.see.asso.fr/node/14292
License: Creative Commons Attribution-ShareAlike 4.0 International

Abstract:
We discuss the optimization of the stochastic relaxation of a real-valued function, i.e., we introduce a new search space given by a statistical model and optimize the expected value of the original function with respect to a distribution in the model. From the point of view of Information Geometry, statistical models are Riemannian manifolds of distributions endowed with the Fisher information metric; the stochastic relaxation can thus be seen as a continuous optimization problem defined over a differentiable manifold. In this paper we explore the second-order geometry of the exponential family, with applications to multivariate Gaussian distributions, in order to generalize second-order optimization methods. Besides the Riemannian Hessian, we introduce the exponential and the mixture Hessians, which come from the dually flat structure of an exponential family. This allows us to obtain different Taylor formulæ according to the choice of the Hessian and of the geodesic used, and thus different approaches to the design of second-order methods, such as the Newton method.
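
The sketch below is not from the paper; it only illustrates the stochastic-relaxation idea described in the abstract under simplifying assumptions: the search distribution is a Gaussian N(mu, Sigma) with Sigma held fixed, the gradient and Hessian of F(mu) = E[f(X)] are estimated with the standard score-function (log-likelihood) trick in the ordinary Euclidean parametrization (not the Riemannian, exponential, or mixture Hessians discussed in the paper), and a damped Newton-type step is applied. All names (f, mc_grad_hess, newton_step) and the damping scheme are illustrative choices, not the authors' implementation.

```python
# Minimal sketch (assumption-laden, not the paper's method): stochastic
# relaxation of a test function f over the Gaussian family N(mu, Sigma),
# with a Monte Carlo Newton-type step on the mean only.
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Sphere function: F(mu) = E[f(X)] = ||mu||^2 + tr(Sigma), minimized at mu = 0.
    return np.sum(x**2, axis=-1)

def mc_grad_hess(mu, Sigma, n_samples=2000):
    """Score-function estimates of the Euclidean gradient and Hessian of
    F(mu) = E_{N(mu, Sigma)}[f(X)], with Sigma held fixed."""
    Sinv = np.linalg.inv(Sigma)
    X = rng.multivariate_normal(mu, Sigma, size=n_samples)
    fx = f(X)                          # f evaluated at each sample, shape (n,)
    score = (X - mu) @ Sinv            # grad_mu log p(x) for each sample
    grad = (fx[:, None] * score).mean(axis=0)
    # Hessian estimator: E[f(X) (score score^T + Hess_mu log p)], Hess_mu log p = -Sinv
    outer = np.einsum('ni,nj->nij', score, score) - Sinv
    hess = (fx[:, None, None] * outer).mean(axis=0)
    return grad, hess

def newton_step(mu, Sigma, damping=1.0):
    g, H = mc_grad_hess(mu, Sigma)
    # Damping guards against an indefinite Monte Carlo Hessian estimate.
    step = np.linalg.solve(H + damping * np.eye(mu.shape[0]), g)
    return mu - step

mu, Sigma = np.array([3.0, -2.0]), np.eye(2)
for it in range(10):
    mu = newton_step(mu, Sigma)
    print(it, mu, f(mu))
```

On the test function above, the exact relaxed objective has gradient 2*mu and Hessian 2*I, so the damped Newton iteration contracts mu toward the optimum at the origin; the paper develops the corresponding second-order machinery intrinsically, on the manifold of Gaussian distributions.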