Information and topology

Organisers: Pierre Baudot, Daniel Bennequin, Michel Boyom, Herbert Gangl, Matilde Marcolli, John Terilla
List of speakers:
 Pierre Baudot, Inserm, UNIS U1072, Marseille, France.
 Daniel Bennequin, Université Paris Diderot–Paris 7, UFR de Mathématiques, Equipe Géométrie et Dynamique, Paris, France.
 José Ignacio Burgos Gil, ICMAT (CSIC), Madrid, Spain.
 Michel Boyom, Université du Languedoc–Montpellier II, France.
 Philippe Elbaz-Vincent, Institut Fourier, Grenoble, France.
 Tom Leinster, School of Mathematics, University of Edinburgh, Edinburgh.
 Matilde Marcolli, Mathematics Department, Caltech, Pasadena, USA.
 John Terilla, Queens College, USA.
Abstract:
A classical result of Čencov [1] in information geometry established that the Fisher–Rao metric is the unique metric (up to a constant) on the space of probability distributions on a finite sample space that is invariant under sufficient statistics (Markov morphisms). This result was extended to infinite sample spaces via n-tensorisation by Ay, Jost, Lê and Schwachhöfer [2]. Entropy first appeared in the computation of the degree-one homology of the discrete group SL2 over C with coefficients in the adjoint action, by choosing a pertinent definition of the derivative of the Bloch–Wigner dilogarithm [3]. It could be shown that the five-term functional equation of the dilogarithm implies the four-term functional equation of information. It was then discovered that a finite truncated version of the logarithm appearing in cyclotomic studies also satisfies the functional equation of entropy, suggesting a higher-degree generalisation of information, analogous to the polylogarithms [4]. This generalisation of information was achieved by algebraic means that moreover hold over finite fields [5], and was further developed into a framework where information functions appear as derivations [6]. A more geometric version in terms of algebraic cycles was found [7], introducing the notion of additive dilogarithm alongside the multiplicative structures [8]. On another side, after entropy appeared in tropical and idempotent semiring analysis, in the study of the extension of Witt semirings to the characteristic-one limit [9], thermodynamic semirings and an entropy operad could be constructed as deformations of the tropical semiring [10]; the introduction of Rota–Baxter algebras then allowed a renormalisation procedure to be derived. Further completing the entropic landscape in number theory, entropy was encountered as a height/degree function in the context of Arakelov geometry, while following the program of relating the arithmetic geometry of toric varieties to convex analysis [11].
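Both claims about the functional equation of entropy can be checked numerically. The sketch below (Python, written under the assumption that the four-term equation takes the standard form H(x) + (1-x)H(y/(1-x)) = H(y) + (1-y)H(x/(1-y)), and that Kontsevich's truncated logarithm is the sum of z^k/k for k = 1, …, p-1 over F_p, as in [4, 5]) verifies the equation for Shannon entropy over the reals and for the truncated logarithm over small prime fields:

```python
import math

def H(x):
    """Shannon entropy of the binary distribution (x, 1-x), natural log."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log(x) - (1 - x) * math.log(1 - x)

# Four-term functional equation of information:
#   H(x) + (1-x) H(y/(1-x)) = H(y) + (1-y) H(x/(1-y))
x, y = 0.2, 0.3
lhs = H(x) + (1 - x) * H(y / (1 - x))
rhs = H(y) + (1 - y) * H(x / (1 - y))
assert abs(lhs - rhs) < 1e-12

def finite_log(z, p):
    """Truncated logarithm sum_{k=1}^{p-1} z^k / k over F_p (p prime);
    pow(k, p-2, p) is the inverse of k mod p by Fermat's little theorem."""
    return sum(pow(z, k, p) * pow(k, p - 2, p) for k in range(1, p)) % p

def check_entropy_equation_mod_p(p):
    """Check the same four-term equation for finite_log over F_p,
    for all x, y such that 1-x and 1-y are invertible."""
    for x in range(p):
        for y in range(p):
            if x == 1 or y == 1:
                continue
            inv1x = pow((1 - x) % p, p - 2, p)
            inv1y = pow((1 - y) % p, p - 2, p)
            lhs = (finite_log(x, p) + (1 - x) * finite_log(y * inv1x % p, p)) % p
            rhs = (finite_log(y, p) + (1 - y) * finite_log(x * inv1y % p, p)) % p
            assert lhs == rhs

check_entropy_equation_mod_p(5)
check_entropy_equation_mod_p(7)
```

For real entropy the equation expresses the grouping rule for the three-outcome distribution (x, y, 1-x-y); the finite-field check runs over every admissible pair, illustrating that the identity is purely algebraic.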
Defining the category of finite probability spaces and using the Faddeev axiomatisation, it could be shown that the only family of functions with the requisite functorial property is the Shannon information loss [12]. Introducing a deformation-theoretic framework and a chain complex of random variables, a homotopy probability theory could be constructed, in which the cumulants coincide with the morphisms of the homotopy algebras [13]. Using the geometric and combinatorial structure of probabilities and of random variables seen as partitions, the foundations of a cohomology, a topos and a (quasi-)operad of information could be laid in a probabilistic setting, allowing entropy to be recovered uniquely as the first cohomology group. Mutual informations arise from the consideration of the trivial action and appear as differential operators, while their "non-positive" extrema give rise to homotopy links [14]. In a series of papers and courses [15], M. Gromov reviews results on the subject, introduces a geometric formulation of information inequalities, and outlines a complete program, currently in development, rooted in Morse cohomology and homotopy.
It is appealing, and a very good sign, that an important part of these developments has been achieved in parallel and independently, with very different approaches and methods; the session will be one of the first occasions to gather the actors of the domain, and hence to promote their convergence and to discuss open problems. Moreover, following these important extensions of the information and probability framework, it now appears natural to ask for a proper logical foundation of information theory, extending the Boolean world of the binary digit. Given these developments, motivic integration, topos theory and homotopy type theory, which are inherently geometric logics, provide candidates for such a program that we propose to discuss.
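The functorial characterisation in [12] can be made concrete: a measure-preserving map between finite probability spaces loses the information H(p) - H(q), where q is the image distribution, and this loss is additive under composition. A minimal sketch in Python (the outcome names and the encoding of maps as dictionaries are illustrative, not from the cited paper):

```python
import math
from collections import defaultdict

def entropy(p):
    """Shannon entropy of a finite distribution given as {outcome: probability}."""
    return -sum(v * math.log(v) for v in p.values() if v > 0)

def pushforward(p, f):
    """Image distribution of p under the map f (a dict on outcomes)."""
    q = defaultdict(float)
    for outcome, prob in p.items():
        q[f[outcome]] += prob
    return dict(q)

def information_loss(p, f):
    """H(p) - H(f_* p): the entropy lost by coarse-graining along f."""
    return entropy(p) - entropy(pushforward(p, f))

# A two-step coarse-graining {a,b,c,d} -> {u,v} -> {w}.
p = {"a": 0.1, "b": 0.2, "c": 0.3, "d": 0.4}
f = {"a": "u", "b": "u", "c": "v", "d": "v"}
g = {"u": "w", "v": "w"}
gf = {outcome: g[f[outcome]] for outcome in p}

# Functoriality: the loss along g∘f is the sum of the losses along f and g.
loss_f = information_loss(p, f)
loss_g = information_loss(pushforward(p, f), g)
assert abs(information_loss(p, gf) - (loss_f + loss_g)) < 1e-12
```

The additivity here is a telescoping identity; the content of [12] is the converse, that additivity together with convex linearity and continuity forces the loss to be a multiple of the entropy difference.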
[1] Čencov, N.N. Statistical Decision Rules and Optimal Inference. Translations of Mathematical Monographs, American Mathematical Society, 1982.
[2] Ay, N., Jost, J., Lê, H.V., Schwachhöfer, L. Information geometry and sufficient statistics. Probability Theory and Related Fields, 2015.
[3] Cathelineau, J. Sur l'homologie de sl2 à coefficients dans l'action adjointe. Math. Scand., 63, 51–86, 1988.
[4] Kontsevich, M. The 1½-logarithm. Unpublished note, 1995; reproduced in Elbaz-Vincent & Gangl, 2002.
[5] Elbaz-Vincent, P., Gangl, H. On poly(ana)logs I. Compositio Mathematica, 130(2), 161–214, 2002.
[6] Elbaz-Vincent, P., Gangl, H. Finite polylogarithms, their multiple analogues and the Shannon entropy. Lecture Notes in Computer Science, vol. 9389, 277–285, 2015.
[7] Bloch, S., Esnault, H. An additive version of higher Chow groups. Annales Scientifiques de l'École Normale Supérieure, 36(3), 463–477, 2003.
[8] Bloch, S., Esnault, H. The Additive Dilogarithm. Documenta Mathematica, Extra Volume: Kazuya Kato's Fiftieth Birthday, 131–155, 2003.
[9] Connes, A., Consani, C. Characteristic 1, entropy and the absolute point. Preprint arXiv:0911.3537v1, 2009.
[10] Marcolli, M., Thorngren, R. Thermodynamic Semirings. Journal of Noncommutative Geometry (doi:10.4171/JNCG/159); arXiv:1108.2874, 2011.
Marcolli, M., Tedeschi, R. Entropy algebras and Birkhoff factorization. arXiv preprint, 2014.
[11] Burgos Gil, J.I., Philippon, P., Sombra, M. Arithmetic geometry of toric varieties. Metrics, measures and heights. Astérisque, 360, 2014.
[12] Baez, J.C., Fritz, T., Leinster, T. A Characterization of Entropy in Terms of Information Loss. Entropy, 13, 1945–1957, 2011.
Baez, J.C., Fritz, T. A Bayesian characterization of relative entropy. Theory and Applications of Categories, 29(16), 422–456, 2014.
[13] Drummond-Cole, G.C., Park, J.-S., Terilla, J. Homotopy probability theory I. J. Homotopy Relat. Struct., November 2013.
Drummond-Cole, G.C., Park, J.-S., Terilla, J. Homotopy probability theory II. J. Homotopy Relat. Struct., April 2014.
Park, J.-S. Homotopy theory of probability spaces I: classical independence and homotopy Lie algebras. arXiv preprint, 2015.
Tomašić, I. Independence, measure and pseudofinite fields. Selecta Mathematica, 12, 271–306, 2006.
[14] Baudot, P., Bennequin, D. The homological nature of entropy. Entropy, 17, 3253–3318, 2015.
[15] Gromov, M. In a Search for a Structure, Part 1: On Entropy. Unpublished manuscript, 2013.
Gromov, M. Symmetry, probability, entropy. Entropy, 2015.
Gromov, M. Morse Spectra, Homology Measures, Spaces of Cycles and Parametric Packing Problems. April 2015.
[16] McMullen, C.T. Entropy and the clique polynomial. 2013.