
Group Details

Geometric Science of Information

The objective of this group is to bring together pure and applied mathematicians, physicists and engineers with a common interest in geometric tools and their applications. It notably aims to organise conferences and seminars, to promote collaborative local, European and international research projects, and to disseminate research results in the related domains.

  • Geo-Sci-Info

    Statistics, Information and Topology

    Co-chairs of the session:

    • Michel N'Guiffo Boyom: Université Toulouse
    • Pierre Baudot: Median (link)

This session will focus on the advances of information theory, probability and statistics in Algebraic Topology (see [1-58] below). The field is currently undergoing an impressive development, both on the side of the categorical, homotopical and topos-theoretic foundations of probability theory and statistics, and on the side of the characterisation of information functions in cohomology and homotopy theory.
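To make the cohomological viewpoint concrete: the chain rule of Shannon entropy, H(X,Y) = H(X) + H(Y|X), is the identity that [22] interprets as a 1-cocycle condition. The following minimal Python sketch (a toy numerical check, not code from any of the cited works) verifies it on a small joint distribution:

```python
import math
from collections import Counter

def entropy(pmf):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# Toy joint distribution of two binary variables (X, Y).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Marginal distribution of X.
pX = Counter()
for (x, _), p in joint.items():
    pX[x] += p

# Conditional entropy H(Y|X) = sum_x p(x) * H(Y | X = x).
H_Y_given_X = sum(
    pX[x] * entropy({y: joint[(x, y)] / pX[x] for y in (0, 1)})
    for x in pX
)

H_joint, H_X = entropy(joint), entropy(pX)
# The chain rule (1-cocycle identity): H(X,Y) = H(X) + H(Y|X).
assert abs(H_joint - (H_X + H_Y_given_X)) < 1e-12
```

The same identity, rewritten with conditioning expressed as an action on functions, is the starting point of the information cohomology developed in [22,53].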

Bibliographical references: (to be completed)

    [1] Cencov, N.N. Statistical Decision Rules and Optimal Inference. Translations of Mathematical Monographs. 1982.
    [2] Ay, N. and Jost, J. and Lê, H.V. and Schwachhöfer, L. Information geometry and sufficient statistics. Probability Theory and Related Fields 2015 PDF
[3] Cathelineau, J. Sur l’homologie de sl2 à coefficients dans l’action adjointe, Math. Scand., 63, 51-86, 1988. PDF
[4] Kontsevich, M. The 1+1/2 logarithm. Unpublished note (1995), reproduced in Elbaz-Vincent & Gangl, 2002. PDF
    [5] Elbaz-Vincent, P., Gangl, H. On poly(ana)logs I., Compositio Mathematica, 130(2), 161-214. 2002. PDF
[6] Tomasic, I., Independence, measure and pseudofinite fields. Selecta Mathematica, 12, 271-306. arXiv. 2006.
    [7] Connes, A., Consani, C., Characteristic 1, entropy and the absolute point. preprint arXiv:0911.3537v1. 2009.
    [8] Marcolli, M. & Thorngren, R. Thermodynamic Semirings, arXiv 10.4171 / JNCG/159, Vol. abs/1108.2874, 2011.
    [9] Abramsky, S., Brandenburger, A., The Sheaf-theoretic structure of non-locality and contextuality, New Journal of Physics, 13 (2011). PDF
    [10] Gromov, M. In a Search for a Structure, Part 1: On Entropy, unpublished manuscript, 2013. PDF
    [11] McMullen, C.T., Entropy and the clique polynomial, 2013. PDF
    [12] Marcolli, M. & Tedeschi, R. Entropy algebras and Birkhoff factorization, arXiv, Vol. abs/1108.2874, 2014.
    [13] Doering, A., Isham, C.J., Classical and Quantum Probabilities as Truth Values, arXiv:1102.2213, 2011 PDF
    [14] Baez, J.; Fritz, T. & Leinster, T. A Characterization of Entropy in Terms of Information Loss Entropy, 13, 1945-1957, 2011. PDF
    [15] Baez J. C.; Fritz, T. A Bayesian characterization of relative entropy. Theory and Applications of Categories, Vol. 29, No. 16, p. 422-456. 2014. PDF
[16] Drummond-Cole, G.-C., Park, J.-S., Terilla, J., Homotopy probability theory I. J. Homotopy Relat. Struct. November 2013. PDF
[17] Drummond-Cole, G.-C., Park, J.-S., Terilla, J., Homotopy probability theory II. J. Homotopy Relat. Struct. April 2014. PDF
    [18] Burgos Gil J.I., Philippon P., Sombra M., Arithmetic geometry of toric varieties. Metrics, measures and heights, Astérisque 360. 2014 . PDF
    [19] Gromov, M. Symmetry, probability, entropy. Entropy 2015. PDF
    [20] Gromov, M. Morse Spectra, Homology Measures, Spaces of Cycles and Parametric Packing Problems, april 2015. PDF
[21] Park, J.-S., Homotopy theory of probability spaces I: classical independence and homotopy Lie algebras. arXiv. 2015
    [22] Baudot P., Bennequin D. The homological nature of entropy. Entropy, 17, 1-66; 2015. PDF
[23] Elbaz-Vincent, P., Gangl, H., Finite polylogarithms, their multiple analogues and the Shannon entropy. (2015) Vol. 9389 Lecture Notes in Computer Science, 277-285, arXiv.
    [24] M. Marcolli, Information algebras and their applications. International Conference on Geometric Science of Information (2015), 271-276
    [25] M. Nguiffo Boyom, Foliations-Webs-Hessian Geometry-Information Geometry-Entropy and Cohomology. Entropy 18(12): 433 (2016) PDF
    [26] M. Nguiffo Boyom, A. Zeglaoui, Amari Functors and Dynamics in Gauge Structures. GSI 2017: 170-178
[27] G.-C. Drummond-Cole, J. Terilla, Homotopy probability theory on a Riemannian manifold and the Euler equation, New York Journal of Mathematics, Volume 23 (2017) 1065-1085. PDF
[28] P. Forré, J.M. Mooij, Constraint-based Causal Discovery for Non-Linear Structural Causal Models with Cycles and Latent Confounders. In A. Globerson & R. Silva (Eds.) (2018), pp. 269-278.
    [29] T. Fritz and P. Perrone, Bimonoidal Structure of Probability Monads. Proceedings of MFPS 34, ENTCS, (2018). PDF
    [30] Jae-Suk Park, Homotopical Computations in Quantum Fields Theory, (2018) arXiv:1810.09100 PDF
    [31] G.C. Drummond-Cole, An operadic approach to operator-valued free cumulants. Higher Structures (2018) 2, 42–56. PDF
    [32] G.C. Drummond-Cole, A non-crossing word cooperad for free homotopy probability theory. MATRIX Book (2018) Series 1, 77–99. PDF
    [33] T. Fritz and P. Perrone, A Probability Monad as the Colimit of Spaces of Finite Samples, Theory and Applications of Categories 34, 2019. PDF.
    [34] M. Esfahanian, A new quantum probability theory, quantum information functor and quantum gravity. (2019) PDF
    [35] T. Leinster, Entropy modulo a prime, (2019) arXiv:1903.06961 PDF
    [36] T. Leinster, E. Roff, The maximum entropy of a metric space, (2019) arXiv:1908.11184 PDF
[37] T. Mainiero, Homological Tools for the Quantum Mechanic. arXiv 2019, arXiv:1901.02011. PDF
    [38] M. Marcolli, Motivic information, Bollettino dell'Unione Matematica Italiana (2019) 12 (1-2), 19-41
    [39] J.P. Vigneaux, Information theory with finite vector spaces, in IEEE Transactions on Information Theory, vol. 65, no. 9, pp. 5674-5687, Sept. (2019)
    [40] Baudot P., Tapia M., Bennequin, D. , Goaillard J.M., Topological Information Data Analysis. (2019), Entropy, 21(9), 869
    [41] Baudot P., The Poincaré-Shannon Machine: Statistical Physics and Machine Learning aspects of Information Cohomology. (2019), Entropy , 21(9),
    [42] G. Sergeant-Perthuis, Bayesian/Graphoid intersection property for factorisation models, (2019), arXiv:1903.06026
    [43] J.P. Vigneaux, Topology of Statistical Systems: A Cohomological Approach to Information Theory, PhD Thesis (2019).
    [44] Forré, P., & Mooij, J. M. (2019). Causal Calculus in the Presence of Cycles, Latent Confounders and Selection Bias. In A. Globerson, & R. Silva (Eds.), Proceedings of the Thirty-Fifth Conference on Uncertainty in Artificial Intelligence: UAI 2019, (2019)
    [45] Y. Manin, M. Marcolli Homotopy Theoretic and Categorical Models of Neural Information Networks. arXiv (2020) preprint arXiv:2006.15136
    [46] T. Leinster The categorical origins of Lebesgue integration (2020) arXiv:2011.00412 PDF
    [47] T. Fritz, T. Gonda, P. Perrone, E. Fjeldgren Rischel, Representable Markov Categories and Comparison of Statistical Experiments in Categorical Probability. (2020) arXiv:2010.07416 PDF
    [48] T. Fritz, E. Fjeldgren Rischel, Infinite products and zero-one laws in categorical probability (2020) arXiv:1912.02769 PDF
    [49] T. Fritz, A synthetic approach to Markov kernels, conditional independence and theorems on sufficient statistics (2020) arXiv:1908.07021 PDF
    [50] T. Fritz and P. Perrone, Stochastic Order on Metric Spaces and the Ordered Kantorovich Monad, Advances in Mathematics 366, 2020. PDF
    [51] T. Fritz and P. Perrone, Monads, partial evaluations, and rewriting. Proceedings of MFPS 36, ENTCS, 2020. PDF.
    [52] D. Bennequin. G. Sergeant-Perthuis, O. Peltre, and J.P. Vigneaux, Extra-fine sheaves and interaction decompositions, (2020) arXiv:2009.12646
    [53] J.P. Vigneaux, Information structures and their cohomology, in Theory and Applications of Categories, Vol. 35, (2020), No. 38, pp 1476-1529.
    [54] O. Peltre, Message-Passing Algorithms and Homology, PhD Thesis (2020), arXiv:2009.11631
    [55] G. Sergeant-Perthuis, Interaction decomposition for presheafs, (2020) arXiv:2008.09029
    [56] K. Hess, Topological adventures in neuroscience, in the Proceedings of the 2018 Abel Symposium: Topological Data Analysis, Springer Verlag, (2020).
    [57] C. Curto, N. Youngs. Neural ring homomorphisms and maps between neural codes. Submitted. preprint.
[58] N.C. Combe, Y. Manin, F-manifolds and geometry of information, arXiv:2004.08808v2, (2020) Bull. London MS.

posted in Sessions GSI2021
  • Geo-Sci-Info

    Topology and geometry in neuroscience

Co-chairs of the session:

• Giovanni Petri: ISI Foundation link
    • Pierre Baudot: Median link

This session will focus on the advances of Algebraic Topology and geometrical methods in neuroscience (see [1-105] below, among many others). The field is currently undergoing an impressive development, coming both:

• from the theoretical neuroscience and machine learning fields, like Graph Neural Networks [30-42], Bayesian geometrical inference [27-29], message passing, probability and cohomology [92-95], Information Topology [53-54,62-66,96-105] or networks [83-85,90-91], and higher-order n-body statistical interactions [67,74,94-95,99,101];

• from topological data analysis applied to real neural recordings, ranging from subcellular [43,51], genetic or omic expressions [81,101], spiking dynamics and neural coding [1-25,45-47,50-52,79], to cortical-area fMRI and EEG [26,67-72,76-80,84-89], linguistics [54-61] and consciousness [48,53,102].
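As a minimal illustration of the clique-topology approach of [8], the following Python snippet (a toy sketch, not code from the cited works) builds the clique complex of a small graph, the combinatorial object on which such analyses compute homology, and checks its Euler characteristic:

```python
from itertools import combinations

def clique_complex(vertices, edges, max_dim=3):
    """Enumerate the simplices of the clique (flag) complex of a graph:
    every set of pairwise-connected vertices spans a simplex."""
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    simplices = []
    for k in range(1, max_dim + 2):  # k vertices span a (k-1)-simplex
        for subset in combinations(vertices, k):
            if all(b in adj[a] for a, b in combinations(subset, 2)):
                simplices.append(subset)
    return simplices

# A 4-cycle: its clique complex has no triangles, hence a 1-dimensional hole.
verts = [0, 1, 2, 3]
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
cx = clique_complex(verts, edges)

# Euler characteristic: chi = #vertices - #edges + #triangles - ...
chi = sum((-1) ** (len(s) - 1) for s in cx)
print(chi)  # prints 0, consistent with one component and one loop (b0 = b1 = 1)
```

In applications, the graph would come from thresholding a pairwise correlation matrix of neural recordings, and dedicated libraries (e.g. the giotto-tda toolkit of [44]) would compute persistent homology across all thresholds rather than a single Euler characteristic.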

    Bibliographical references: (to be completed)

Carina Curto, Nora Youngs, Vladimir Itskov and colleagues:

    [1] C. Curto, N. Youngs. Neural ring homomorphisms and maps between neural codes. Submitted. preprint.
    [2] C. Curto, J. Geneson, K. Morrison. Fixed points of competitive threshold-linear networks. Neural Computation, in press, 2019. preprint.
    [3] C. Curto, A. Veliz-Cuba, N. Youngs. Analysis of combinatorial neural codes: an algebraic approach. Book chapter in Algebraic and Combinatorial Computational Biology. R. Robeva, M. Macaulay (Eds), 2018.
    [4] C. Curto, V. Itskov. Combinatorial neural codes. Handbook of Discrete and Combinatorial Mathematics, Second Edition, edited by Kenneth H. Rosen, CRC Press, 2018. pdf
    [5] C. Curto, E. Gross, J. Jeffries, K. Morrison, M. Omar, Z. Rosen, A. Shiu, N. Youngs. What makes a neural code convex? SIAM J. Appl. Algebra Geometry, vol. 1, pp. 222-238, 2017. pdf, SIAGA link, and preprint
[6] C. Curto. What can topology tell us about the neural code? Bulletin of the AMS, vol. 54, no. 1, pp. 63-78, 2017. pdf, Bulletin link.
    [7] C. Curto, K. Morrison. Pattern completion in symmetric threshold-linear networks. Neural Computation, Vol 28, pp. 2825-2852, 2016. pdf, preprint.
    [8] C. Giusti, E. Pastalkova, C. Curto, V. Itskov. Clique topology reveals intrinsic geometric structure in neural correlations. PNAS, vol. 112, no. 44, pp. 13455-13460, 2015. pdf, PNAS link.
    [9] C. Curto, A. Degeratu, V. Itskov. Encoding binary neural codes in networks of threshold-linear neurons. Neural Computation, Vol 25, pp. 2858-2903, 2013. pdf, preprint.
    [10] K. Morrison, C. Curto. Predicting neural network dynamics via graphical analysis. Book chapter in Algebraic and Combinatorial Computational Biology. R. Robeva, M. Macaulay (Eds), 2018. preprint,
    [11] C. Curto, V. Itskov, A. Veliz-Cuba, N. Youngs. The neural ring: an algebraic tool for analyzing the intrinsic structure of neural codes. Bulletin of Mathematical Biology, Volume 75, Issue 9, pp. 1571-1611, 2013. preprint.
    [12] C. Curto, V. Itskov, K. Morrison, Z. Roth, J.L. Walker. Combinatorial neural codes from a mathematical coding theory perspective. Neural Computation, Vol 25(7):1891-1925, 2013. preprint.
    [13] C. Curto, A. Degeratu, V. Itskov. Flexible memory networks. Bulletin of Mathematical Biology, Vol 74(3):590-614, 2012. preprint.
    [14] V. Itskov, C. Curto, E. Pastalkova, G. Buzsaki. Cell assembly sequences arising from spike threshold adaptation keep track of time in the hippocampus. Journal of Neuroscience, Vol. 31(8):2828-2834, 2011.
    [15] K.D. Harris, P. Bartho, et al.. How do neurons work together? Lessons from auditory cortex. Hearing Research, Vol. 271(1-2), 2011, pp. 37-53.
    [16] P. Bartho, C. Curto, A. Luczak, S. Marguet, K.D. Harris. Population coding of tone stimuli in auditory cortex: dynamic rate vector analysis. European Journal of Neuroscience, Vol. 30(9), 2009, pp. 1767-1778.
    [17] C. Curto, V. Itskov. Cell groups reveal structure of stimulus space. PLoS Computational Biology, Vol. 4(10): e1000205, 2008.
    [18] E. Gross , N. K. Obatake , N. Youngs, Neural ideals and stimulus space visualization, Adv. Appl.Math., 95 (2018), pp. 65–95.
    [19] C. Giusti, V. Itskov. A no-go theorem for one-layer feedforward networks. Neural Computation, 26 (11):2527-2540, 2014.
    [20] V. Itskov, L.F. Abbott. Capacity of a Perceptron for Sparse Discrimination . Phys. Rev. Lett. 101(1), 2008.
    [21] V. Itskov, E. Pastalkova, K. Mizuseki, G. Buzsaki, K.D. Harris. Theta-mediated dynamics of spatial information in hippocampus. Journal of Neuroscience, 28(23), 2008.
    [22] V. Itskov, C. Curto, K.D. Harris. Valuations for spike train prediction. Neural Computation, 20(3), 644-667, 2008.
    [23] E. Pastalkova, V. Itskov , A. Amarasingham , G. Buzsaki. Internally Generated Cell Assembly Sequences in the Rat Hippocampus. Science 321(5894):1322 - 1327, 2008.
    [24] V. Itskov, A. Kunin, Z. Rosen. Hyperplane neural codes and the polar complex. To appear in the Abel Symposia proceedings, Vol. 15, 2019.

    Alexander Ruys de Perez and colleagues:

    [25] A. Ruys de Perez, L.F. Matusevich, A. Shiu, Neural codes and the factor complex, Advances in Applied Mathematics 114 (2020).

    Sunghyon Kyeong and colleagues:

    [26] Sunghyon Kyeong, Seonjeong Park, Keun-Ah Cheon, Jae-Jin Kim, Dong-Ho Song, and Eunjoo Kim, A New Approach to Investigate the Association between Brain Functional Connectivity and Disease Characteristics of Attention-Deficit/Hyperactivity Disorder: Topological Neuroimaging Data Analysis, PLOS ONE, 10 (9): e0137296, DOI: 10.1371/journal.pone.0137296 (2015)

    Jonathan Pillow and colleagues:

    [27] Aoi MC & Pillow JW (2017). Scalable Bayesian inference for high-dimensional neural receptive fields. bioRxiv 212217; doi:
    [28] Aoi MC, Mante V, & Pillow JW. (2020). Prefrontal cortex exhibits multi-dimensional dynamic encoding during decision-making. Nat Neurosci.
[29] Calhoun AJ, Pillow JW, & Murthy M. (2019). Unsupervised identification of the internal states that shape natural behavior. Nature Neuroscience 22:2040-2049.
    [30] Dong X, Thanou D, Toni L, et al., 2020, Graph Signal Processing for Machine Learning: A Review and New Perspectives, Ieee Signal Processing Magazine, Vol:37, ISSN:1053-5888, Pages:117-127

    Michael Bronstein, Federico Monti, Giorgos Bouritsas and colleagues:

    [31] G. Bouritsas, F. Frasca, S Zafeiriou, MM Bronstein, Improving graph neural network expressivity via subgraph isomorphism counting. arXiv (2020) preprint arXiv:2006.09252
    [32] M. Bronstein , G. Pennycook, L. Buonomano, T.D. Cannon, Belief in fake news, responsiveness to cognitive conflict, and analytic reasoning engagement, Thinking and Reasoning (2020), ISSN: 1354-6783
    [33] X. Dong, D. Thanou, L. Toni, M. Bronstein, P. Frossard, Graph Signal Processing for Machine Learning: A Review and New Perspectives, IEEE Signal Processing Magazine (2020), Vol: 37, Pages: 117-127, ISSN: 1053-5888
    [34] Y. Wang, Y. Sun, Z. Liu, S.E. Sarma, M. Bronstein, J.M. Solomon, Dynamic Graph CNN for Learning on Point Clouds, ACM Transactions on graphics (2020), Vol: 38, ISSN: 0730-0301
    [35] M. Bronstein, J. Everaert, A. Castro, J. Joormann, T. D. Cannon, Pathways to paranoia: Analytic thinking and belief flexibility., Behav Res Ther (2019), Vol: 113, Pages: 18-24
    [36] G. Bouritsas, S. Bokhnyak, S. Ploumpis, M. Bronstein, S. Zafeiriou, Neural 3D Morphable Models: Spiral Convolutional Networks for 3D Shape Representation Learning and Generation, (2019) IEEE/CVF ICCV 2019, 7212
    [37] O. Litany, A. Bronstein, M. Bronstein, A. Makadia et al., Deformable Shape Completion with Graph Convolutional Autoencoders (2018), Pages: 1886-1895, ISSN: 1063-6919
    [38] R. Levie, F. Monti, X. Bresson X, M. Bronstein, CayleyNets: Graph Convolutional Neural Networks with Complex Rational Spectral Filters, IEEE Transactions on Signal Processing (2018), Vol: 67, Pages: 97-109, ISSN: 1053-587X
    [39] F. Monti, K. Otness, M. Bronstein, Motifnet: a motif-based graph convolutional network for directed graphs (2018), Pages: 225-228
    [40] F. Monti, M. Bronstein, X. Bresson, Geometric matrix completion with recurrent multi-graph neural networks, Neural Information Processing Systems (2017), Pages: 3700-3710, ISSN: 1049-5258
    [41] F. Monti F, D. Boscaini, J. Masci, E. Rodola, J. Svoboda, M. Bronstein, Geometric deep learning on graphs and manifolds using mixture model CNNs, (2017) IEEE Conference on Computer Vision and Pattern Recognition, p: 3-3
    [42] M. Bronstein, J. Bruna, Y. LeCun, A. Szlam, P. Vandergheynst et al., Geometric Deep Learning Going beyond Euclidean data, IEEE Signal Processing Magazine (2017), Vol: 34, Pages: 18-42, ISSN: 1053-5888

    Kathryn Hess and colleagues:

    [43] L. Kanari, H. Dictus, W. Van Geit, A. Chalimourda, B. Coste, J. Shillcock, K. Hess, and H. Markram, Computational synthesis of cortical dendritic morphologies, bioRvix (2020) 10.1101/2020.04.15.040410, submitted.
[44] G. Tauzin, U. Lupo, L. Tunstall, J. Burella Pérez, M. Caorsi, A. Medina-Mardones, A. Dassatti, and K. Hess, giotto-tda: a topological data analysis toolkit for machine learning and data exploration, arXiv:2004.02551
    [45] E. Mullier, J. Vohryzek, A. Griffa, Y. Alemàn-Gómez, C. Hacker, K. Hess, and P. Hagmann, Functional brain dynamics are shaped by connectome n-simplicial organization, (2020) submitted.
[46] M. Fournier, M. Scolamiero, et al., Topology predicts long-term functional outcome in early psychosis, Molecular Psychiatry (2020).
    [47] K. Hess, Topological adventures in neuroscience, in the Proceedings of the 2018 Abel Symposium: Topological Data Analysis, Springer Verlag, (2020).
    [48] A. Doerig, A. Schurger, K. Hess, and M. H. Herzog, The unfolding argument: why IIT and other causal structure theories of consciousness are empirically untestable, Consciousness and Cognition 72 (2019) 49-59.
    [49] L. Kanari, S. Ramaswamy, et al., Objective classification of neocortical pyramidal cells, Cerebral Cortex (2019) bhy339,
    [50] J.-B. Bardin, G. Spreemann, K. Hess, Topological exploration of artificial neuronal network dynamics, Network Neuroscience (2019)
    [51] L. Kanari, P. Dłotko, M. Scolamiero, R. Levi, J. C. Shillcock, K. Hess, and H. Markram, A topological representation of branching morphologies, Neuroinformatics (2017) doi: 10.1007/s12021-017-9341-1.
    [52] M. W. Reimann, M. Nolte,et al., Cliques of neurons bound into cavities provide a missing link between structure and function, Front. Comput. Neurosci., 12 June (2017), doi: 10.3389/fncom.2017.00048.

Matilde Marcolli, Yuri Manin, and colleagues:

    [53] Y. Manin, M. Marcolli Homotopy Theoretic and Categorical Models of Neural Information Networks. arXiv (2020) preprint arXiv:2006.15136
    [54] M. Marcolli, Lumen Naturae: Visions of the Abstract in Art and Mathematics, MIT Press (2020)
    [55] A. Port, T. Karidi, M. Marcolli, Topological Analysis of Syntactic Structures (2019) arXiv preprint arXiv:1903.05181
    [56] M. Marcolli, Motivic information, Bollettino dell'Unione Matematica Italiana (2019) 12 (1-2), 19-41
    [57] A. Port, I. Gheorghita, D. Guth, J.M. Clark, C. Liang, S. Dasu, M. Marcolli, Persistent topology of syntax, Mathematics in Computer Science (2018) 12 (1), 33-50 20
    [58] K. Shu, S. Aziz, VL Huynh, D Warrick, M Marcolli, Syntactic phylogenetic trees, Foundations of Mathematics and Physics One Century After Hilbert (2018), 417-441
    [59] K. Shu, A. Ortegaray, R Berwick, M Marcolli Phylogenetics of Indo-European language families via an algebro-geometric analysis of their syntactic structures. arXiv (2018) preprint arXiv:1712.01719
    [60] K. Shu, M. Marcolli, Syntactic structures and code parameters Mathematics in Computer Science (2018) 11 (1), 79-90
    [61] K Siva, J Tao, M Marcolli. Syntactic Parameters and Spin Glass Models of Language Change Linguist. Anal (2017) 41 (3-4), 559-608
    [62] M. Marcolli, N. Tedeschi, Entropy algebras and Birkhoff factorization. Journal of Geometry and Physics (2015) 97, 243-265
    [63] M. Marcolli, Information algebras and their applications. International Conference on Geometric Science of Information (2015), 271-276
    [64] K. Siva, J. Tao, M. Marcolli Spin glass models of syntax and language evolution, arXiv preprint (2015) arXiv:1508.00504
    [65] Y. Manin, M. Marcolli, Kolmogorov complexity and the asymptotic bound for error-correcting codes Journal of Differential Geometry (2014) 97 (1), 91-108
    [66] M. Marcolli, R. Thorngren, Thermodynamic semirings, ArXiv preprint (2011) arXiv:1108.2874

Bosiljka Tadić and colleagues:

[67] M. Andjelkovic, B. Tadic, R. Melnik, The topology of higher-order complexes associated with brain-function hubs in human connectomes, Scientific Reports 10:17320 (2020)
    [68] B. Tadic, M. Andjelkovic, M. Suvakov, G.J. Rodgers, Magnetisation Processes in Geometrically Frustrated Spin Networks with Self-Assembled Cliques, Entropy 22(3), 336 (2020)
[69] B. Tadic, M. Andjelkovic, R. Melnik, Functional Geometry of Human Connectomes, Scientific Reports 9:12060 (2019); previous version: Functional Geometry of Human Connectome and Robustness of Gender Differences, arXiv:1904.03399, April 2019
    [70] B. Tadic, M. Andjelkovic, M. Suvakov, Origin of hyperbolicity in brain-to-brain coordination networks, FRONTIERS in PHYSICS vol.6, ARTICLE{10.3389/fphy.2018.00007}, (2018) OA
    [71] B. Tadic, M. Andjelkovic, Algebraic topology of multi-brain graphs: Methods to study the social impact and other factors onto functional brain connections, in Proceedings of BELBI (2016)
    [72] B. Tadic, M. Andjelkovic, B.M. Boskoska, Z. Levnajic, Algebraic Topology of Multi-Brain Connectivity Networks Reveals Dissimilarity in Functional Patterns during Spoken Communications, PLOS ONE Vol 11(11), e0166787 (2016)
    [73] M. Mitrovic and B. Tadic, Search for Weighted Subgraphs on Complex Networks with MLM, Lecture Notes in Computer Science, Vol. 5102 pp. 551-558 (2008)

    Giovanni Petri, Francesco Vaccarino and collaborators:

    [74] F. Battiston, G. Cencetti, et al., Networks beyond pairwise interactions: structure and dynamics, Physics Reports (2020), arXiv:2006.01764
    [75] M. Guerra, A. De Gregorio, U. Fugacci, G. Petri, F. Vaccarino, Homological scaffold via minimal homology bases. arXiv (2020) preprint arXiv:2004.11606
    [76] J. Billings, R. Tivadar, M.M. Murray, B. Franceschiello, G. Petri, Topological Features of Electroencephalography are Reference-Invariant, bioRxiv 2020
    [77] J. Billings, M. Saggar, S. Keilholz, G. Petri, Topological Segmentation of Time-Varying Functional Connectivity Highlights the Role of Preferred Cortical Circuits, bioRxiv 2020
    [78] E. Ibáñez-Marcelo, L. Campioni, et al., Topology highlights mesoscopic functional equivalence between imagery and perception: The case of hypnotizability. NeuroImage (2019) 200, 437-449
    [79] P. Expert, L.D. Lord, M.L. Kringelbach, G. Petri. Topological neuroscience. Network Neuroscience (2019) 3 (3), 653-655
    [80] C. Geniesse, O. Sporns, G. Petri, M. Saggar, Generating dynamical neuroimaging spatiotemporal representations (DyNeuSR) using topological data analysis. Network Neuroscience (2019) 3 (3), 763-778
    [81] A. Patania, P. Selvaggi, M. Veronese, O. Dipasquale, P. Expert, G. Petri, Topological gene expression networks recapitulate brain anatomy and function. Network Neuroscience (2019) 3 (3), 744-762
[82] E. Ibáñez‐Marcelo, L. Campioni, et al., Spectral and topological analyses of the cortical representation of the head position: Does hypnotizability matter? Brain and Behavior (2018) 9 (6), e01277
    [83] G. Petri, A. Barrat, Simplicial activity driven model, Physical review letters 121 (22), 228301
    [84] A. Phinyomark, E. Ibanez-Marcelo, G. Petri. Resting-state fmri functional connectivity: Big data preprocessing pipelines and topological data analysis. IEEE Transactions on Big Data (2017) 3 (4), 415-428
    [85] G. Petri, S. Musslick, B. Dey, K. Ozcimder, D. Turner, N.K. Ahmed, T. Willke. Topological limits to parallel processing capability of network architectures. arXiv preprint (2017) arXiv:1708.03263
    [86] K. Ozcimder, B. Dey, S. Musslick, G. Petri, N.K. Ahmed, T.L. Willke, J.D. Cohen, A Formal Approach to Modeling the Cost of Cognitive Control, arXiv preprint (2017) arXiv:1706.00085
    [87] L.D. Lord, P. Expert, et al. , Insights into brain architectures from the homological scaffolds of functional connectivity networks, Frontiers in systems neuroscience (2016) 10, 85
    [88] J. Binchi, E. Merelli, M. Rucco, G. Petri, F. Vaccarino. jHoles: A Tool for Understanding Biological Complex Networks via Clique Weight Rank Persistent Homology. Electron. Notes Theor. Comput. Sci. (2014) 306, 5-18
    [89] G. Petri, P. Expert, F. Turkheimer, R. Carhart-Harris, D. Nutt, P.J. Hellyer et al., Homological scaffolds of brain functional networks. Journal of The Royal Society Interface (2014) 11 (101), 20140873
    [90] G. Petri, M. Scolamiero, I. Donato, F. Vaccarino, Topological strata of weighted complex networks. PloS one (2013) 8 (6), e66506
[91] G. Petri, M. Scolamiero, I. Donato, et al., Networks and cycles: a persistent homology approach to complex networks. Proceedings of the European Conference on Complex Systems (2013), 93-99

    Daniel Bennequin, Juan-Pablo Vigneaux, Olivier Peltre, Pierre Baudot and colleagues:

    [92] D. Bennequin. G. Sergeant-Perthuis, O. Peltre, and J.P. Vigneaux, Extra-fine sheaves and interaction decompositions, (2020) arXiv:2009.12646
    [93] O. Peltre, Message-Passing Algorithms and Homology, PhD Thesis (2020), arXiv:2009.11631
    [94] G. Sergeant-Perthuis, Interaction decomposition for presheafs, (2020) arXiv:2008.09029
    [95] G. Sergeant-Perthuis, Bayesian/Graphoid intersection property for factorisation models, (2019), arXiv:1903.06026
    [96] J.P. Vigneaux, Topology of Statistical Systems: A Cohomological Approach to Information Theory, PhD Thesis (2019).
    [97] J.P. Vigneaux, Information structures and their cohomology, in Theory and Applications of Categories, Vol. 35, (2020), No. 38, pp 1476-1529.
    [98] J.P. Vigneaux, Information theory with finite vector spaces, in IEEE Transactions on Information Theory, vol. 65, no. 9, pp. 5674-5687, Sept. (2019)
    [99] Baudot P., Tapia M., Bennequin, D. , Goaillard J.M., Topological Information Data Analysis. (2019), Entropy, 21(9), 869
    [100] Baudot P., The Poincaré-Shannon Machine: Statistical Physics and Machine Learning aspects of Information Cohomology. (2019), Entropy , 21(9),
[101] Tapia M., Baudot P., et al. Neurotransmitter identity and electrophysiological phenotype are genetically coupled in midbrain dopaminergic neurons. Scientific Reports. (2018). bioRxiv 168740
    [102] Baudot P., Elements of qualitative cognition: an Information Topology Perspective. Physics of Life Reviews. (2019) Arxiv. arXiv:1807.04520
    [103] Baudot P., Bennequin D., The homological nature of entropy. Entropy, (2015), 17, 1-66; doi:10.3390
    [104] D. Bennequin. Remarks on Invariance in the Primary Visual Systems of Mammals, pages 243–333. Neuromathematics of Vision Part of the series Lecture Notes in Morphogenesis Springer, 2014.
    [105] Baudot P., Bennequin D., Information Topology I and II. Random models in Neuroscience (2012)

posted in Sessions GSI2021
  • Geo-Sci-Info

Informatics Institute, University of Amsterdam and Qualcomm Technologies;
ELLIS Board Member (European Laboratory for Learning and Intelligent Systems)

    Title: Exploring Quantum Statistics for Machine Learning

    Abstract: Quantum mechanics represents a rather bizarre theory of statistics that is very different from the ordinary classical statistics that we are used to. In this talk I will explore if there are ways that we can leverage this theory in developing new machine learning tools: can we design better neural networks by thinking about entangled variables? Can we come up with better samplers by viewing them as observations in a quantum system? Can we generalize probability distributions? We hope to develop better algorithms that can be simulated efficiently on classical computers, but we will naturally also consider the possibility of much faster implementations on future quantum computers. Finally, I hope to discuss the role of symmetries in quantum theories.
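One way to see how quantum statistics generalizes classical probability, a theme of the abstract above, is through density matrices: diagonal ones reduce to classical distributions, while off-diagonal terms encode genuinely quantum states. The sketch below (an illustrative toy, not code from the talk or the paper cited next) computes the von Neumann entropy of a 2x2 density matrix in closed form:

```python
import math

def shannon(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def von_neumann_2x2(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho) of a real symmetric
    2x2 density matrix (trace 1), via its closed-form eigenvalues."""
    (a, b), (c, d) = rho
    mean = (a + d) / 2
    radius = math.sqrt(((a - d) / 2) ** 2 + b * c)
    return shannon([mean + radius, mean - radius])

# Diagonal density matrix = classical distribution: the entropies coincide.
assert abs(von_neumann_2x2([[0.7, 0], [0, 0.3]]) - shannon([0.7, 0.3])) < 1e-12

# A pure state |+><+| has a uniform diagonal yet zero entropy: the off-diagonal
# coherences make the state perfectly determined, with no classical analogue.
plus = [[0.5, 0.5], [0.5, 0.5]]
assert abs(von_neumann_2x2(plus)) < 1e-12
```

The gap between the diagonal of a density matrix and its eigenvalue spectrum is exactly the room quantum statistics adds over classical probability.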

Roberto Bondesan, Max Welling, Quantum Deformed Neural Networks, arXiv:2010.11189v1 [quant-ph], 21 October 2020.

posted in GSI2021
  • Geo-Sci-Info


Welcome to the “Geometric Science of Information” 2021 Conference

On behalf of both the organizing and the scientific committees, it is our great pleasure to welcome all delegates, representatives and participants from around the world to the fifth international SEE conference on “Geometric Science of Information” (GSI’21), scheduled for July 2021.

GSI’21 benefits from scientific and financial sponsors.

The 3-day conference is also organized within the framework of the relations established between SEE and scientific institutions and academic laboratories such as Ecole Polytechnique, Ecole des Mines ParisTech, INRIA, CentraleSupélec, Institut de Mathématiques de Bordeaux, and Sony Computer Science Laboratories.

The GSI conference cycle was initiated by the Brillouin Seminar Team as early as 2009. GSI’21 continues the initiatives launched in 2013 at Mines ParisTech, consolidated in 2015 at Ecole Polytechnique, and opened to new communities in 2017 at Mines ParisTech and in 2019 at ENAC Toulouse. In 2011 we organized an Indo-French workshop on “Matrix Information Geometry” that yielded an edited book in 2013, and in 2017 we collaborated on the CIRM seminar TGSI’17 “Topological & Geometrical Structures of Information” in Luminy. The GSI’19 proceedings were published by Springer in its Lecture Notes series.

GSI satellite events were organized in 2019 and 2020: FGSI’19 “Foundation of Geometric Science of Information” in Montpellier, and the Les Houches seminar SPIGL’20 “Joint Structures and Common Foundations of Statistical Physics, Information Geometry and Inference for Learning”.

The technical program of GSI’21 covers all the main topics and highlights in the domain of “Geometric Science of Information”, including information geometry manifolds of structured data/information and their advanced applications. These proceedings consist solely of original research papers that have been carefully peer-reviewed by two or three experts and revised before acceptance.

    Historical background

As with GSI’13, GSI’15, GSI’17 and GSI’19, GSI’21 addresses inter-relations between different mathematical domains like shape spaces (geometric statistics on manifolds and Lie groups, deformations in shape space, ...), probability/optimization & algorithms on manifolds (structured matrix manifolds, structured data/information, ...), relational and discrete metric spaces (graph metrics, distance geometry, relational analysis, ...), computational and Hessian information geometry, geometric structures in thermodynamics and statistical physics, algebraic/infinite-dimensional/Banach information manifolds, divergence geometry, tensor-valued morphology, optimal transport theory, manifold & topology learning, ... and applications like geometries of audio processing, inverse problems and signal/image processing. GSI’21 topics were enriched with contributions from Lie Group Machine Learning, Harmonic Analysis on Lie Groups, Geometric Deep Learning, Geometry of Hamiltonian Monte Carlo, Geometric & (Poly)Symplectic Integrators, Contact Geometry & Hamiltonian Control, Geometric and structure preserving discretizations, Probability Density Estimation & Sampling in High Dimension, Geometry of Graphs and Networks and Geometry in Neuroscience & Cognitive Sciences.

    At the turn of the century, new and fruitful interactions were discovered between several branches of science: Information Science (information theory, digital communications, statistical signal processing, ...), Mathematics (group theory, geometry and topology, probability, statistics, sheaf theory, ...) and Physics (geometric mechanics, thermodynamics, statistical physics, quantum mechanics, ...). The GSI conference cycle is an attempt to discover mathematical structures common to all these disciplines by elaborating a “General Theory of Information” embracing physical science, information science, and cognitive science in a global scheme.

    Frank Nielsen, co-chair: Ecole Polytechnique, Palaiseau, France; Sony Computer Science Laboratories, Tokyo, Japan


    Frédéric Barbaresco, co-chair: President of the SEE ISIC Club (Ingénierie des Systèmes d'Information et de Communications),
    representative of the KTD PCC (Key Technology Domain / Processing, Computing & Cognition) Board, THALES LAND & AIR SYSTEMS, France




    As for GSI’13, GSI’15, GSI’17 and GSI’19, the objective of this SEE GSI’21 conference, hosted in PARIS, is to bring together pure/applied mathematicians and engineers, with common interest for Geometric tools and their applications for Information analysis.
    It emphasizes the active participation of young researchers in discussing emerging areas of collaborative research on “Geometric Science of Information and its Applications”.
    Current and ongoing uses of Information Geometry Manifolds in applied mathematics are the following: Advanced Signal/Image/Video Processing, Complex Data Modeling and Analysis, Information Ranking and Retrieval, Coding, Cognitive Systems, Optimal Control, Statistics on Manifolds, Topology/Machine/Deep Learning, Artificial Intelligence, Speech/sound recognition, natural language treatment, Big Data Analytics, Learning for Robotics, etc., which are substantially relevant for industry.
    The conference will therefore address topics of mutual interest, with the aim to:
    • Provide an overview on the most recent state-of-the-art
    • Exchange mathematical information/knowledge/expertise in the area
    • Identify research areas/applications for future collaboration

    Provisional topics of interests:

    • Geometric Deep Learning (ELLIS session)
    • Probability on Riemannian Manifolds
    • Optimization on Manifold
    • Shape Space & Statistics on non-linear data
    • Lie Group Machine Learning
    • Harmonic Analysis on Lie Groups
    • Statistical Manifold & Hessian Information Geometry
    • Monotone Embedding in Information Geometry
    • Non-parametric Information Geometry
    • Computational Information Geometry
    • Distance and Divergence Geometries
    • Divergence Statistics
    • Optimal Transport & Learning
    • Geometry of Hamiltonian Monte Carlo
    • Statistics, Information & Topology
    • Graph Hyperbolic Embedding & Learning
    • Inverse problems: Bayesian and Machine Learning interaction
    • Integrable Systems & Information Geometry
    • Geometric structures in thermodynamics and statistical physics
    • Contact Geometry & Hamiltonian Control
    • Geometric and structure preserving discretizations
    • Geometric & Symplectic Methods for Quantum Systems
    • Geometry of Quantum States
    • Geodesic Methods with Constraints
    • Probability Density Estimation & Sampling in High Dimension
    • Geometry of Tensor-Valued Data
    • Geometric Mechanics
    • Geometric Robotics & Learning
    • Topological and geometrical structures in neurosciences

    A special session will deal with:

    • Geometric Structures Coding & Learning Libraries (geomstats, pyRiemann, POT, ...)

    Advanced information on article submission and publication
    As for previous editions, GSI’21 Proceedings will be published in SPRINGER LNCS. See GSI’19 Proceedings
    An 8-page SPRINGER LNCS format is required for the initial paper submission.
    A detailed call for contributions will be published shortly.





    Ph.D. and Postdoc positions in Applied Mathematics

    Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany

    Application deadline: January 10th, 2021

    The Group

    The Chair of Applied Analysis – Alexander von Humboldt Professorship at the Department of Mathematics of the Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU), in Erlangen (Germany), led by Prof. Dr. Enrique Zuazua, is looking for outstanding candidates to fill several

    Ph.D. and Postdoctoral positions

    In the broad area of Applied Mathematics, the Chair develops and applies methods of Analysis, Computational Mathematics and Data Sciences to model, understand and control the dynamics of various phenomena arising in the interphase of Mathematics with Engineering, Physics, Biology and Social Sciences.

    We welcome applications by young and highly motivated scientists to contribute to this exciting joint AvH-FAU effort. Possible research projects include but are not limited to:

    • Analysis of Partial Differential Equations (PDE).
    • The interplay between Data Sciences, numerics of PDE and Control Systems.
    • Control of diffusion models arising in Biology and Social Sciences.
    • Modelling and control of multi-agent systems.
    • Hyperbolic models arising in traffic flow and energy transport.
    • Waves in networks and Markov chains.
    • Fractional PDE.
    • Optimal design in Material Sciences.
    • Micro-macro limit processes.
    • The interplay between discrete and continuous modelling in design and control.
    • The emergence of turnpike phenomena in long-time horizons.
    • Inversion and parameter identification.
    • Recommendation systems.
    • Development of new computation tools and software.

    We look for excellent candidates with expertise in the areas of applied mathematics, PDE analysis, control theory, numerical analysis, data sciences and computational mathematics who enjoy interdisciplinary work.

    The Chair contributes to the development of a new Center of Research at FAU, in the broad area of “Mathematics of Data”, conceived as a highly visible interdisciplinary research site, an incubator for future collaborative research grants and a turntable for the key research priorities of FAU. The recruited candidates will have the added opportunity to participate in this challenging endeavour.

    How to apply

    Applications, including a cover/motivation letter, curriculum vitae, list of publications, statement of research, and two or three names of experts for reference should be submitted via e-mail as a single pdf file to secretary-aa[at] before January 10th, 2021.

    Any inquiries about the positions should be sent to Prof. Enrique Zuazua (positions-aa[at] Applications will be accepted until the positions are filled.

    FAU is a member of “The Family in Higher Education Institutions” best practice club and also aims to increase the number of women in scientific positions. Female candidates are therefore particularly encouraged to apply. In case of equal qualifications, candidates with disabilities will take precedence.

    For more detailed information about the Chair, please visit Chair of Applied Analysis – Alexander von Humboldt Professorship


    GeomLoss : Geometric Loss functions between sampled measures, images and volumes

    Find all the docs and tutorials of the version 0.2.3 in the read the docs website:

    N.B.: This is still an alpha release! Please send me your feedback: I will polish the user interface, implement Hausdorff divergences, add support for meshes, images, volumes and clean the documentation over the summer of 2020.

    The GeomLoss library provides efficient GPU implementations for:

    • Kernel norms (also known as Maximum Mean Discrepancies).

    • Hausdorff divergences, which are positive definite generalizations of the ICP loss, analogous to log-likelihoods of Gaussian Mixture Models.

    • Unbiased Sinkhorn divergences, which are cheap yet positive definite approximations of Optimal Transport (Wasserstein) costs.

    These loss functions, defined between positive measures, are available through the custom PyTorch layers SamplesLoss, ImagesLoss and VolumesLoss which allow you to work with weighted point clouds (of any dimension), density maps and volumetric segmentation masks. Geometric losses come with three backends each:

    • A simple tensorized implementation, for small problems (< 5,000 samples).

    • A reference online implementation, with a linear (instead of quadratic) memory footprint, that can be used for finely sampled measures.

    • A very fast multiscale code, which uses an octree-like structure for large-scale problems in dimension <= 3.

    GeomLoss is a simple interface for cutting-edge Optimal Transport algorithms. It provides:

    • Support for batchwise computations.
    • Linear (instead of quadratic) memory footprint for large problems, relying on the KeOps library for map-reduce operations on the GPU.
    • Fast kernel truncation for small bandwidths, using an octree-based structure.
    • Log-domain stabilization of the Sinkhorn iterations, eliminating numeric overflows for small values of 𝜀
    • Efficient computation of the gradients, which bypasses the naive backpropagation algorithm.
    • Support for unbalanced Optimal Transport, with a softening of the marginal constraints through a maximum reach parameter.
    • Support for the ε-scaling heuristic in the Sinkhorn loop, with kernel truncation in dimensions 1, 2 and 3. On typical 3D problems, our implementation is 50-100 times faster than the standard SoftAssign/Sinkhorn algorithm.

    Note, however, that SamplesLoss does not implement the Fast Multipole or Fast Gauss transforms. If you are aware of a well-packaged implementation of these algorithms on the GPU, please contact me!

    The divergences implemented here are all symmetric, positive definite and therefore suitable for measure-fitting applications. For positive input measures 𝛼 and 𝛽, our Loss functions are such that
    Loss(𝛼,𝛽) = Loss(𝛽,𝛼),
    0 = Loss(𝛼,𝛼) ⩽ Loss(𝛼,𝛽),
    0 = Loss(𝛼,𝛽) ⟺ 𝛼=𝛽.
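
    The three properties above follow from the debiasing formula S(𝛼,𝛽) = OT_ε(𝛼,𝛽) − ½·OT_ε(𝛼,𝛼) − ½·OT_ε(𝛽,𝛽). As a rough illustration, here is a minimal NumPy sketch of an unbiased Sinkhorn divergence between weighted point clouds, using the log-domain Sinkhorn iterations mentioned above. This is a toy re-implementation for intuition, not the GeomLoss code; the function names (`ot_eps`, `sinkhorn_divergence`) are ours, and a real application should use `geomloss.SamplesLoss` instead.

    ```python
    import numpy as np

    def _lse(M, axis):
        # numerically stable log-sum-exp along the given axis
        m = M.max(axis=axis, keepdims=True)
        return (m + np.log(np.exp(M - m).sum(axis=axis, keepdims=True))).squeeze(axis)

    def ot_eps(a, x, b, y, eps=0.05, iters=500):
        """Entropic OT cost OT_eps between clouds (a, x) and (b, y),
        with cost |x_i - y_j|^2, via log-domain Sinkhorn iterations."""
        C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)  # cost matrix
        f, g = np.zeros(len(a)), np.zeros(len(b))           # dual potentials
        log_a, log_b = np.log(a), np.log(b)
        for _ in range(iters):
            # soft-min updates; working in the log domain avoids overflow for small eps
            f = -eps * _lse((g[None, :] - C) / eps + log_b[None, :], axis=1)
            g = -eps * _lse((f[:, None] - C) / eps + log_a[:, None], axis=0)
        # at convergence the entropic correction vanishes: OT_eps = <a,f> + <b,g>
        return float(a @ f + b @ g)

    def sinkhorn_divergence(a, x, b, y, eps=0.05):
        # debiased divergence: S(a,b) = OT(a,b) - OT(a,a)/2 - OT(b,b)/2
        return (ot_eps(a, x, b, y, eps)
                - 0.5 * ot_eps(a, x, a, x, eps)
                - 0.5 * ot_eps(b, y, b, y, eps))
    ```

    On two small uniform point clouds one can check numerically that S(𝛼,𝛼) = 0 by construction, S(𝛼,𝛽) = S(𝛽,𝛼) up to solver tolerance, and S(𝛼,𝛽) > 0 when the clouds differ.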

    GeomLoss can be used in a wide variety of settings, from shape analysis (LDDMM, optimal transport…) to machine learning (kernel methods, GANs…) and image processing. Details and examples are provided below:

    GeomLoss is licensed under the MIT license.

    Author and Contributors

    Feel free to contact us for any bug report or feature request:

    Related projects

    You may be interested by:

    • The KeOps library, which provides efficient CUDA routines for point cloud processing, with full PyTorch support.

    • Rémi Flamary and Nicolas Courty’s Python Optimal Transport library, which provides a reference implementation of OT-related methods for small problems.

    • Bernhard Schmitzer’s Optimal Transport toolbox, which provides a reference multiscale solver for the OT problem, on the CPU.


    1-2 Fully funded (4yrs) PhD position on AI/machine learning @ UiT The Arctic University of Norway

    1-2 Fully funded (4yrs) PhD position on AI/machine learning with the Department of Computer Science, UiT The Arctic University of Norway.

    Application Link -

    Deadline - 18th October 2020
    Location- Tromsø, Norway

    These positions require a Master’s degree or equivalent in Computer Science, or Mathematics and Computing. In addition, the candidates must have:

    Experience of working with computer vision and deep learning toolkits on at least one of the following platforms – Python, C/C++, MATLAB, Keras, PyTorch, TensorFlow

    Demonstration of programming proficiency in at least two of the following platforms: Python, C/C++, MATLAB, OpenCV, etc.

    Postgraduate coursework or master thesis strongly related to at least four of the following topics:

    • Machine learning/deep learning
    • Computer vision
    • Optimization theory/ convex optimization/computational optimization
    • Linear algebra
    • Statistics/statistical machine learning
    • Computational modelling of differential and integral equations
    • Data science
    • GPU programming
    • Neural networks
    • Distributed learning/extreme learning

    Your application must include:
    Cover letter explaining your motivation and research interests
    CV - summarizing education, positions and academic work
    Diplomas and transcripts from completed Bachelor’s and Master’s degrees
    Documentation of English proficiency
    1-3 references with contact details
    Master thesis, and any other academic works
    Documentation has to be in English or a Scandinavian language. We only accept applications through Jobbnorge.

    Remuneration -
    approx. 48,000 Euro per annum (Remuneration of the PhD position is in State salary scale code 1017. A compulsory contribution of 2% to the Norwegian Public Service Pension Fund will be deducted.)

    VirtualStain is a project funded under a thematic call for strategic funding by UiT The Arctic University of Norway. It involves developing AI solutions for the segmentation, identity allocation, and modeling of the processes of sub-cellular structures such as mitochondria in cells, and of cellular structures in tissues, using label-free images and videos of cells and tissues. Interpreting life processes from label-free images of cells and tissues is a daunting task. The PhD students will work on the following problem:

    Images of unlabeled samples appear as grayscale images devoid of color, texture, and edges. They therefore lack the features conventionally used in deep models for the identification of individual structures. New, suitably designed and trained intelligence models have to be developed specific to the chosen label-free imaging technology. If conventional AI approaches such as deep learning and generative networks are used, large training datasets with correlated image sets of labeled and label-free images are needed, which is a significant challenge. There is a need for new out-of-the-box AI solutions that derive and improve intelligence as new data becomes available.

    Project page -




    • Introduction and presentation of the conferences by Frederic Barbaresco. VIDEO

    • Presentation of Geometric Sciences of Information and GSI 2021 by Frederic Barbaresco. VIDEO

    LECTURES (90 min)

    1. Langevin Dynamics

    • 1.1 Langevin Dynamics: old and new: Eric Moulines. Part 1: introduction to Markov chain Monte Carlo methods VIDEO, Part 2 VIDEO

    2. Computational Information Geometry:

    • 2.1. Information Manifold modeled with Orlicz Spaces : Giovanni Pistone . VIDEO

    • 2.2. Recent contributions to Distances and Information Geometry: a computational viewpoint : Frank Nielsen . VIDEO - SLIDES

    3. Non-Equilibrium Thermodynamic Geometry

    • 3.1. A variational perspective of closed and open systems: François Gay-Balmaz
    • 3.2. Geometry of Non-Equilibrium Thermodynamics: a homogeneous Symplectic approach : Arjan Van Der Schaft . VIDEO- SLIDES

    4. Geometric Mechanics

    • 4.1. Galilean Mechanics and Thermodynamics of continua : Géry de Saxcé. VIDEO - SLIDES

    • 4.2. Souriau-Casimir Lie Groups Thermodynamics and Machine Learning : Frederic Barbaresco. VIDEO - SLIDES

    5. "Structure des Systèmes Dynamiques" (SSD) Jean-Marie Souriau’s book 50th Birthday Wikipedia page

    • 5.1. The Souriau family and the "structure of motion": Jean-Marie Souriau, Michel Souriau, Paul Souriau and Etienne Souriau: Frederic Barbaresco. VIDEO - SLIDES

    • 5.2. SSD Jean-Marie Souriau’s book 50th birthday: Géry de Saxcé SLIDES

    KEYNOTES (60 min)

    • Learning Physics from Data : Francisco Chinesta . VIDEO - SLIDES

    • Information Geometry and Integrable Systems : Jean-Pierre Françoise. VIDEO - SLIDES

    • Learning with Few Labeled Data : Pratik Chaudhari . VIDEO - SLIDES

    • Information Geometry and Quantum Fields : Kevin Grosvenor SLIDES

    • Port Thermodynamic Systems Control : Bernhard Maschke . VIDEO - SLIDES

    • Dirac Structures in Nonequilibrium Thermodynamics : Hiroaki Yoshimura . VIDEO - SLIDES

    • Thermodynamic efficiency implies predictive inference : Susanne Still . VIDEO - SLIDES

    • Computational dynamics of reduced coupled multibody-fluid system in Lie group setting : Zdravko Terze . VIDEO - SLIDES

    • Exponential Family by Representation Theory : Koichi Tojo . VIDEO - SLIDES

    • Deep Learning as Optimal Control Problems and Structure Preserving Deep Learning : Elena Celledoni . VIDEO - SLIDES

    • Contact geometry and thermodynamical systems : Manuel de León. VIDEO - SLIDES

    • Diffeological Fisher Metric : Hông Vân Lê. VIDEO - SLIDES

    • Mechanics of the probability simplex : Luigi Malagò. VIDEO - SLIDES

    • Covariant Momentum Map Thermodynamics : Goffredo Chirco. VIDEO - SLIDES

    • Sampling and statistical physics via symmetry : Steve Huntsman. VIDEO - SLIDES

    • Geometry of Measure-preserving Flows and Hamiltonian Monte Carlo : Alessandro Barp. VIDEO - SLIDES

    • Schroedinger's problem, Hamilton-Jacobi-Bellman equations and regularized Mass Transportation : Jean-Claude Zambrini. VIDEO - SLIDES


    PDF of posters:

    • Viscoelastic flows of Maxwell fluids with conservation laws - Sébastien Boyaval - POSTER
    • Bayesian Inference on Local Distributions of Functions and Multi-dimensional Curves with Spherical HMC Sampling - Anis Fradi and Chafik Samir - POSTER
    • Material modeling via Thermodynamics-based Artificial Neural Networks - Filippo Masi, Ioannis Stefanou, Paolo Vannucci, Victor Maffi-Berthier - POSTER
    • Learning the Low-Dimensional Geometry of the Wireless Channel - Paul Ferrand, Alexis Decurninge, Luis Garcia Ordóñez and Maxime Guillaud - POSTER
    • A Hyperbolic approach for learning communities on graphs - Hatem Hajri, Thomas Gerald and Hadi Zaatiti - POSTER
    • Hard Shape-Constrained Kernel Regression - Pierre-Cyril Aubin-Frankowski and Zoltán Szabó - POSTER
    • Constraint-Based Regularization of Neural Networks - Benedict Leimkuhler, Timothée Pouchon, Tiffany Vlaar, Amos Storkey - POSTER
    • Geomstats: A Python Package for Geometry in Machine Learning and Information Geometry - Nina Miolane, Nicolas Guigui, Alice Le Brigant, Hadi Zaatiti, Christian Shewmake, Hatem Hajri, Johan Mathe, Benjamin Hou, Yann Thanwerdas, Stefan Heyder, Olivier Peltre, Niklas Koep, Yann Cabanes, Thomas Gerald, Paul Chauchat, Daniel Brooks, Bernhard Kainz, Claire Donnat, Susan Holmes, Xavier Pennec - POSTER
    • Fast High-order Tensor Learning Based on Grassmann Manifold - O.KARMOUDA, R.BOYER and J.BOULANGER - POSTER
    • A Geometric Interpretation of Stochastic Gradient Descent in Deep Learning and Boltzmann Machines - Rita Fioresi and Pratik Chaudhari - POSTER
    • Lagrangian and Hamiltonian Dynamics on the Simplex - Goffredo Chirco, Luigi Malago, Giovanni Pistone - POSTER
    • Calibrating Bayesian Neural Networks with Alpha-divergences and Normalizing Flows - Hector J. Hortua, Luigi Malago and Riccardo Volpi - POSTER



    Registration payment:

    The registration fee for the Summer Week is 450 euros, including catering (bedroom and 3 meals a day for 5 days) and all accommodation on site.
    Registration will be paid at the Les Houches reception desk on arrival, by credit card (or by VAD payment from your lab).
    Fees for any registration canceled less than two weeks before the arrival date will still be due.


    Arrival is on Sunday July 26th, starting from 3:00 pm. On the day of arrival, only the evening meal is served. On Sunday, the secretariat is open from 6:00 pm to 7:30 pm. The Summer Week will close on Friday July 31st at 4:00 pm.

    Access to Les Houches:
    Ecole de Physique des Houches, 149 Chemin de la Côte, F-74310 Les Houches, France. Les Houches is a village located in the Chamonix valley, in the French Alps. Established in 1951, the Physics School is situated at 1150 m above sea level in natural surroundings, with breathtaking views of the Mont-Blanc mountain range.


    Wednesday afternoon is free. Excursions could be organized to:

    · The Mer de Glace (Sea of Ice): the largest glacier in France, 7 km long and 200 m deep, and one of the biggest attractions in the Chamonix valley.

    · L’Aiguille du Midi: from its height of 3,777 m, the Aiguille du Midi and its laid-out terraces offer a 360° view of the French, Swiss and Italian Alps. A lift brings you to the summit terrace at 3,842 m, where you will have a clear view of Mont Blanc.

