Group Details
Geometric Science of Information
The objective of this group is to bring together pure and applied mathematicians, physicists and engineers with a common interest in geometric tools and their applications. It notably aims to organise conferences and seminars, to promote collaborative local, European and international research projects, and to disseminate research results in the related domains.

GeoSciInfo
Statistics, Information and Topology
Co-chairs of the session:
 Michel N'Guiffo Boyom: Université Toulouse
 Pierre Baudot: Median
This session will focus on advances of information theory, probability and statistics in Algebraic Topology (see [1–56] below). The field is currently undergoing an impressive development, both on the side of the categorical, homotopical and topos foundations of probability theory and statistics, and on the side of the characterisation of information functions in cohomology and homotopy theory.
Bibliographical references: (to be completed)
[1] Cencov, N.N. Statistical Decision Rules and Optimal Inference. Translations of Mathematical Monographs. 1982.
[2] Ay, N. and Jost, J. and Lê, H.V. and Schwachhöfer, L. Information geometry and sufficient statistics. Probability Theory and Related Fields 2015 PDF
[3] Cathelineau, J. Sur l'homologie de sl2 à coefficients dans l'action adjointe, Math. Scand., 63, 51–86, 1988. PDF
[4] Kontsevich, M. The 1+1/2 logarithm. Unpublished note, reproduced in Elbaz-Vincent & Gangl (2002), 1995. PDF
[5] Elbaz-Vincent, P., Gangl, H. On poly(ana)logs I. Compositio Mathematica, 130(2), 161–214. 2002. PDF
[6] Tomasic, I., Independence, measure and pseudofinite fields. Selecta Mathematica, 12, 271–306. arXiv. 2006.
[7] Connes, A., Consani, C., Characteristic 1, entropy and the absolute point. preprint arXiv:0911.3537v1. 2009.
[8] Marcolli, M. & Thorngren, R. Thermodynamic Semirings, doi:10.4171/JNCG/159, arXiv:1108.2874, 2011.
[9] Abramsky, S., Brandenburger, A., The Sheaf-theoretic structure of non-locality and contextuality, New Journal of Physics, 13 (2011). PDF
[10] Gromov, M. In a Search for a Structure, Part 1: On Entropy, unpublished manuscript, 2013. PDF
[11] McMullen, C.T., Entropy and the clique polynomial, 2013. PDF
[12] Marcolli, M. & Tedeschi, N. Entropy algebras and Birkhoff factorization, arXiv, Vol. abs/1108.2874, 2014.
[13] Doering, A., Isham, C.J., Classical and Quantum Probabilities as Truth Values, arXiv:1102.2213, 2011 PDF
[14] Baez, J., Fritz, T. & Leinster, T. A Characterization of Entropy in Terms of Information Loss. Entropy, 13, 1945–1957, 2011. PDF
[15] Baez, J.C., Fritz, T. A Bayesian characterization of relative entropy. Theory and Applications of Categories, Vol. 29, No. 16, pp. 422–456. 2014. PDF
[16] Drummond-Cole, G.C., Park, J.-S., Terilla, J., Homotopy probability theory I. J. Homotopy Relat. Struct. November 2013. PDF
[17] Drummond-Cole, G.C., Park, J.-S., Terilla, J., Homotopy probability theory II. J. Homotopy Relat. Struct. April 2014. PDF
[18] Burgos Gil J.I., Philippon P., Sombra M., Arithmetic geometry of toric varieties. Metrics, measures and heights, Astérisque 360. 2014. PDF
[19] Gromov, M. Symmetry, probability, entropy. Entropy 2015. PDF
[20] Gromov, M. Morse Spectra, Homology Measures, Spaces of Cycles and Parametric Packing Problems, April 2015. PDF
[21] Park, J.-S., Homotopy theory of probability spaces I: classical independence and homotopy Lie algebras. arXiv. 2015
[22] Baudot P., Bennequin D. The homological nature of entropy. Entropy, 17, 166; 2015. PDF
[23] Elbaz-Vincent, P., Gangl, H., Finite polylogarithms, their multiple analogues and the Shannon entropy. (2015) Vol. 9389 Lecture Notes in Computer Science, 277–285, arXiv.
[24] M. Marcolli, Information algebras and their applications. International Conference on Geometric Science of Information (2015), 271–276
[25] M. Nguiffo Boyom, Foliations-Webs-Hessian Geometry-Information Geometry-Entropy and Cohomology. Entropy 18(12): 433 (2016) PDF
[26] M. Nguiffo Boyom, A. Zeglaoui, Amari Functors and Dynamics in Gauge Structures. GSI 2017: 170–178
[27] G.C. Drummond-Cole, J. Terilla, Homotopy probability theory on a Riemannian manifold and the Euler equation, New York Journal of Mathematics, Volume 23 (2017) 1065–1085. PDF
[28] P. Forré, J.M. Mooij. Constraint-based Causal Discovery for Non-Linear Structural Causal Models with Cycles and Latent Confounders. In A. Globerson, & R. Silva (Eds.) (2018), pp. 269–278
[29] T. Fritz and P. Perrone, Bimonoidal Structure of Probability Monads. Proceedings of MFPS 34, ENTCS, (2018). PDF
[30] Jae-Suk Park, Homotopical Computations in Quantum Field Theory, (2018) arXiv:1810.09100 PDF
[31] G.C. Drummond-Cole, An operadic approach to operator-valued free cumulants. Higher Structures (2018) 2, 42–56. PDF
[32] G.C. Drummond-Cole, A noncrossing word cooperad for free homotopy probability theory. MATRIX Book Series (2018) 1, 77–99. PDF
[33] T. Fritz and P. Perrone, A Probability Monad as the Colimit of Spaces of Finite Samples, Theory and Applications of Categories 34, 2019. PDF.
[34] M. Esfahanian, A new quantum probability theory, quantum information functor and quantum gravity. (2019) PDF
[35] T. Leinster, Entropy modulo a prime, (2019) arXiv:1903.06961 PDF
[36] T. Leinster, E. Roff, The maximum entropy of a metric space, (2019) arXiv:1908.11184 PDF
[37] T. Mainiero, Homological Tools for the Quantum Mechanic. arXiv 2019, arXiv:1901.02011. PDF
[38] M. Marcolli, Motivic information, Bollettino dell'Unione Matematica Italiana (2019) 12 (1-2), 19–41
[39] J.P. Vigneaux, Information theory with finite vector spaces, IEEE Transactions on Information Theory, vol. 65, no. 9, pp. 5674–5687, Sept. (2019)
[40] Baudot P., Tapia M., Bennequin D., Goaillard J.M., Topological Information Data Analysis. (2019), Entropy, 21(9), 869
[41] Baudot P., The Poincaré-Shannon Machine: Statistical Physics and Machine Learning Aspects of Information Cohomology. (2019), Entropy, 21(9)
[42] G. Sergeant-Perthuis, Bayesian/Graphoid intersection property for factorisation models, (2019), arXiv:1903.06026
[43] J.P. Vigneaux, Topology of Statistical Systems: A Cohomological Approach to Information Theory, PhD Thesis (2019).
[44] Forré, P., & Mooij, J. M. (2019). Causal Calculus in the Presence of Cycles, Latent Confounders and Selection Bias. In A. Globerson, & R. Silva (Eds.), Proceedings of the Thirty-Fifth Conference on Uncertainty in Artificial Intelligence: UAI 2019, (2019)
[45] Y. Manin, M. Marcolli Homotopy Theoretic and Categorical Models of Neural Information Networks. arXiv (2020) preprint arXiv:2006.15136
[46] T. Leinster The categorical origins of Lebesgue integration (2020) arXiv:2011.00412 PDF
[47] T. Fritz, T. Gonda, P. Perrone, E. Fjeldgren Rischel, Representable Markov Categories and Comparison of Statistical Experiments in Categorical Probability. (2020) arXiv:2010.07416 PDF
[48] T. Fritz, E. Fjeldgren Rischel, Infinite products and zeroone laws in categorical probability (2020) arXiv:1912.02769 PDF
[49] T. Fritz, A synthetic approach to Markov kernels, conditional independence and theorems on sufficient statistics (2020) arXiv:1908.07021 PDF
[50] T. Fritz and P. Perrone, Stochastic Order on Metric Spaces and the Ordered Kantorovich Monad, Advances in Mathematics 366, 2020. PDF
[51] T. Fritz and P. Perrone, Monads, partial evaluations, and rewriting. Proceedings of MFPS 36, ENTCS, 2020. PDF.
[52] D. Bennequin, G. Sergeant-Perthuis, O. Peltre, and J.P. Vigneaux, Extra-fine sheaves and interaction decompositions, (2020) arXiv:2009.12646
[53] J.P. Vigneaux, Information structures and their cohomology, Theory and Applications of Categories, Vol. 35, (2020), No. 38, pp. 1476–1529.
[54] O. Peltre, Message-Passing Algorithms and Homology, PhD Thesis (2020), arXiv:2009.11631
[55] G. Sergeant-Perthuis, Interaction decomposition for presheafs, (2020) arXiv:2008.09029
[56] K. Hess, Topological adventures in neuroscience, in the Proceedings of the 2018 Abel Symposium: Topological Data Analysis, Springer Verlag, (2020).
[57] C. Curto, N. Youngs. Neural ring homomorphisms and maps between neural codes. Submitted. arXiv.org preprint.
[58] N.C. Combe, Y. Manin, F-manifolds and geometry of information, arXiv:2004.08808v2, (2020) Bull. London Math. Soc.
Topology and geometry in neuroscience
Chairs of the session:
Topics
This session will focus on advances in Algebraic Topology and geometrical methods in neuroscience (see [1–105] below, among many others). The field is currently undergoing an impressive development, coming both:
 from theoretical neuroscience and machine learning, e.g. Graph Neural Networks [30–42], Bayesian geometrical inference [27–29], Message Passing, probability and cohomology [92–95], Information Topology [53–54, 62–66, 96–105] or Networks [83–85, 90–91], and higher-order n-body statistical interactions [67, 74, 94–95, 99, 101];
 from topological data analysis applied to real neural recordings, ranging from subcellular structures [43, 51], genetic or omic expressions [81, 101], spiking dynamics and neural coding [1–25, 45–47, 50–52, 79], to cortical areas (fMRI, EEG) [26, 67–72, 76–80, 84–89], linguistics [54–61] and consciousness [48, 53, 102].
Bibliographical references: (to be completed)
Carina Curto, Nora Youngs and Vladimir Itskov and colleagues:
[1] C. Curto, N. Youngs. Neural ring homomorphisms and maps between neural codes. Submitted. arXiv.org preprint.
[2] C. Curto, J. Geneson, K. Morrison. Fixed points of competitive threshold-linear networks. Neural Computation, in press, 2019. arXiv.org preprint.
[3] C. Curto, A. Veliz-Cuba, N. Youngs. Analysis of combinatorial neural codes: an algebraic approach. Book chapter in Algebraic and Combinatorial Computational Biology. R. Robeva, M. Macaulay (Eds), 2018.
[4] C. Curto, V. Itskov. Combinatorial neural codes. Handbook of Discrete and Combinatorial Mathematics, Second Edition, edited by Kenneth H. Rosen, CRC Press, 2018. pdf
[5] C. Curto, E. Gross, J. Jeffries, K. Morrison, M. Omar, Z. Rosen, A. Shiu, N. Youngs. What makes a neural code convex? SIAM J. Appl. Algebra Geometry, vol. 1, pp. 222–238, 2017. pdf, SIAGA link, and arXiv.org preprint
[6] C. Curto. What can topology tell us about the neural code? Bulletin of the AMS, vol. 54, no. 1, pp. 63–78, 2017. pdf, Bulletin link.
[7] C. Curto, K. Morrison. Pattern completion in symmetric threshold-linear networks. Neural Computation, Vol 28, pp. 2825–2852, 2016. pdf, arXiv.org preprint.
[8] C. Giusti, E. Pastalkova, C. Curto, V. Itskov. Clique topology reveals intrinsic geometric structure in neural correlations. PNAS, vol. 112, no. 44, pp. 13455–13460, 2015. pdf, PNAS link.
[9] C. Curto, A. Degeratu, V. Itskov. Encoding binary neural codes in networks of threshold-linear neurons. Neural Computation, Vol 25, pp. 2858–2903, 2013. pdf, arXiv.org preprint.
[10] K. Morrison, C. Curto. Predicting neural network dynamics via graphical analysis. Book chapter in Algebraic and Combinatorial Computational Biology. R. Robeva, M. Macaulay (Eds), 2018. arXiv.org preprint,
[11] C. Curto, V. Itskov, A. Veliz-Cuba, N. Youngs. The neural ring: an algebraic tool for analyzing the intrinsic structure of neural codes. Bulletin of Mathematical Biology, Volume 75, Issue 9, pp. 1571–1611, 2013. arXiv.org preprint.
[12] C. Curto, V. Itskov, K. Morrison, Z. Roth, J.L. Walker. Combinatorial neural codes from a mathematical coding theory perspective. Neural Computation, Vol 25(7):1891–1925, 2013. arXiv.org preprint.
[13] C. Curto, A. Degeratu, V. Itskov. Flexible memory networks. Bulletin of Mathematical Biology, Vol 74(3):590–614, 2012. arXiv.org preprint.
[14] V. Itskov, C. Curto, E. Pastalkova, G. Buzsaki. Cell assembly sequences arising from spike threshold adaptation keep track of time in the hippocampus. Journal of Neuroscience, Vol. 31(8):2828–2834, 2011.
[15] K.D. Harris, P. Bartho, et al. How do neurons work together? Lessons from auditory cortex. Hearing Research, Vol. 271(1-2), 2011, pp. 37–53.
[16] P. Bartho, C. Curto, A. Luczak, S. Marguet, K.D. Harris. Population coding of tone stimuli in auditory cortex: dynamic rate vector analysis. European Journal of Neuroscience, Vol. 30(9), 2009, pp. 1767–1778.
[17] C. Curto, V. Itskov. Cell groups reveal structure of stimulus space. PLoS Computational Biology, Vol. 4(10): e1000205, 2008.
[18] E. Gross, N.K. Obatake, N. Youngs, Neural ideals and stimulus space visualization, Adv. Appl. Math., 95 (2018), pp. 65–95.
[19] C. Giusti, V. Itskov. A no-go theorem for one-layer feedforward networks. Neural Computation, 26 (11):2527–2540, 2014.
[20] V. Itskov, L.F. Abbott. Capacity of a Perceptron for Sparse Discrimination . Phys. Rev. Lett. 101(1), 2008.
[21] V. Itskov, E. Pastalkova, K. Mizuseki, G. Buzsaki, K.D. Harris. Theta-mediated dynamics of spatial information in hippocampus. Journal of Neuroscience, 28(23), 2008.
[22] V. Itskov, C. Curto, K.D. Harris. Valuations for spike train prediction. Neural Computation, 20(3), 644–667, 2008.
[23] E. Pastalkova, V. Itskov, A. Amarasingham, G. Buzsaki. Internally Generated Cell Assembly Sequences in the Rat Hippocampus. Science 321(5894):1322–1327, 2008.
[24] V. Itskov, A. Kunin, Z. Rosen. Hyperplane neural codes and the polar complex. To appear in the Abel Symposia proceedings, Vol. 15, 2019.
Alexander Ruys de Perez and colleagues:
[25] A. Ruys de Perez, L.F. Matusevich, A. Shiu, Neural codes and the factor complex, Advances in Applied Mathematics 114 (2020).
Sunghyon Kyeong and colleagues:
[26] Sunghyon Kyeong, Seonjeong Park, Keun-Ah Cheon, Jae-Jin Kim, Dong-Ho Song, and Eunjoo Kim, A New Approach to Investigate the Association between Brain Functional Connectivity and Disease Characteristics of Attention-Deficit/Hyperactivity Disorder: Topological Neuroimaging Data Analysis, PLOS ONE, 10 (9): e0137296, DOI: 10.1371/journal.pone.0137296 (2015)
Jonathan Pillow and colleagues:
[27] Aoi MC & Pillow JW (2017). Scalable Bayesian inference for high-dimensional neural receptive fields. bioRxiv 212217; doi: https://doi.org/10.1101/212217
[28] Aoi MC, Mante V, & Pillow JW. (2020). Prefrontal cortex exhibits multidimensional dynamic encoding during decision-making. Nat Neurosci.
[29] Calhoun AJ, Pillow JW, & Murthy M. (2019). Unsupervised identification of the internal states that shape natural behavior. Nature Neuroscience 22:2040–2049.
[30] Dong X, Thanou D, Toni L, et al., 2020, Graph Signal Processing for Machine Learning: A Review and New Perspectives, IEEE Signal Processing Magazine, Vol: 37, ISSN: 1053-5888, Pages: 117–127
Michael Bronstein, Federico Monti, Giorgos Bouritsas and colleagues:
[31] G. Bouritsas, F. Frasca, S. Zafeiriou, M.M. Bronstein, Improving graph neural network expressivity via subgraph isomorphism counting. arXiv (2020) preprint arXiv:2006.09252
[32] M. Bronstein, G. Pennycook, L. Buonomano, T.D. Cannon, Belief in fake news, responsiveness to cognitive conflict, and analytic reasoning engagement, Thinking and Reasoning (2020), ISSN: 1354-6783
[33] X. Dong, D. Thanou, L. Toni, M. Bronstein, P. Frossard, Graph Signal Processing for Machine Learning: A Review and New Perspectives, IEEE Signal Processing Magazine (2020), Vol: 37, Pages: 117–127, ISSN: 1053-5888
[34] Y. Wang, Y. Sun, Z. Liu, S.E. Sarma, M. Bronstein, J.M. Solomon, Dynamic Graph CNN for Learning on Point Clouds, ACM Transactions on Graphics (2020), Vol: 38, ISSN: 0730-0301
[35] M. Bronstein, J. Everaert, A. Castro, J. Joormann, T.D. Cannon, Pathways to paranoia: Analytic thinking and belief flexibility. Behav Res Ther (2019), Vol: 113, Pages: 18–24
[36] G. Bouritsas, S. Bokhnyak, S. Ploumpis, M. Bronstein, S. Zafeiriou, Neural 3D Morphable Models: Spiral Convolutional Networks for 3D Shape Representation Learning and Generation, (2019) IEEE/CVF ICCV 2019, 7212
[37] O. Litany, A. Bronstein, M. Bronstein, A. Makadia et al., Deformable Shape Completion with Graph Convolutional Autoencoders (2018), Pages: 1886–1895, ISSN: 1063-6919
[38] R. Levie, F. Monti, X. Bresson, M. Bronstein, CayleyNets: Graph Convolutional Neural Networks with Complex Rational Spectral Filters, IEEE Transactions on Signal Processing (2018), Vol: 67, Pages: 97–109, ISSN: 1053-587X
[39] F. Monti, K. Otness, M. Bronstein, MotifNet: a motif-based graph convolutional network for directed graphs (2018), Pages: 225–228
[40] F. Monti, M. Bronstein, X. Bresson, Geometric matrix completion with recurrent multi-graph neural networks, Neural Information Processing Systems (2017), Pages: 3700–3710, ISSN: 1049-5258
[41] F. Monti, D. Boscaini, J. Masci, E. Rodola, J. Svoboda, M. Bronstein, Geometric deep learning on graphs and manifolds using mixture model CNNs, (2017) IEEE Conference on Computer Vision and Pattern Recognition, p. 33
[42] M. Bronstein, J. Bruna, Y. LeCun, A. Szlam, P. Vandergheynst et al., Geometric Deep Learning: Going beyond Euclidean data, IEEE Signal Processing Magazine (2017), Vol: 34, Pages: 18–42, ISSN: 1053-5888
Kathryn Hess and colleagues:
[43] L. Kanari, H. Dictus, W. Van Geit, A. Chalimourda, B. Coste, J. Shillcock, K. Hess, and H. Markram, Computational synthesis of cortical dendritic morphologies, bioRxiv (2020) 10.1101/2020.04.15.040410, submitted.
[44] G. Tauzin, U. Lupo, L. Tunstall, J. Burella Pérez, M. Caorsi, A. Medina-Mardones, A. Dassatti, and K. Hess, giotto-tda: a topological data analysis toolkit for machine learning and data exploration, arXiv:2004.02551
[45] E. Mullier, J. Vohryzek, A. Griffa, Y. Alemán-Gómez, C. Hacker, K. Hess, and P. Hagmann, Functional brain dynamics are shaped by connectome n-simplicial organization, (2020) submitted.
[46] M. Fournier, M. Scolamiero, et al., Topology predicts long-term functional outcome in early psychosis, Molecular Psychiatry (2020). https://doi.org/10.1038/s41380-020-0826-1.
[47] K. Hess, Topological adventures in neuroscience, in the Proceedings of the 2018 Abel Symposium: Topological Data Analysis, Springer Verlag, (2020).
[48] A. Doerig, A. Schurger, K. Hess, and M. H. Herzog, The unfolding argument: why IIT and other causal structure theories of consciousness are empirically untestable, Consciousness and Cognition 72 (2019) 49–59.
[49] L. Kanari, S. Ramaswamy, et al., Objective classification of neocortical pyramidal cells, Cerebral Cortex (2019) bhy339, https://doi.org/10.1093/cercor/bhy339.
[50] J.B. Bardin, G. Spreemann, K. Hess, Topological exploration of artificial neuronal network dynamics, Network Neuroscience (2019) https://doi.org/10.1162/netn_a_00080.
[51] L. Kanari, P. Dłotko, M. Scolamiero, R. Levi, J. C. Shillcock, K. Hess, and H. Markram, A topological representation of branching morphologies, Neuroinformatics (2017) doi: 10.1007/s12021-017-9341-1.
[52] M. W. Reimann, M. Nolte, et al., Cliques of neurons bound into cavities provide a missing link between structure and function, Front. Comput. Neurosci., 12 June (2017), doi: 10.3389/fncom.2017.00048.
Mathilde Marcolli, Yuri Manin, and colleagues:
[53] Y. Manin, M. Marcolli Homotopy Theoretic and Categorical Models of Neural Information Networks. arXiv (2020) preprint arXiv:2006.15136
[54] M. Marcolli, Lumen Naturae: Visions of the Abstract in Art and Mathematics, MIT Press (2020)
[55] A. Port, T. Karidi, M. Marcolli, Topological Analysis of Syntactic Structures (2019) arXiv preprint arXiv:1903.05181
[56] M. Marcolli, Motivic information, Bollettino dell'Unione Matematica Italiana (2019) 12 (1-2), 19–41
[57] A. Port, I. Gheorghita, D. Guth, J.M. Clark, C. Liang, S. Dasu, M. Marcolli, Persistent topology of syntax, Mathematics in Computer Science (2018) 12 (1), 33–50
[58] K. Shu, S. Aziz, V.L. Huynh, D. Warrick, M. Marcolli, Syntactic phylogenetic trees, Foundations of Mathematics and Physics One Century After Hilbert (2018), 417–441
[59] K. Shu, A. Ortegaray, R. Berwick, M. Marcolli, Phylogenetics of Indo-European language families via an algebro-geometric analysis of their syntactic structures. arXiv (2018) preprint arXiv:1712.01719
[60] K. Shu, M. Marcolli, Syntactic structures and code parameters, Mathematics in Computer Science (2018) 11 (1), 79–90
[61] K. Siva, J. Tao, M. Marcolli. Syntactic Parameters and Spin Glass Models of Language Change, Linguist. Anal. (2017) 41 (3-4), 559–608
[62] M. Marcolli, N. Tedeschi, Entropy algebras and Birkhoff factorization. Journal of Geometry and Physics (2015) 97, 243–265
[63] M. Marcolli, Information algebras and their applications. International Conference on Geometric Science of Information (2015), 271–276
[64] K. Siva, J. Tao, M. Marcolli, Spin glass models of syntax and language evolution, arXiv preprint (2015) arXiv:1508.00504
[65] Y. Manin, M. Marcolli, Kolmogorov complexity and the asymptotic bound for error-correcting codes, Journal of Differential Geometry (2014) 97 (1), 91–108
[66] M. Marcolli, R. Thorngren, Thermodynamic semirings, arXiv preprint (2011) arXiv:1108.2874
Bosa Tadić and colleagues:
[67] M. Andjelkovic, B. Tadic, R. Melnik, The topology of higher-order complexes associated with brain-function hubs in human connectomes, available at arxiv.org/abs/2006.10357, published in Scientific Reports 10:17320 (2020)
[68] B. Tadic, M. Andjelkovic, M. Suvakov, G.J. Rodgers, Magnetisation Processes in Geometrically Frustrated Spin Networks with Self-Assembled Cliques, Entropy 22(3), 336 (2020)
[69] B. Tadic, M. Andjelkovic, R. Melnik, Functional Geometry of Human Connectomes, Nature Scientific Reports 9:12060 (2019); previous version: Functional Geometry of Human Connectome and Robustness of Gender Differences, arXiv preprint arXiv:1904.03399, April 6, 2019
[70] B. Tadic, M. Andjelkovic, M. Suvakov, Origin of hyperbolicity in brain-to-brain coordination networks, Frontiers in Physics, vol. 6, doi: 10.3389/fphy.2018.00007 (2018) OA
[71] B. Tadic, M. Andjelkovic, Algebraic topology of multi-brain graphs: Methods to study the social impact and other factors onto functional brain connections, in Proceedings of BELBI (2016)
[72] B. Tadic, M. Andjelkovic, B.M. Boskoska, Z. Levnajic, Algebraic Topology of Multi-Brain Connectivity Networks Reveals Dissimilarity in Functional Patterns during Spoken Communications, PLOS ONE Vol 11(11), e0166787 (2016)
[73] M. Mitrovic and B. Tadic, Search for Weighted Subgraphs on Complex Networks with MLM, Lecture Notes in Computer Science, Vol. 5102, pp. 551–558 (2008)
Giovanni Petri, Francesco Vaccarino and collaborators:
[74] F. Battiston, G. Cencetti, et al., Networks beyond pairwise interactions: structure and dynamics, Physics Reports (2020), arXiv:2006.01764
[75] M. Guerra, A. De Gregorio, U. Fugacci, G. Petri, F. Vaccarino, Homological scaffold via minimal homology bases. arXiv (2020) preprint arXiv:2004.11606
[76] J. Billings, R. Tivadar, M.M. Murray, B. Franceschiello, G. Petri, Topological Features of Electroencephalography are Reference-Invariant, bioRxiv 2020
[77] J. Billings, M. Saggar, S. Keilholz, G. Petri, Topological Segmentation of Time-Varying Functional Connectivity Highlights the Role of Preferred Cortical Circuits, bioRxiv 2020
[78] E. Ibáñez-Marcelo, L. Campioni, et al., Topology highlights mesoscopic functional equivalence between imagery and perception: The case of hypnotizability. NeuroImage (2019) 200, 437–449
[79] P. Expert, L.D. Lord, M.L. Kringelbach, G. Petri. Topological neuroscience. Network Neuroscience (2019) 3 (3), 653–655
[80] C. Geniesse, O. Sporns, G. Petri, M. Saggar, Generating dynamical neuroimaging spatiotemporal representations (DyNeuSR) using topological data analysis. Network Neuroscience (2019) 3 (3), 763–778
[81] A. Patania, P. Selvaggi, M. Veronese, O. Dipasquale, P. Expert, G. Petri, Topological gene expression networks recapitulate brain anatomy and function. Network Neuroscience (2019) 3 (3), 744–762
[82] E. Ibáñez-Marcelo, L. Campioni, et al. Spectral and topological analyses of the cortical representation of the head position: Does hypnotizability matter? Brain and Behavior (2018) 9 (6), e01277
[83] G. Petri, A. Barrat, Simplicial activity driven model, Physical Review Letters (2018) 121 (22), 228301
[84] A. Phinyomark, E. Ibanez-Marcelo, G. Petri. Resting-state fMRI functional connectivity: Big data preprocessing pipelines and topological data analysis. IEEE Transactions on Big Data (2017) 3 (4), 415–428
[85] G. Petri, S. Musslick, B. Dey, K. Ozcimder, D. Turner, N.K. Ahmed, T. Willke. Topological limits to parallel processing capability of network architectures. arXiv preprint (2017) arXiv:1708.03263
[86] K. Ozcimder, B. Dey, S. Musslick, G. Petri, N.K. Ahmed, T.L. Willke, J.D. Cohen, A Formal Approach to Modeling the Cost of Cognitive Control, arXiv preprint (2017) arXiv:1706.00085
[87] L.D. Lord, P. Expert, et al., Insights into brain architectures from the homological scaffolds of functional connectivity networks, Frontiers in Systems Neuroscience (2016) 10, 85
[88] J. Binchi, E. Merelli, M. Rucco, G. Petri, F. Vaccarino. jHoles: A Tool for Understanding Biological Complex Networks via Clique Weight Rank Persistent Homology. Electron. Notes Theor. Comput. Sci. (2014) 306, 5–18
[89] G. Petri, P. Expert, F. Turkheimer, R. Carhart-Harris, D. Nutt, P.J. Hellyer et al., Homological scaffolds of brain functional networks. Journal of The Royal Society Interface (2014) 11 (101), 20140873
[90] G. Petri, M. Scolamiero, I. Donato, F. Vaccarino, Topological strata of weighted complex networks. PloS one (2013) 8 (6), e66506
[91] G. Petri, M. Scolamiero, I. Donato, F. Vaccarino, Networks and cycles: a persistent homology approach to complex networks. Proceedings of the European Conference on Complex Systems (2013), 93–99
Daniel Bennequin, Juan-Pablo Vigneaux, Olivier Peltre, Pierre Baudot and colleagues:
[92] D. Bennequin, G. Sergeant-Perthuis, O. Peltre, and J.P. Vigneaux, Extra-fine sheaves and interaction decompositions, (2020) arXiv:2009.12646
[93] O. Peltre, Message-Passing Algorithms and Homology, PhD Thesis (2020), arXiv:2009.11631
[94] G. Sergeant-Perthuis, Interaction decomposition for presheafs, (2020) arXiv:2008.09029
[95] G. Sergeant-Perthuis, Bayesian/Graphoid intersection property for factorisation models, (2019), arXiv:1903.06026
[96] J.P. Vigneaux, Topology of Statistical Systems: A Cohomological Approach to Information Theory, PhD Thesis (2019).
[97] J.P. Vigneaux, Information structures and their cohomology, Theory and Applications of Categories, Vol. 35, (2020), No. 38, pp. 1476–1529.
[98] J.P. Vigneaux, Information theory with finite vector spaces, IEEE Transactions on Information Theory, vol. 65, no. 9, pp. 5674–5687, Sept. (2019)
[99] Baudot P., Tapia M., Bennequin D., Goaillard J.M., Topological Information Data Analysis. (2019), Entropy, 21(9), 869
[100] Baudot P., The Poincaré-Shannon Machine: Statistical Physics and Machine Learning Aspects of Information Cohomology. (2019), Entropy, 21(9)
[101] Tapia M., Baudot P., et al. Neurotransmitter identity and electrophysiological phenotype are genetically coupled in midbrain dopaminergic neurons. Scientific Reports. (2018). bioRxiv 168740
[102] Baudot P., Elements of qualitative cognition: an Information Topology Perspective. Physics of Life Reviews. (2019) arXiv:1807.04520
[103] Baudot P., Bennequin D., The homological nature of entropy. Entropy, (2015), 17, 166; doi:10.3390
[104] D. Bennequin. Remarks on Invariance in the Primary Visual Systems of Mammals, in Neuromathematics of Vision, Lecture Notes in Morphogenesis, Springer, 2014, pages 243–333.
[105] Baudot P., Bennequin D., Information Topology I and II. Random models in Neuroscience (2012) 
Max WELLING
Informatics Institute, University of Amsterdam and Qualcomm Technologies
https://staff.fnwi.uva.nl/m.welling/
ELLIS Board Member (European Laboratory for Learning and Intelligent Systems: https://ellis.eu/)
Title: Exploring Quantum Statistics for Machine Learning
Abstract: Quantum mechanics represents a rather bizarre theory of statistics that is very different from the ordinary classical statistics that we are used to. In this talk I will explore if there are ways that we can leverage this theory in developing new machine learning tools: can we design better neural networks by thinking about entangled variables? Can we come up with better samplers by viewing them as observations in a quantum system? Can we generalize probability distributions? We hope to develop better algorithms that can be simulated efficiently on classical computers, but we will naturally also consider the possibility of much faster implementations on future quantum computers. Finally, I hope to discuss the role of symmetries in quantum theories.
Reference:
Roberto Bondesan, Max Welling, Quantum Deformed Neural Networks, arXiv:2010.11189v1 [quant-ph], 21 October 2020; https://arxiv.org/abs/2010.11189
Welcome to the “Geometric Science of Information” 2021 Conference
On behalf of both the organizing and the scientific committees, it is our great pleasure to welcome all delegates, representatives and participants from around the world to the fifth International SEE conference on “Geometric Science of Information” (GSI’21), scheduled in July 2021.
GSI’21 benefits from scientific and financial sponsors.
The 3-day conference is also organized in the frame of the relations set up between SEE and scientific institutions or academic laboratories such as Ecole Polytechnique, Ecole des Mines ParisTech, INRIA, CentraleSupélec, Institut Mathématique de Bordeaux, and Sony Computer Science Laboratories.
The GSI conference cycle was initiated by the Brillouin Seminar Team as early as 2009. The GSI’21 event continues the first initiatives launched in 2013 (https://www.see.asso.fr/gsi2013) at Mines ParisTech, consolidated in 2015 (https://www.see.asso.fr/gsi2015) at Ecole Polytechnique, and opened to new communities in 2017 (https://www.see.asso.fr/gsi2017) at Mines ParisTech and in 2019 (https://www.see.asso.fr/gsi2019) at ENAC Toulouse. We mention that in 2011 we organized an Indo-French workshop on “Matrix Information Geometry” that yielded an edited book in 2013, and in 2017 we collaborated on the CIRM seminar in Luminy, TGSI’17 “Topological & Geometrical Structures of Information” (http://forum.csdc.org/category/94/tgsi2017). The GSI’19 proceedings were published by Springer in Lecture Notes (https://www.springer.com/gp/book/9783030269791).
GSI satellite events were organized in 2019 and 2020: FGSI’19 “Foundation of Geometric Science of Information” in Montpellier and the Les Houches seminar SPIGL’20 “Joint Structures and Common Foundations of Statistical Physics, Information Geometry and Inference for Learning”.
The technical program of GSI’21 covers all the main topics and highlights in the domain of “Geometric Science of Information”, including Information Geometry Manifolds of structured data/information and their advanced applications. These proceedings consist solely of original research papers that have been carefully peer-reviewed by two or three experts and revised before acceptance.
Historical background
As for the GSI’13, GSI’15, GSI’17, and GSI’19 GSI'21 addresses interrelations between different mathematical domains like shape spaces (geometric statistics on manifolds and Lie groups, deformations in shape space, ...), probability/optimization & algorithms on manifolds (structured matrix manifold, structured data/Information, ...), relational and discrete metric spaces (graph metrics, distance geometry, relational analysis,...), computational and Hessian information geometry, geometric structures in thermodynamics and statistical physics, algebraic/infinite dimensional/Banach information manifolds, divergence geometry, tensorvalued morphology, optimal transport theory, manifold & topology learning, ... and applications like geometries of audioprocessing, inverse problems and signal/image processing. GSI’21 topics were enriched with contributions from Lie Group Machine Learning, Harmonic Analysis on Lie Groups, Geometric Deep Learning, Geometry of Hamiltonian Monte Carlo, Geometric & (Poly)Symplectic Integrators, Contact Geometry & Hamiltonian Control, Geometric and structure preserving discretizations, Probability Density Estimation & Sampling in High Dimension, Geometry of Graphs and Networks and Geometry in Neuroscience & Cognitive Sciences.
At the turn of the century, new and fruitful interactions were discovered between several branches of science: Information Science (information theory, digital communications, statistical signal processing, ...), Mathematics (group theory, geometry and topology, probability, statistics, sheaf theory, ...) and Physics (geometric mechanics, thermodynamics, statistical physics, quantum mechanics, ...). The GSI conference cycle is an attempt to identify mathematical structures common to all these disciplines by elaborating a “General Theory of Information” embracing physics, information science, and cognitive science in a global scheme.
Frank Nielsen, co-chair: Ecole Polytechnique, Palaiseau, France; Sony Computer Science Laboratories, Tokyo, Japan
Frédéric Barbaresco, co-chair: President of the SEE ISIC Club (Ingénierie des Systèmes d'Information et de Communications),
Representative of the KTD PCC (Key Technology Domain / Processing, Computing & Cognition) Board, THALES LAND & AIR SYSTEMS, France

GeoSciInfo
As with GSI’13, GSI’15, GSI’17 and GSI’19, the objective of this SEE GSI’21 conference, hosted in Paris, is to bring together pure and applied mathematicians and engineers with a common interest in geometric tools and their applications to information analysis.
It emphasizes the active participation of young researchers in discussing emerging areas of collaborative research on “Geometric Science of Information and their Applications”.
Current and ongoing uses of Information Geometry Manifolds in applied mathematics include: Advanced Signal/Image/Video Processing, Complex Data Modeling and Analysis, Information Ranking and Retrieval, Coding, Cognitive Systems, Optimal Control, Statistics on Manifolds, Topology/Machine/Deep Learning, Artificial Intelligence, Speech/Sound Recognition, Natural Language Processing, Big Data Analytics, Learning for Robotics, etc., all of which are substantially relevant for industry.
The conference will therefore cover topics of mutual interest, with the aim to:
• Provide an overview of the most recent state of the art
• Exchange mathematical information/knowledge/expertise in the area
• Identify research areas/applications for future collaboration
Provisional topics of interest:
 Geometric Deep Learning (ELLIS session)
 Probability on Riemannian Manifolds
 Optimization on Manifolds
 Shape Space & Statistics on nonlinear data
 Lie Group Machine Learning
 Harmonic Analysis on Lie Groups
 Statistical Manifold & Hessian Information Geometry
 Monotone Embedding in Information Geometry
 Nonparametric Information Geometry
 Computational Information Geometry
 Distance and Divergence Geometries
 Divergence Statistics
 Optimal Transport & Learning
 Geometry of Hamiltonian Monte Carlo
 Statistics, Information & Topology
 Graph Hyperbolic Embedding & Learning
 Inverse problems: Bayesian and Machine Learning interaction
 Integrable Systems & Information Geometry
 Geometric structures in thermodynamics and statistical physics
 Contact Geometry & Hamiltonian Control
 Geometric and structure preserving discretizations
 Geometric & Symplectic Methods for Quantum Systems
 Geometry of Quantum States
 Geodesic Methods with Constraints
 Probability Density Estimation & Sampling in High Dimension
 Geometry of TensorValued Data
 Geometric Mechanics
 Geometric Robotics & Learning
 Topological and geometrical structures in neurosciences
A special session will deal with:
 Geometric Structures Coding & Learning Libraries (geomstats, pyRiemann, POT, …)
Advanced information on article submission and publication
As with previous editions, the GSI’21 proceedings will be published in Springer LNCS. See the GSI’19 proceedings.
Initial paper submissions must use the 8-page Springer LNCS format.
A detailed call for contributions will be published shortly.
CALL FOR PAPERS 
GeoSciInfo
Ph.D. and Postdoc positions in Applied Mathematics
Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany
Application deadline: January 10th, 2021
The Group
The Chair of Applied Analysis – Alexander von Humboldt Professorship at the Department of Mathematics of the Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU), in Erlangen (Germany), led by Prof. Dr. Enrique Zuazua, is looking for outstanding candidates to fill several
Ph.D. and Postdoctoral positions
In the broad area of Applied Mathematics, the Chair develops and applies methods of Analysis, Computational Mathematics and Data Sciences to model, understand and control the dynamics of various phenomena arising at the interface of Mathematics with Engineering, Physics, Biology and Social Sciences.
We welcome applications by young and highly motivated scientists to contribute to this exciting joint AvH-FAU effort. Possible research projects include but are not limited to:
 Analysis of Partial Differential Equations (PDE).
 The interplay between Data Sciences, numerics of PDE and Control Systems.
 Control of diffusion models arising in Biology and Social Sciences.
 Modelling and control of multiagent systems.
 Hyperbolic models arising in traffic flow and energy transport.
 Waves in networks and Markov chains.
 Fractional PDE.
 Optimal design in Material Sciences.
 Micro-macro limit processes.
 The interplay between discrete and continuous modelling in design and control.
 The emergence of turnpike phenomena in long-time horizons.
 Inversion and parameter identification.
 Recommendation systems.
 Development of new computation tools and software.
We look for excellent candidates with expertise in the areas of applied mathematics, PDE analysis, control theory, numerical analysis, data sciences and computational mathematics who enjoy interdisciplinary work.
The Chair contributes to the development of a new Center of Research at FAU, in the broad area of “Mathematics of Data”, conceived as a highly visible interdisciplinary research site, an incubator for future collaborative research grants and a turntable for the key research priorities of FAU. The recruited candidates will have the added opportunity to participate in this challenging endeavour.
How to apply
Applications, including cover/motivation letter, curriculum vitae, list of publications, statement of research and two or three names of experts for reference should be submitted via email as a single pdf file to secretaryaa[at]math.fau.de before January 10th, 2021.
Any inquiries about the positions should be sent to Prof. Enrique Zuazua (positionsaa[at]math.fau.de). Applications will be accepted until the positions are filled.
FAU is a member of “The Family in Higher Education Institutions” best practice club and also aims to increase the number of women in scientific positions. Female candidates are therefore particularly encouraged to apply. In case of equal qualifications, candidates with disabilities will take precedence.
For more detailed information about the Chair, please visit Chair of Applied Analysis – Alexander von Humboldt Professorship

GeoSciInfo
GeomLoss : Geometric Loss functions between sampled measures, images and volumes
Find all the docs and tutorials for version 0.2.3 on the Read the Docs website:
N.B.: This is still an alpha release! Please send me your feedback: I will polish the user interface, implement Hausdorff divergences, add support for meshes, images, volumes and clean the documentation over the summer of 2020.
The GeomLoss library provides efficient GPU implementations for:

Kernel norms (also known as Maximum Mean Discrepancies).

Hausdorff divergences, which are positive definite generalizations of the ICP loss, analogous to log-likelihoods of Gaussian Mixture Models.

Unbiased Sinkhorn divergences, which are cheap yet positive definite approximations of Optimal Transport (Wasserstein) costs.
These loss functions, defined between positive measures, are available through the custom PyTorch layers SamplesLoss, ImagesLoss and VolumesLoss which allow you to work with weighted point clouds (of any dimension), density maps and volumetric segmentation masks. Geometric losses come with three backends each:

A simple tensorized implementation, for small problems (< 5,000 samples).

A reference online implementation, with a linear (instead of quadratic) memory footprint, that can be used for finely sampled measures.

A very fast multiscale code, which uses an octree-like structure for large-scale problems in dimension <= 3.
GeomLoss is a simple interface for cutting-edge Optimal Transport algorithms. It provides:
 Support for batch-wise computations.
 Linear (instead of quadratic) memory footprint for large problems, relying on the KeOps library for map-reduce operations on the GPU.
 Fast kernel truncation for small bandwidths, using an octree-based structure.
 Log-domain stabilization of the Sinkhorn iterations, eliminating numeric overflows for small values of 𝜀.
 Efficient computation of the gradients, which bypasses the naive backpropagation algorithm.
 Support for unbalanced Optimal Transport, with a softening of the marginal constraints through a maximum reach parameter.
 Support for the ε-scaling heuristic in the Sinkhorn loop, with kernel truncation in dimensions 1, 2 and 3. On typical 3D problems, our implementation is 50-100 times faster than the standard SoftAssign/Sinkhorn algorithm.
Note, however, that SamplesLoss does not implement the Fast Multipole or Fast Gauss transforms. If you are aware of a well-packaged implementation of these algorithms on the GPU, please contact me!
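The log-domain stabilization mentioned above can be illustrated in a few lines of NumPy. The sketch below is a toy version, not GeomLoss's actual implementation: the function names (`ot_eps`, `sinkhorn_divergence`), the fixed iteration count, and the squared Euclidean cost are assumptions made for the example. The point is that the Sinkhorn updates are applied to the dual potentials f and g through log-sum-exp reductions, so no exponential of a large number is ever computed, even for small ε:

```python
import numpy as np

def logsumexp(A, axis):
    # Numerically stable log-sum-exp reduction along the given axis.
    m = A.max(axis=axis, keepdims=True)
    return np.squeeze(m + np.log(np.exp(A - m).sum(axis=axis, keepdims=True)), axis=axis)

def ot_eps(x, y, a, b, eps=0.5, n_iter=500):
    # Entropic OT cost OT_eps between the weighted point clouds (x, a) and
    # (y, b), with the cost C_ij = |x_i - y_j|^2 / 2.
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=2) / 2
    f, g = np.zeros(len(a)), np.zeros(len(b))
    for _ in range(n_iter):
        # Log-domain Sinkhorn updates on the dual potentials f and g:
        # everything stays in the log domain, hence no numeric overflow.
        f = -eps * logsumexp((g[None, :] - C) / eps + np.log(b)[None, :], axis=1)
        g = -eps * logsumexp((f[:, None] - C) / eps + np.log(a)[:, None], axis=0)
    # At convergence, the dual objective <a, f> + <b, g> equals OT_eps.
    return a @ f + b @ g

def sinkhorn_divergence(x, y, a, b, eps=0.5):
    # Unbiased (debiased) Sinkhorn divergence: symmetric, non-negative,
    # and zero when the two measures coincide, unlike the raw cost OT_eps.
    return (ot_eps(x, y, a, b, eps)
            - 0.5 * ot_eps(x, x, a, a, eps)
            - 0.5 * ot_eps(y, y, b, b, eps))
```

The debiasing step in `sinkhorn_divergence` is what makes the quantity a positive definite loss suitable for measure fitting, which is the role played by the "unbiased Sinkhorn divergences" listed above.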
The divergences implemented here are all symmetric, positive definite and therefore suitable for measure-fitting applications. For positive input measures 𝛼 and 𝛽, our Loss functions are such that:
Loss(𝛼,𝛽) = Loss(𝛽,𝛼),
0 = Loss(𝛼,𝛼) ⩽ Loss(𝛼,𝛽),
0 = Loss(𝛼,𝛽) ⟺ 𝛼 = 𝛽.
GeomLoss can be used in a wide variety of settings, from shape analysis (LDDMM, optimal transport…) to machine learning (kernel methods, GANs…) and image processing. Details and examples are provided below:
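These three axioms are easy to verify numerically on the simplest loss in the family, a kernel norm (MMD). The NumPy sketch below is illustrative, not GeomLoss code; the helper name `gaussian_mmd2` and the choice of a Gaussian kernel are assumptions for the example. Because the Gaussian kernel is positive definite, the resulting squared norm ‖𝛼 − 𝛽‖²_k satisfies the symmetry, non-negativity and separation properties stated above:

```python
import numpy as np

def gaussian_mmd2(x, y, a, b, sigma=1.0):
    # Squared kernel norm (MMD) between the weighted clouds (x, a), (y, b):
    #   ||alpha - beta||_k^2 = <a, K_xx a> - 2 <a, K_xy b> + <b, K_yy b>,
    # with the Gaussian kernel k(u, v) = exp(-|u - v|^2 / (2 sigma^2)).
    def K(u, v):
        d2 = np.sum((u[:, None, :] - v[None, :, :]) ** 2, axis=2)
        return np.exp(-d2 / (2 * sigma ** 2))
    return a @ K(x, x) @ a - 2 * (a @ K(x, y) @ b) + b @ K(y, y) @ b
```

Swapping (x, a) with (y, b) leaves the value unchanged, the value vanishes when the two clouds coincide, and it is strictly positive otherwise.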
GeomLoss is licensed under the MIT license.
Author and Contributors
Feel free to contact us for any bug report or feature request:
 Jean Feydy
 Pierre Roussillon (extensions to brain tractograms and normal cycles)
Related projects
You may be interested by:

The KeOps library, which provides efficient CUDA routines for point cloud processing, with full PyTorch support.

Rémi Flamary and Nicolas Courty’s Python Optimal Transport library, which provides a reference implementation of OT-related methods for small problems.

Bernhard Schmitzer’s Optimal Transport toolbox, which provides a reference multiscale solver for the OT problem, on the CPU.


GeoSciInfo
1-2 fully funded (4-year) PhD positions on AI/machine learning @ UiT The Arctic University of Norway
1-2 fully funded (4-year) PhD positions on AI/machine learning with the Department of Computer Science, UiT The Arctic University of Norway.
Application Link  https://www.jobbnorge.no/en/availablejobs/job/192788/12phdfellowsincomputerscienceartificialintelligenceforvirtualstainingoflabelfreecellandtissueimages
Deadline  18th October 2020
Location  Tromsø, Norway
Qualification:
These positions require a Master’s degree or equivalent in Computer Science, or Mathematics and Computing. In addition, the candidates must have:
Experience of working with computer vision and deep learning toolkits on at least one of the following platforms – Python, C/C++, MATLAB, Keras, PyTorch, TensorFlow
Demonstration of programming proficiency in at least two of the following platforms: Python, C/C++, MATLAB, OpenCV, etc.
Postgraduate coursework or master thesis strongly related to at least four of the following topics:
 Machine learning/deep learning
 Computer vision
 Optimization theory/convex optimization/computational optimization
 Linear algebra
 Statistics/statistical machine learning
 Computational modelling of differential and integral equations
 Data science
 GPU programming
 Neural networks
 Distributed learning/extreme learning
Requirement:
Your application must include:
Cover letter explaining your motivation and research interests
CV  summarizing education, positions and academic work
Diplomas and transcripts from completed Bachelor’s and Master’s degrees
Documentation of English proficiency
1-3 references with contact details
Master thesis, and any other academic works
Documentation has to be in English or a Scandinavian language. We only accept applications through Jobbnorge.
Remuneration:
approx. 48,000 Euro per annum (remuneration of the PhD position is in State salary scale code 1017; a compulsory contribution of 2% to the Norwegian Public Service Pension Fund will be deducted).
Description:
VirtualStain is a project funded under a thematic call for strategic funding by UiT The Arctic University of Norway. It involves developing AI solutions for the segmentation, identity allocation, and modeling of the processes of subcellular structures, such as mitochondria in cells and cellular structures in tissues, using label-free images and videos of cells and tissues. Interpreting life processes from label-free images of cells and tissues is a daunting task. The PhD students will work on the following problem:
Images of unlabeled samples appear as grayscale images devoid of color, texture, and edges. They therefore lack the features conventionally used in deep models for the identification of individual structures. New, suitably designed and trained intelligence models have to be developed specific to the chosen label-free imaging technology. If conventional AI approaches such as deep learning and generative networks are used, large training datasets with correlated image sets of labeled and label-free images are needed, which is a significant challenge. There is a need for new out-of-the-box AI solutions that derive and improve intelligence as new data becomes available.
Project page  https://en.uit.no/project/virtualstain

GeoSciInfo
INTRODUCTION LECTURES

Introduction and presentation of the conferences by Frederic Barbaresco. VIDEO

Presentation of Geometric Sciences of Information and GSI 2021 by Frederic Barbaresco. VIDEO
LECTURES (90 min)
1. Langevin Dynamics
 1.1 Langevin Dynamics: old and new : Eric Moulines . Part 1 : introduction to Markov chain Monte Carlo Methods VIDEO, Part 2 VIDEO
2. Computational Information Geometry:

2.1. Information Manifold modeled with Orlicz Spaces : Giovanni Pistone . VIDEO

2.2. Recent contributions to Distances and Information Geometry: a computational viewpoint : Frank Nielsen . VIDEO  SLIDES
3. NonEquilibrium Thermodynamic Geometry
 3.1. A variational perspective of closed and open systems: François Gay-Balmaz
 3.2. Geometry of NonEquilibrium Thermodynamics: a homogeneous Symplectic approach : Arjan Van Der Schaft . VIDEO SLIDES
4. Geometric Mechanics

4.1. Galilean Mechanics and Thermodynamics of continua : Géry de Saxcé. VIDEO  SLIDES

4.2. SouriauCasimir Lie Groups Thermodynamics and Machine Learning : Frederic Barbaresco. VIDEO  SLIDES
5. "Structure des Systèmes Dynamiques" (SSD) Jean-Marie Souriau’s book 50th Birthday Wikipedia page

5.1. Souriau Family and "structure of motion": Jean-Marie Souriau, Michel Souriau, Paul Souriau and Etienne Souriau : Frederic Barbaresco . VIDEO  SLIDES

5.2. SSD Jean-Marie Souriau’s book 50th birthday: Géry de Saxcé SLIDES
KEYNOTES (60 min)

Learning Physics from Data : Francisco Chinesta . VIDEO  SLIDES

Information Geometry and Integrable Systems : Jean-Pierre Françoise. VIDEO  SLIDES

Learning with Few Labeled Data : Pratik Chaudhari . VIDEO  SLIDES

Information Geometry and Quantum Fields : Kevin Grosvenor SLIDES

Port Thermodynamic Systems Control : Bernhard Maschke . VIDEO  SLIDES

Dirac Structures in Nonequilibrium Thermodynamics : Hiroaki Yoshimura . VIDEO  SLIDES

Thermodynamic efficiency implies predictive inference : Susanne Still . VIDEO  SLIDES

Computational dynamics of reduced coupled multibodyfluid system in Lie group setting : Zdravko Terze . VIDEO  SLIDES

Exponential Family by Representation Theory : Koichi Tojo . VIDEO  SLIDES

Deep Learning as Optimal Control Problems and Structure Preserving Deep Learning : Elena Celledoni . VIDEO  SLIDES

Contact geometry and thermodynamical systems : Manuel de León. VIDEO  SLIDES

Mechanics of the probability simplex : Luigi Malagò. VIDEO  SLIDES

Covariant Momentum Map Thermodynamics : Goffredo Chirco. VIDEO  SLIDES

Sampling and statistical physics via symmetry : Steve Huntsman. VIDEO  SLIDES

Geometry of Measurepreserving Flows and Hamiltonian Monte Carlo : Alessandro Barp. VIDEO  SLIDES

Schrödinger’s problem, Hamilton-Jacobi-Bellman equations and regularized Mass Transportation : Jean-Claude Zambrini. VIDEO  SLIDES
POSTERS
PDF of posters:
 Viscoelastic flows of Maxwell fluids with conservation laws  Sébastien Boyaval  POSTER
 Bayesian Inference on Local Distributions of Functions and Multidimensional Curves with Spherical HMC Sampling  Anis Fradi and Chafik Samir  POSTER
 Material modeling via Thermodynamicsbased Artificial Neural Networks  Filippo Masi Ioannis Stefanou, Paolo Vannucci, Victor MaffiBerthier  POSTER
 LEARNING THE LOWDIMENSIONAL GEOMETRY OF THE WIRELESS CHANNEL  Paul Ferrand, Alexis Decurninge, Luis Garcia Ordóñez and Maxime Guillaud  POSTER
 A Hyperbolic approach for learning communities on graphs  Hatem Hajri, Thomas Gerald and Hadi Zaatiti  POSTER
 UNSUPERVISED OBJECT DETECTION FOR TRAFFIC SCENE ANALYSIS  Bruno Sauvalle (supervisor: Arnaud de la Fortelle)  POSTER
 Hard ShapeConstrained Kernel Regression  PierreCyril AubinFrankowski and Zoltán Szabó  POSTER
 CONSTRAINTBASED REGULARIZATION OF NEURAL NETWORKS  Benedict Leimkuhler, Timothée Pouchon, Tiffany Vlaar, Amos Storkey  POSTER
 CONNECTING STOCHASTIC OPTIMIZATION WITH SCHRÖDINGER EVOLUTION WITH RESPECT TO NON HERMITIAN HAMILTONIANS  C. Couto, J. Mourão, J.P. Nunes and P. Ribeiro  POSTER
 Geomstats: A Python Package for Geometry in Machine Learning and Information Geometry  Nina Miolane, Nicolas Guigui, Alice Le Brigant, Hadi Zaatiti, Christian Shewmake, Hatem Hajri, Johan Mathe, Benjamin Hou, Yann Thanwerdas, Stefan Heyder, Olivier Peltre, Niklas Koep, Yann Cabanes, Thomas Gerald, Paul Chauchat, Daniel Brooks, Bernhard Kainz, Claire Donnat, Susan Holmes, Xavier Pennec  POSTER
 Fast Highorder Tensor Learning Based on Grassmann Manifold  O.KARMOUDA, R.BOYER and J.BOULANGER  POSTER
 A Geometric Interpretation of Stochastic Gradient Descent in Deep Learning and Boltzmann Machines  Rita Fioresi and Pratik Chaudhari  POSTER
 Lagrangian and Hamiltonian Dynamics on the Simplex  Goffredo Chirco, Luigi Malago, Giovanni Pistone  POSTER
 Calibrating Bayesian Neural Networks with Alphadivergences and Normalizing Flows  Hector J. Hortua, Luigi Malago and Riccardo Volpi  POSTER


GeoSciInfo
Registration payment:
The registration fee for the Summer Week is 450 euros, including catering (bedroom and 3 meals a day over 5 days) and all accommodation on site: https://www.houchesschoolphysics.com/practicalinformation/facilities/ https://www.houchesschoolphysics.com/practicalinformation/yourstay/
Registration will be paid at the Les Houches reception desk on your arrival, by credit card (or a VAD payment from your lab).
Any registration canceled less than two weeks before the arrival date will be charged in full.
Arrival/Departure:
The arrival is Sunday July 26th starting from 3:00 pm. On the day of arrival, only the evening meal is planned. On Sunday, the secretariat is open from 6:00 pm to 7:30 pm. Summer Week will be closed Friday July 31st at 4 pm.
Access to Les Houches:
https://www.houchesschoolphysics.com/practicalinformation/access/
Ecole de Physique des Houches, 149 Chemin de la Côte, F-74310 Les Houches, France
Les Houches is a village located in the Chamonix valley, in the French Alps. Established in 1951, the Physics School is situated at 1150 m above sea level in natural surroundings, with breathtaking views of the Mont-Blanc mountain range: https://houchesschoolphysics.com
Excursion:
Wednesday afternoon is free. An excursion can be organized to:
· The Mer de Glace (Sea of Ice): It is the largest glacier in France, 7 km long and 200m deep and is one of the biggest attractions in the Chamonix Valley: https://www.chamonix.net/english/leisure/sightseeing/merdeglace
· L’Aiguille du Midi: From its height of 3,777 m, the Aiguille du Midi and its laid-out terraces offer a 360° view of all the French, Swiss and Italian Alps. A lift brings you to the summit terrace at 3,842 m, where you will have a clear view of Mont Blanc: https://www.chamonix.com/aiguilledumidistepintothevoid,80,en.html