
Group Details

Geometric Science of Information

The objective of this group is to bring together pure and applied mathematicians, physicists, and engineers with a common interest in geometric tools and their applications. The group aims to organise conferences and seminars, to promote collaborative local, European, and international research projects, and to disseminate research results across the related domains.

  • Geo-Sci-Info



    Company presentation
    Since 2002, Median Technologies has been expanding the boundaries of the identification, interpretation, analysis and reporting of imaging data in the medical world. Our core activity is to develop advanced imaging software solutions and platforms for clinical drug development in oncology, diagnostic support, and cancer patient care. Our software solutions improve the management of cancer patients by helping to better identify pathologies and to develop and select patient-specific therapies (precision medicine).

    The company employs a highly qualified team and leverages its scientific, technical, medical, and regulatory expertise to develop innovative medical imaging analysis software based on Artificial Intelligence, cloud computing and big data. We are driven by core values that define who we are, what we do, the way we do it, and what we, as Median, aspire to:
    • Leading innovation with purpose
    • Committing to quality in all we do
    • Supporting our customers in achieving their goals
    • Always remembering to put the patient first

    Today, we are a team of 130+ people. Most of us are based at our HQ in Sophia Antipolis (French Riviera), and we have a subsidiary in the US and another one in China. Our company is growing in a fulfilling international and multicultural environment.

    Job description
    In the context of our research and development in artificial intelligence applied to medical imaging, we are looking for a Data Science and Machine Learning Research Scientist (M/F).

    Integrated into a multidisciplinary research and development team within the iBiopsy® project, you will work as a scientist on the research and development of innovative medical imaging solutions using machine learning and other AI methods.

    Medical imaging is one of the fastest growing fields in machine learning. We are looking for an enthusiastic, dynamic, and organized Data Scientist with strong ML experience and excellent communication skills who will thrive at the heart of technological innovation.

    o Position under the supervision of the Head of Data Science

    o Responsibilities:

    1. You will apply your AI/ML/Deep Learning knowledge to develop innovative and robust biomarkers using data coming from medical imaging systems such as MRI and CT scanners and other data sources.

    2. Your work will involve research and development of novel machine learning algorithms and systems. Being part of our front-end innovation organization, you will actively scout, keep track of, evaluate, and leverage disruptive technologies, and emerging industrial, academic and technological trends.

    3. You will work closely with iBiopsy’s software development team as well as clinical science team.

    4. In addition, you will transfer technology and share insights and best practices across innovation teams. You will generate intellectual property for the company. You will be expected to author peer-reviewed papers and present results at industry and scientific conferences.

    5. We look to you to build breakthrough AI-enabled imaging solutions leveraging cloud computing, and to apply supervised and unsupervised machine learning techniques to create value from the imaging and clinical data repositories generated by our medical research and pharmaceutical industry partners. These AI-enabled systems and services go beyond image analysis to transform medical practice and drug development.

    Profile required
    o Education: PhD in Mathematics, Computer Science or related fields

    o Main skills and Experience required:
    • Minimum 3 years of relevant work experience in (deep) machine learning
    • Experience with Medical Imaging, CT/MRI, image signatures, large scale visual information retrieval, features selection
    • Relevant experience with Python, DL frameworks (e.g., PyTorch) and standard packages such as scikit-learn, NumPy, SciPy, Pandas
    • Semi-Supervised Learning, Self-supervised Learning, Reinforcement Learning, Adversarial methods.
    • Multimodal feature extraction
    • Author on related research publications / conference papers
    • Strong experience with open-source technologies to accelerate innovation

    • In-depth technical knowledge of AI, deep learning and computer vision
    • Strong fundamental knowledge of statistical data processing, regression techniques, neural networks, decision trees, clustering, pattern recognition, probability theory, stochastic systems, Bayesian inference, statistical techniques and dimensionality reduction

    Additional qualities:
    • Strong interpersonal, communication and presentation skills, as well as the ability to work in a global team
    • Fluent in written and oral English


    posted in Jobs offers - Call for projects
  • Geo-Sci-Info

    Doctoral fellowships: CNRS and the University of Tokyo

    The CNRS and the University of Tokyo will fund doctoral fellowships in the humanities and social sciences, artificial intelligence, quantum science, climate change, and molecular and cellular biology. Application deadline: 22 April 2021.

    To apply:

    posted in Jobs offers - Call for projects
  • Geo-Sci-Info

    We offer in total 8 funded PhD positions associated with the graduate school on "Computational Cognition", at the interface between Cognitive Science, Machine Learning, Computational Neuroscience, and AI.

    The RTG Computational Cognition aims at reintegrating Cognitive Science and Artificial Intelligence. PhD students of the RTG will be educated in both subjects in order to combine the findings of these fields and thus to get a better understanding of human and machine intelligence. Research fields involved in the RTG are Neuroinformatics, NeuroBioPsychology, Bio-Inspired Computer Vision, Knowledge-Based Systems, Cognitive Natural Language Processing & Communication, Cognitive Modeling, Artificial Intelligence, Psycho/Neurolinguistics, Computational Linguistics and Cognitive Computing.
    The RTG focuses on the integration of these two research fields. Further information on the RTG is available online. Detailed information on the core areas of the offered PhD projects can be obtained from the spokesmen of the RTG, Prof. Dr. Gordon Pipa (gpipa[at] and Prof. Dr. Peter König (pkoenig[at]

    The RTG is incorporated into the Cognitive Science PhD program founded in 2002. PhD students of the RTG will take advantage of an interdisciplinary environment, which nevertheless focuses on a common research topic and offers a broad range of methodological synergies between the projects.

    Required Qualifications:
    Applicants are expected to have an academic degree (Master/Diploma), experience in at least one of the domains listed above, proven experience in interdisciplinary work as well as a good command of the English language.

    Osnabrück University is committed to helping working/studying parents balance their family and working lives.

    Osnabrück University seeks to guarantee equality of opportunity for women and men and strives to correct any gender imbalance in its schools and departments.

    If two candidates are equally qualified, preference will be given to the candidate with disability status.

    Applications with the usual documentation should be submitted by e-mail in a single PDF file to the director of the Institute of Cognitive Science, Prof. Dr. Gunther Heidemann (gheidema[at] with a cc to office[at] no later than April 19, 2021.

    For additional information, for example on specific research projects, you can contact the coordinator Gabriela Pipa (gapipa[at]

    Professor and Chair of the Neuroinformatics Department
    Dr. rer. nat. Gordon Pipa
    Institute of Cognitive Science, Room 50/218
    University of Osnabrueck
    Wachsbleiche 27, 49090 Osnabrück, Germany

    tel. +49 (0) 541-969-2277
    fax (private). +49 (0) 5405- 500 80 98
    home office. +49 (0) 5405- 500 90 95
    e-mail: gpipa[at]
    research gate:

    Personal Assistant and Secretary of the Neuroinformatics lab:
    Anna Jungeilges
    Tel. +49 (0)541 969-2390
    Fax +49 (0)541 969-2246
    Email: anna.jungeilges[at]
    visit us on!/CogSciUOS

    posted in Jobs offers - Call for projects
  • Geo-Sci-Info

    Statistics, Information and Topology

    Co-chairs of the session:

    • Michel N'Guiffo Boyom: Université Toulouse
    • Pierre Baudot: Median

    This session will focus on advances in information theory, probability and statistics within Algebraic Topology (see the references below). The field is currently undergoing an impressive development, both on the side of the categorical, homotopical, and topos-theoretic foundations of probability theory and statistics, and on the side of the characterisation of information functions in cohomology and homotopy theory.
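    A one-equation illustration of the cohomological characterisation of information functions mentioned above (cf. [22], [44], [54] below): the chain rule of Shannon entropy can be read as a 1-cocycle condition, where the first variable acts on entropy by conditional averaging:

```latex
% Chain rule of Shannon entropy as a 1-cocycle condition
% (information cohomology; cf. [22], [44], [54]):
H(X \wedge Y) = H(X) + X.H(Y),
\qquad\text{where}\quad
(X.H)(Y) = \sum_{x} p(x)\, H(Y \mid X = x).
```

    Up to a multiplicative constant, entropy is the essentially unique 1-cocycle in this setting, which is the sense in which entropy is "homological" in [22] and [54].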

    Bibliographical references: (to be completed)

    [1] Cencov, N.N. Statistical Decision Rules and Optimal Inference. Translations of Mathematical Monographs. 1982.
    [2] Ay, N. and Jost, J. and Lê, H.V. and Schwachhöfer, L. Information geometry and sufficient statistics. Probability Theory and Related Fields 2015 PDF
    [3] Cathelineau, J. Sur l’homologie de sl2 a coefficients dans l’action adjointe, Math. Scand., 63, 51-86, 1988. PDF
    [4] Kontsevich, M. The 1+1/2 logarithm. Unpublished note, 1995; reproduced in Elbaz-Vincent & Gangl, 2002. PDF
    [5] Elbaz-Vincent, P., Gangl, H. On poly(ana)logs I., Compositio Mathematica, 130(2), 161-214. 2002. PDF
    [6] Tomasic, I., Independence, measure and pseudofinite fields. Selecta Mathematica, 12, 271-306. arXiv. 2006.
    [7] Connes, A., Consani, C., Characteristic 1, entropy and the absolute point. preprint arXiv:0911.3537v1. 2009.
    [8] Marcolli, M. & Thorngren, R. Thermodynamic Semirings, arXiv 10.4171 / JNCG/159, Vol. abs/1108.2874, 2011.
    [9] Abramsky, S., Brandenburger, A., The Sheaf-theoretic structure of non-locality and contextuality, New Journal of Physics, 13 (2011). PDF
    [10] Gromov, M. In a Search for a Structure, Part 1: On Entropy, unpublished manuscript, 2013. PDF
    [11] McMullen, C.T., Entropy and the clique polynomial, 2013. PDF
    [12] Marcolli, M. & Tedeschi, R. Entropy algebras and Birkhoff factorization, arXiv, Vol. abs/1108.2874, 2014.
    [13] Doering, A., Isham, C.J., Classical and Quantum Probabilities as Truth Values, arXiv:1102.2213, 2011 PDF
    [14] Baez, J.; Fritz, T. & Leinster, T. A Characterization of Entropy in Terms of Information Loss Entropy, 13, 1945-1957, 2011. PDF
    [15] Baez J. C.; Fritz, T. A Bayesian characterization of relative entropy. Theory and Applications of Categories, Vol. 29, No. 16, p. 422-456. 2014. PDF
    [16] Drummond-Cole, G.-C., Park, J.-S., Terilla, J., Homotopy probability theory I. J. Homotopy Relat. Struct. November 2013. PDF
    [17] Drummond-Cole, G.-C., Park, J.-S., Terilla, J., Homotopy probability theory II. J. Homotopy Relat. Struct. April 2014. PDF
    [18] Burgos Gil J.I., Philippon P., Sombra M., Arithmetic geometry of toric varieties. Metrics, measures and heights, Astérisque 360. 2014 . PDF
    [19] Gromov, M. Symmetry, probability, entropy. Entropy 2015. PDF
    [20] Gromov, M. Morse Spectra, Homology Measures, Spaces of Cycles and Parametric Packing Problems, April 2015. PDF
    [21] Park, J.-S., Homotopy theory of probability spaces I: classical independence and homotopy Lie algebras. arXiv. 2015
    [22] Baudot P., Bennequin D. The homological nature of entropy. Entropy, 17, 1-66; 2015. PDF
    [23] Elbaz-Vincent, P., Gangl, H., Finite polylogarithms, their multiple analogues and the Shannon entropy. (2015) Vol. 9389 Lecture Notes in Computer Science. 277-285, arXiv.
    [24] M. Marcolli, Information algebras and their applications. International Conference on Geometric Science of Information (2015), 271-276
    [25] Abramsky S., Barbosa R.S., Lal K.K.R., Mansfield, S., Contextuality, Cohomology and Paradox. 2015. arXiv:1502.03097
    [26] M. Nguiffo Boyom, Foliations-Webs-Hessian Geometry-Information Geometry-Entropy and Cohomology. Entropy 18(12): 433 (2016) PDF
    [27] M. Nguiffo Boyom, A. Zeglaoui, Amari Functors and Dynamics in Gauge Structures. GSI 2017: 170-178
    [28] G.-C. Drummond-Cole, J. Terilla, Homotopy probability theory on a Riemannian manifold and the Euler equation, New York Journal of Mathematics, Volume 23 (2017) 1065-1085. PDF
    [29] P. Forré, J.M. Mooij. Constraint-based Causal Discovery for Non-Linear Structural Causal Models with Cycles and Latent Confounders. In A. Globerson, & R. Silva (Eds.) (2018), pp. 269-278
    [30] T. Fritz and P. Perrone, Bimonoidal Structure of Probability Monads. Proceedings of MFPS 34, ENTCS, (2018). PDF
    [31] Jae-Suk Park, Homotopical Computations in Quantum Fields Theory, (2018) arXiv:1810.09100 PDF
    [32] G.C. Drummond-Cole, An operadic approach to operator-valued free cumulants. Higher Structures (2018) 2, 42–56. PDF
    [33] G.C. Drummond-Cole, A non-crossing word cooperad for free homotopy probability theory. MATRIX Book (2018) Series 1, 77–99. PDF
    [34] T. Fritz and P. Perrone, A Probability Monad as the Colimit of Spaces of Finite Samples, Theory and Applications of Categories 34, 2019. PDF.
    [35] M. Esfahanian, A new quantum probability theory, quantum information functor and quantum gravity. (2019) PDF
    [36] T. Leinster, Entropy modulo a prime, (2019) arXiv:1903.06961 PDF
    [37] T. Leinster, E. Roff, The maximum entropy of a metric space, (2019) arXiv:1908.11184 PDF
    [38] T. Mainiero, Homological Tools for the Quantum Mechanic. arXiv 2019, arXiv:1901.02011. PDF
    [39] M. Marcolli, Motivic information, Bollettino dell'Unione Matematica Italiana (2019) 12 (1-2), 19-41
    [40] J.P. Vigneaux, Information theory with finite vector spaces, in IEEE Transactions on Information Theory, vol. 65, no. 9, pp. 5674-5687, Sept. (2019)
    [41] Baudot P., Tapia M., Bennequin, D. , Goaillard J.M., Topological Information Data Analysis. (2019), Entropy, 21(9), 869
    [42] Baudot P., The Poincaré-Shannon Machine: Statistical Physics and Machine Learning aspects of Information Cohomology. (2019), Entropy, 21(9).
    [43] G. Sergeant-Perthuis, Bayesian/Graphoid intersection property for factorisation models, (2019), arXiv:1903.06026
    [44] J.P. Vigneaux, Topology of Statistical Systems: A Cohomological Approach to Information Theory, PhD Thesis (2019).
    [45] Forré, P., & Mooij, J. M. (2019). Causal Calculus in the Presence of Cycles, Latent Confounders and Selection Bias. In A. Globerson, & R. Silva (Eds.), Proceedings of the Thirty-Fifth Conference on Uncertainty in Artificial Intelligence: UAI 2019, (2019)
    [46] Y. Manin, M. Marcolli Homotopy Theoretic and Categorical Models of Neural Information Networks. arXiv (2020) preprint arXiv:2006.15136
    [47] T. Leinster The categorical origins of Lebesgue integration (2020) arXiv:2011.00412 PDF
    [48] T. Fritz, T. Gonda, P. Perrone, E. Fjeldgren Rischel, Representable Markov Categories and Comparison of Statistical Experiments in Categorical Probability. (2020) arXiv:2010.07416 PDF
    [49] T. Fritz, E. Fjeldgren Rischel, Infinite products and zero-one laws in categorical probability (2020) arXiv:1912.02769 PDF
    [50] T. Fritz, A synthetic approach to Markov kernels, conditional independence and theorems on sufficient statistics (2020) arXiv:1908.07021 PDF
    [51] T. Fritz and P. Perrone, Stochastic Order on Metric Spaces and the Ordered Kantorovich Monad, Advances in Mathematics 366, 2020. PDF
    [52] T. Fritz and P. Perrone, Monads, partial evaluations, and rewriting. Proceedings of MFPS 36, ENTCS, 2020. PDF.
    [53] D. Bennequin. G. Sergeant-Perthuis, O. Peltre, and J.P. Vigneaux, Extra-fine sheaves and interaction decompositions, (2020) arXiv:2009.12646
    [54] J.P. Vigneaux, Information structures and their cohomology, in Theory and Applications of Categories, Vol. 35, (2020), No. 38, pp 1476-1529.
    [55] O. Peltre, Message-Passing Algorithms and Homology, PhD Thesis (2020), arXiv:2009.11631
    [56] G. Sergeant-Perthuis, Interaction decomposition for presheafs, (2020) arXiv:2008.09029
    [57] K. Hess, Topological adventures in neuroscience, in the Proceedings of the 2018 Abel Symposium: Topological Data Analysis, Springer Verlag, (2020).
    [58] C. Curto, N. Youngs. Neural ring homomorphisms and maps between neural codes. Submitted. preprint.
    [59] N.C. Combe, Y. Manin, F-manifolds and geometry of information, arXiv:2004.08808v2, (2020) Bull. London MS.
    [60] Abramsky, S. , Classical logic, classical probability, and quantum mechanics 2020 arXiv:2010.13326

    posted in Sessions GSI2021
  • Geo-Sci-Info

    Topology and geometry in neuroscience

    Co-chairs of the session:

    • Giovanni Petri: ISI Foundation
    • Pierre Baudot: Median

    This session will focus on advances in Algebraic Topology and geometrical methods in neuroscience (see [1-105] below, among many others). The field is currently undergoing an impressive development, coming both:

    _ from the theoretical neuroscience and machine learning fields, such as Graph Neural Networks [30-42], Bayesian geometrical inference [27-29], message passing, probability and cohomology [92-95], Information Topology [53-54,62-66,96-105] or Networks [83-85,90-91], and higher-order n-body statistical interactions [67,74,94-95,99,101];

    _ from topological data analysis applied to real neural recordings, ranging from subcellular structures [43,51], genetic or omic expressions [81,101], and spiking dynamics and neural coding [1-25,45-47,50-52,79], to cortical areas (fMRI, EEG) [26,67-72,76-80,84-89], linguistics [54-61] and consciousness [48,53,102].
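    To make the "clique topology" approach of [8] concrete, here is a minimal, stdlib-only sketch (an illustration, not the authors' code; `clique_complex` and `euler_characteristic` are hypothetical helper names): it enumerates the cliques of a small coactivity graph, viewed as the simplices of its clique (flag) complex, and computes the Euler characteristic. TDA pipelines such as giotto-tda [44] do this at scale, over a whole filtration of correlation thresholds, and track persistent homology rather than a single complex.

```python
from itertools import combinations

def clique_complex(nodes, edges):
    """All cliques of a graph: the simplices of its clique (flag) complex."""
    edge_set = {frozenset(e) for e in edges}
    simplices = []
    for k in range(1, len(nodes) + 1):
        for s in combinations(nodes, k):
            # s is a clique iff every pair inside it is an edge
            if all(frozenset(p) in edge_set for p in combinations(s, 2)):
                simplices.append(s)
    return simplices

def euler_characteristic(simplices):
    # chi = sum over simplices of (-1)^dimension, with dimension = len(simplex) - 1
    return sum((-1) ** (len(s) - 1) for s in simplices)

# A 4-cycle has 4 vertices, 4 edges and no triangles: chi = 4 - 4 = 0,
# the Euler characteristic of a circle -- the "hole" the coactivity graph encodes.
ring = clique_complex([0, 1, 2, 3], [(0, 1), (1, 2), (2, 3), (3, 0)])
print(euler_characteristic(ring))  # 0
```

    The brute-force enumeration is exponential in the number of nodes, so it only serves to fix ideas; dedicated TDA libraries use far more efficient complex constructions.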

    Bibliographical references: (to be completed)

    Carina Curto, Nora Youngs and Vladimir Itskov and colleagues:

    [1] C. Curto, N. Youngs. Neural ring homomorphisms and maps between neural codes. Submitted. preprint.
    [2] C. Curto, J. Geneson, K. Morrison. Fixed points of competitive threshold-linear networks. Neural Computation, in press, 2019. preprint.
    [3] C. Curto, A. Veliz-Cuba, N. Youngs. Analysis of combinatorial neural codes: an algebraic approach. Book chapter in Algebraic and Combinatorial Computational Biology. R. Robeva, M. Macaulay (Eds), 2018.
    [4] C. Curto, V. Itskov. Combinatorial neural codes. Handbook of Discrete and Combinatorial Mathematics, Second Edition, edited by Kenneth H. Rosen, CRC Press, 2018. pdf
    [5] C. Curto, E. Gross, J. Jeffries, K. Morrison, M. Omar, Z. Rosen, A. Shiu, N. Youngs. What makes a neural code convex? SIAM J. Appl. Algebra Geometry, vol. 1, pp. 222-238, 2017. pdf, SIAGA link, and preprint
    [6] C. Curto. What can topology tell us about the neural code? Bulletin of the AMS, vol. 54, no. 1, pp. 63-78, 2017. pdf, Bulletin link.
    [7] C. Curto, K. Morrison. Pattern completion in symmetric threshold-linear networks. Neural Computation, Vol 28, pp. 2825-2852, 2016. pdf, preprint.
    [8] C. Giusti, E. Pastalkova, C. Curto, V. Itskov. Clique topology reveals intrinsic geometric structure in neural correlations. PNAS, vol. 112, no. 44, pp. 13455-13460, 2015. pdf, PNAS link.
    [9] C. Curto, A. Degeratu, V. Itskov. Encoding binary neural codes in networks of threshold-linear neurons. Neural Computation, Vol 25, pp. 2858-2903, 2013. pdf, preprint.
    [10] K. Morrison, C. Curto. Predicting neural network dynamics via graphical analysis. Book chapter in Algebraic and Combinatorial Computational Biology. R. Robeva, M. Macaulay (Eds), 2018. preprint,
    [11] C. Curto, V. Itskov, A. Veliz-Cuba, N. Youngs. The neural ring: an algebraic tool for analyzing the intrinsic structure of neural codes. Bulletin of Mathematical Biology, Volume 75, Issue 9, pp. 1571-1611, 2013. preprint.
    [12] C. Curto, V. Itskov, K. Morrison, Z. Roth, J.L. Walker. Combinatorial neural codes from a mathematical coding theory perspective. Neural Computation, Vol 25(7):1891-1925, 2013. preprint.
    [13] C. Curto, A. Degeratu, V. Itskov. Flexible memory networks. Bulletin of Mathematical Biology, Vol 74(3):590-614, 2012. preprint.
    [14] V. Itskov, C. Curto, E. Pastalkova, G. Buzsaki. Cell assembly sequences arising from spike threshold adaptation keep track of time in the hippocampus. Journal of Neuroscience, Vol. 31(8):2828-2834, 2011.
    [15] K.D. Harris, P. Bartho, et al.. How do neurons work together? Lessons from auditory cortex. Hearing Research, Vol. 271(1-2), 2011, pp. 37-53.
    [16] P. Bartho, C. Curto, A. Luczak, S. Marguet, K.D. Harris. Population coding of tone stimuli in auditory cortex: dynamic rate vector analysis. European Journal of Neuroscience, Vol. 30(9), 2009, pp. 1767-1778.
    [17] C. Curto, V. Itskov. Cell groups reveal structure of stimulus space. PLoS Computational Biology, Vol. 4(10): e1000205, 2008.
    [18] E. Gross , N. K. Obatake , N. Youngs, Neural ideals and stimulus space visualization, Adv. Appl.Math., 95 (2018), pp. 65–95.
    [19] C. Giusti, V. Itskov. A no-go theorem for one-layer feedforward networks. Neural Computation, 26 (11):2527-2540, 2014.
    [20] V. Itskov, L.F. Abbott. Capacity of a Perceptron for Sparse Discrimination . Phys. Rev. Lett. 101(1), 2008.
    [21] V. Itskov, E. Pastalkova, K. Mizuseki, G. Buzsaki, K.D. Harris. Theta-mediated dynamics of spatial information in hippocampus. Journal of Neuroscience, 28(23), 2008.
    [22] V. Itskov, C. Curto, K.D. Harris. Valuations for spike train prediction. Neural Computation, 20(3), 644-667, 2008.
    [23] E. Pastalkova, V. Itskov , A. Amarasingham , G. Buzsaki. Internally Generated Cell Assembly Sequences in the Rat Hippocampus. Science 321(5894):1322 - 1327, 2008.
    [24] V. Itskov, A. Kunin, Z. Rosen. Hyperplane neural codes and the polar complex. To appear in the Abel Symposia proceedings, Vol. 15, 2019.

    Alexander Ruys de Perez and colleagues:

    [25] A. Ruys de Perez, L.F. Matusevich, A. Shiu, Neural codes and the factor complex, Advances in Applied Mathematics 114 (2020).

    Sunghyon Kyeong and colleagues:

    [26] Sunghyon Kyeong, Seonjeong Park, Keun-Ah Cheon, Jae-Jin Kim, Dong-Ho Song, and Eunjoo Kim, A New Approach to Investigate the Association between Brain Functional Connectivity and Disease Characteristics of Attention-Deficit/Hyperactivity Disorder: Topological Neuroimaging Data Analysis, PLOS ONE, 10 (9): e0137296, DOI: 10.1371/journal.pone.0137296 (2015)

    Jonathan Pillow and colleagues:

    [27] Aoi MC & Pillow JW (2017). Scalable Bayesian inference for high-dimensional neural receptive fields. bioRxiv 212217; doi:
    [28] Aoi MC, Mante V, & Pillow JW. (2020). Prefrontal cortex exhibits multi-dimensional dynamic encoding during decision-making. Nat Neurosci.
    [29] Calhoun AJ, Pillow JW, & Murthy M. (2019). Unsupervised identification of the internal states that shape natural behavior. Nature Neuroscience 22:2040-2049.
    [30] Dong X, Thanou D, Toni L, et al., 2020, Graph Signal Processing for Machine Learning: A Review and New Perspectives, Ieee Signal Processing Magazine, Vol:37, ISSN:1053-5888, Pages:117-127

    Michael Bronstein, Federico Monti, Giorgos Bouritsas and colleagues:

    [31] G. Bouritsas, F. Frasca, S Zafeiriou, MM Bronstein, Improving graph neural network expressivity via subgraph isomorphism counting. arXiv (2020) preprint arXiv:2006.09252
    [32] M. Bronstein , G. Pennycook, L. Buonomano, T.D. Cannon, Belief in fake news, responsiveness to cognitive conflict, and analytic reasoning engagement, Thinking and Reasoning (2020), ISSN: 1354-6783
    [33] X. Dong, D. Thanou, L. Toni, M. Bronstein, P. Frossard, Graph Signal Processing for Machine Learning: A Review and New Perspectives, IEEE Signal Processing Magazine (2020), Vol: 37, Pages: 117-127, ISSN: 1053-5888
    [34] Y. Wang, Y. Sun, Z. Liu, S.E. Sarma, M. Bronstein, J.M. Solomon, Dynamic Graph CNN for Learning on Point Clouds, ACM Transactions on graphics (2020), Vol: 38, ISSN: 0730-0301
    [35] M. Bronstein, J. Everaert, A. Castro, J. Joormann, T. D. Cannon, Pathways to paranoia: Analytic thinking and belief flexibility., Behav Res Ther (2019), Vol: 113, Pages: 18-24
    [36] G. Bouritsas, S. Bokhnyak, S. Ploumpis, M. Bronstein, S. Zafeiriou, Neural 3D Morphable Models: Spiral Convolutional Networks for 3D Shape Representation Learning and Generation, (2019) IEEE/CVF ICCV 2019, 7212
    [37] O. Litany, A. Bronstein, M. Bronstein, A. Makadia et al., Deformable Shape Completion with Graph Convolutional Autoencoders (2018), Pages: 1886-1895, ISSN: 1063-6919
    [38] R. Levie, F. Monti, X. Bresson X, M. Bronstein, CayleyNets: Graph Convolutional Neural Networks with Complex Rational Spectral Filters, IEEE Transactions on Signal Processing (2018), Vol: 67, Pages: 97-109, ISSN: 1053-587X
    [39] F. Monti, K. Otness, M. Bronstein, Motifnet: a motif-based graph convolutional network for directed graphs (2018), Pages: 225-228
    [40] F. Monti, M. Bronstein, X. Bresson, Geometric matrix completion with recurrent multi-graph neural networks, Neural Information Processing Systems (2017), Pages: 3700-3710, ISSN: 1049-5258
    [41] F. Monti F, D. Boscaini, J. Masci, E. Rodola, J. Svoboda, M. Bronstein, Geometric deep learning on graphs and manifolds using mixture model CNNs, (2017) IEEE Conference on Computer Vision and Pattern Recognition, p: 3-3
    [42] M. Bronstein, J. Bruna, Y. LeCun, A. Szlam, P. Vandergheynst et al., Geometric Deep Learning Going beyond Euclidean data, IEEE Signal Processing Magazine (2017), Vol: 34, Pages: 18-42, ISSN: 1053-5888

    Kathryn Hess and colleagues:

    [43] L. Kanari, H. Dictus, W. Van Geit, A. Chalimourda, B. Coste, J. Shillcock, K. Hess, and H. Markram, Computational synthesis of cortical dendritic morphologies, bioRvix (2020) 10.1101/2020.04.15.040410, submitted.
    [44] G. Tauzin, U. Lupo, L. Tunstall, J. Burella Prez, M. Caorsi, A. Medina-Mardones, A, Dassatti, and K. Hess, giotto-tda: a topological data analysis toolkit for machine learning and data exploration, arXiv:2004.02551
    [45] E. Mullier, J. Vohryzek, A. Griffa, Y. Alemàn-Gómez, C. Hacker, K. Hess, and P. Hagmann, Functional brain dynamics are shaped by connectome n-simplicial organization, (2020) submitted.
    [46] M. Fournier, M. Scolamiero, etal., Topology predicts long-term functional outcome in early psychosis, Molecular Psychiatry (2020).
    [47] K. Hess, Topological adventures in neuroscience, in the Proceedings of the 2018 Abel Symposium: Topological Data Analysis, Springer Verlag, (2020).
    [48] A. Doerig, A. Schurger, K. Hess, and M. H. Herzog, The unfolding argument: why IIT and other causal structure theories of consciousness are empirically untestable, Consciousness and Cognition 72 (2019) 49-59.
    [49] L. Kanari, S. Ramaswamy, et al., Objective classification of neocortical pyramidal cells, Cerebral Cortex (2019) bhy339,
    [50] J.-B. Bardin, G. Spreemann, K. Hess, Topological exploration of artificial neuronal network dynamics, Network Neuroscience (2019)
    [51] L. Kanari, P. Dłotko, M. Scolamiero, R. Levi, J. C. Shillcock, K. Hess, and H. Markram, A topological representation of branching morphologies, Neuroinformatics (2017) doi: 10.1007/s12021-017-9341-1.
    [52] M. W. Reimann, M. Nolte,et al., Cliques of neurons bound into cavities provide a missing link between structure and function, Front. Comput. Neurosci., 12 June (2017), doi: 10.3389/fncom.2017.00048.

    Matilde Marcolli, Yuri Manin, and colleagues:

    [53] Y. Manin, M. Marcolli Homotopy Theoretic and Categorical Models of Neural Information Networks. arXiv (2020) preprint arXiv:2006.15136
    [54] M. Marcolli, Lumen Naturae: Visions of the Abstract in Art and Mathematics, MIT Press (2020)
    [55] A. Port, T. Karidi, M. Marcolli, Topological Analysis of Syntactic Structures (2019) arXiv preprint arXiv:1903.05181
    [56] M. Marcolli, Motivic information, Bollettino dell'Unione Matematica Italiana (2019) 12 (1-2), 19-41
    [57] A. Port, I. Gheorghita, D. Guth, J.M. Clark, C. Liang, S. Dasu, M. Marcolli, Persistent topology of syntax, Mathematics in Computer Science (2018) 12 (1), 33-50 20
    [58] K. Shu, S. Aziz, VL Huynh, D Warrick, M Marcolli, Syntactic phylogenetic trees, Foundations of Mathematics and Physics One Century After Hilbert (2018), 417-441
    [59] K. Shu, A. Ortegaray, R Berwick, M Marcolli Phylogenetics of Indo-European language families via an algebro-geometric analysis of their syntactic structures. arXiv (2018) preprint arXiv:1712.01719
    [60] K. Shu, M. Marcolli, Syntactic structures and code parameters Mathematics in Computer Science (2018) 11 (1), 79-90
    [61] K Siva, J Tao, M Marcolli. Syntactic Parameters and Spin Glass Models of Language Change Linguist. Anal (2017) 41 (3-4), 559-608
    [62] M. Marcolli, N. Tedeschi, Entropy algebras and Birkhoff factorization. Journal of Geometry and Physics (2015) 97, 243-265
    [63] M. Marcolli, Information algebras and their applications. International Conference on Geometric Science of Information (2015), 271-276
    [64] K. Siva, J. Tao, M. Marcolli Spin glass models of syntax and language evolution, arXiv preprint (2015) arXiv:1508.00504
    [65] Y. Manin, M. Marcolli, Kolmogorov complexity and the asymptotic bound for error-correcting codes Journal of Differential Geometry (2014) 97 (1), 91-108
    [66] M. Marcolli, R. Thorngren, Thermodynamic semirings, ArXiv preprint (2011) arXiv:1108.2874

    Bosiljka Tadić and colleagues:

    [67] M. Andjelkovic, B. Tadic, R. Melnik, The topology of higher-order complexes associated with brain-function hubs in human connectomes, Scientific Reports 10:17320 (2020)
    [68] B. Tadic, M. Andjelkovic, M. Suvakov, G.J. Rodgers, Magnetisation Processes in Geometrically Frustrated Spin Networks with Self-Assembled Cliques, Entropy 22(3), 336 (2020)
    [69] B. Tadic, M. Andjelkovic, R. Melnik, Functional Geometry of Human Connectomes, Nature Scientific Reports 9:12060 (2019); previous version: Functional Geometry of Human Connectome and Robustness of Gender Differences, arXiv preprint arXiv:1904.03399, April 6, 2019
    [70] B. Tadic, M. Andjelkovic, M. Suvakov, Origin of hyperbolicity in brain-to-brain coordination networks, FRONTIERS in PHYSICS vol.6, ARTICLE{10.3389/fphy.2018.00007}, (2018) OA
    [71] B. Tadic, M. Andjelkovic, Algebraic topology of multi-brain graphs: Methods to study the social impact and other factors onto functional brain connections, in Proceedings of BELBI (2016)
    [72] B. Tadic, M. Andjelkovic, B.M. Boskoska, Z. Levnajic, Algebraic Topology of Multi-Brain Connectivity Networks Reveals Dissimilarity in Functional Patterns during Spoken Communications, PLOS ONE Vol 11(11), e0166787 (2016)
    [73] M. Mitrovic and B. Tadic, Search for Weighted Subgraphs on Complex Networks with MLM, Lecture Notes in Computer Science, Vol. 5102 pp. 551-558 (2008)

    Giovanni Petri, Francesco Vaccarino and collaborators:

    [74] F. Battiston, G. Cencetti, et al., Networks beyond pairwise interactions: structure and dynamics, Physics Reports (2020), arXiv:2006.01764
    [75] M. Guerra, A. De Gregorio, U. Fugacci, G. Petri, F. Vaccarino, Homological scaffold via minimal homology bases. arXiv (2020) preprint arXiv:2004.11606
    [76] J. Billings, R. Tivadar, M.M. Murray, B. Franceschiello, G. Petri, Topological Features of Electroencephalography are Reference-Invariant, bioRxiv 2020
    [77] J. Billings, M. Saggar, S. Keilholz, G. Petri, Topological Segmentation of Time-Varying Functional Connectivity Highlights the Role of Preferred Cortical Circuits, bioRxiv 2020
    [78] E. Ibáñez-Marcelo, L. Campioni, et al., Topology highlights mesoscopic functional equivalence between imagery and perception: The case of hypnotizability. NeuroImage (2019) 200, 437-449
    [79] P. Expert, L.D. Lord, M.L. Kringelbach, G. Petri. Topological neuroscience. Network Neuroscience (2019) 3 (3), 653-655
    [80] C. Geniesse, O. Sporns, G. Petri, M. Saggar, Generating dynamical neuroimaging spatiotemporal representations (DyNeuSR) using topological data analysis. Network Neuroscience (2019) 3 (3), 763-778
    [81] A. Patania, P. Selvaggi, M. Veronese, O. Dipasquale, P. Expert, G. Petri, Topological gene expression networks recapitulate brain anatomy and function. Network Neuroscience (2019) 3 (3), 744-762
[82] E. Ibáñez-Marcelo, L. Campioni, et al., Spectral and topological analyses of the cortical representation of the head position: Does hypnotizability matter? Brain and Behavior (2018) 9 (6), e01277
[83] G. Petri, A. Barrat, Simplicial activity driven model, Physical Review Letters (2018) 121 (22), 228301
    [84] A. Phinyomark, E. Ibanez-Marcelo, G. Petri. Resting-state fmri functional connectivity: Big data preprocessing pipelines and topological data analysis. IEEE Transactions on Big Data (2017) 3 (4), 415-428
    [85] G. Petri, S. Musslick, B. Dey, K. Ozcimder, D. Turner, N.K. Ahmed, T. Willke. Topological limits to parallel processing capability of network architectures. arXiv preprint (2017) arXiv:1708.03263
    [86] K. Ozcimder, B. Dey, S. Musslick, G. Petri, N.K. Ahmed, T.L. Willke, J.D. Cohen, A Formal Approach to Modeling the Cost of Cognitive Control, arXiv preprint (2017) arXiv:1706.00085
[87] L.D. Lord, P. Expert, et al., Insights into brain architectures from the homological scaffolds of functional connectivity networks, Frontiers in Systems Neuroscience (2016) 10, 85
    [88] J. Binchi, E. Merelli, M. Rucco, G. Petri, F. Vaccarino. jHoles: A Tool for Understanding Biological Complex Networks via Clique Weight Rank Persistent Homology. Electron. Notes Theor. Comput. Sci. (2014) 306, 5-18
    [89] G. Petri, P. Expert, F. Turkheimer, R. Carhart-Harris, D. Nutt, P.J. Hellyer et al., Homological scaffolds of brain functional networks. Journal of The Royal Society Interface (2014) 11 (101), 20140873
    [90] G. Petri, M. Scolamiero, I. Donato, F. Vaccarino, Topological strata of weighted complex networks. PloS one (2013) 8 (6), e66506
[91] G. Petri, M. Scolamiero, I. Donato, et al., Networks and cycles: a persistent homology approach to complex networks, Proceedings of the European Conference on Complex Systems (2013), 93-99

    Daniel Bennequin, Juan-Pablo Vigneaux, Olivier Peltre, Pierre Baudot and colleagues:

[92] D. Bennequin, G. Sergeant-Perthuis, O. Peltre, and J.P. Vigneaux, Extra-fine sheaves and interaction decompositions, (2020) arXiv:2009.12646
    [93] O. Peltre, Message-Passing Algorithms and Homology, PhD Thesis (2020), arXiv:2009.11631
    [94] G. Sergeant-Perthuis, Interaction decomposition for presheafs, (2020) arXiv:2008.09029
    [95] G. Sergeant-Perthuis, Bayesian/Graphoid intersection property for factorisation models, (2019), arXiv:1903.06026
    [96] J.P. Vigneaux, Topology of Statistical Systems: A Cohomological Approach to Information Theory, PhD Thesis (2019).
    [97] J.P. Vigneaux, Information structures and their cohomology, in Theory and Applications of Categories, Vol. 35, (2020), No. 38, pp 1476-1529.
    [98] J.P. Vigneaux, Information theory with finite vector spaces, in IEEE Transactions on Information Theory, vol. 65, no. 9, pp. 5674-5687, Sept. (2019)
[99] Baudot P., Tapia M., Bennequin D., Goaillard J.M., Topological Information Data Analysis, Entropy, 21(9), 869 (2019)
[100] Baudot P., The Poincaré-Shannon Machine: Statistical Physics and Machine Learning Aspects of Information Cohomology, Entropy, 21(9) (2019)
[101] Tapia M., Baudot P., et al., Neurotransmitter identity and electrophysiological phenotype are genetically coupled in midbrain dopaminergic neurons, Scientific Reports (2018), bioRxiv 168740
[102] Baudot P., Elements of qualitative cognition: an Information Topology Perspective, Physics of Life Reviews (2019), arXiv:1807.04520
[103] Baudot P., Bennequin D., The homological nature of entropy, Entropy (2015), 17, 1-66; doi:10.3390
[104] D. Bennequin, Remarks on Invariance in the Primary Visual Systems of Mammals, pages 243-333, Neuromathematics of Vision, Lecture Notes in Morphogenesis, Springer, 2014
[105] Baudot P., Bennequin D., Information Topology I and II, Random models in Neuroscience (2012)

posted in Sessions GSI2021
  • Geo-Sci-Info

Max WELLING
Informatics Institute, University of Amsterdam and Qualcomm Technologies
ELLIS Board Member (European Laboratory for Learning and Intelligent Systems)

    Title: Exploring Quantum Statistics for Machine Learning

    Abstract: Quantum mechanics represents a rather bizarre theory of statistics that is very different from the ordinary classical statistics that we are used to. In this talk I will explore if there are ways that we can leverage this theory in developing new machine learning tools: can we design better neural networks by thinking about entangled variables? Can we come up with better samplers by viewing them as observations in a quantum system? Can we generalize probability distributions? We hope to develop better algorithms that can be simulated efficiently on classical computers, but we will naturally also consider the possibility of much faster implementations on future quantum computers. Finally, I hope to discuss the role of symmetries in quantum theories.

Roberto Bondesan, Max Welling, Quantum Deformed Neural Networks, arXiv:2010.11189v1 [quant-ph], 21 October 2020

    Jean PETITOT
Directeur d'Études, Centre d'Analyse et de Mathématique Sociales, École des Hautes Études en Sciences Sociales, Paris.
    Born in 1944, Jean Petitot is an applied mathematician interested in dynamical modeling in neurocognitive sciences. He is the former director of the CREA (Applied Epistemology Research Center) at the Ecole Polytechnique.

Philosopher of science

Title: The primary visual cortex as a Cartan engine

Abstract: Cortical visual neurons detect very local geometric cues such as retinal positions, local contrasts, local orientations of boundaries, etc. One of the main theoretical problems of low-level vision is to understand how these local cues can be integrated so as to generate the global geometry of the perceived images, with all the well-known phenomena studied since Gestalt theory. There is empirical evidence that the visual brain is able to perform a lot of routines belonging to differential geometry. But how can such routines be neurally implemented? Neurons are « point-like » processors, and it seems impossible to do differential geometry with them. Since the 1990s, methods of in vivo optical imaging based on activity-dependent intrinsic signals have made it possible to visualize the extremely special connectivity of the primary visual areas, their "functional architectures." What we called « neurogeometry » is based on the discovery that these functional architectures implement structures such as the contact structure and the sub-Riemannian geometry of jet spaces of plane curves. For reasons of principle, it is the geometrical reformulation of differential calculus from Pfaff to Lie, Darboux, Frobenius, Cartan and Goursat which turns out to be suitable for neurogeometry.
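As a pointer to the structure named in the abstract (a standard definition from contact geometry, not material specific to the talk): the 1-jet space of plane curves carries a canonical contact structure, which in local coordinates reads

```latex
% 1-jet space J^1(\mathbb{R},\mathbb{R}) with coordinates (x, y, p),
% where p encodes the local orientation dy/dx of a boundary element.
% The canonical contact structure is the kernel of the 1-form
\theta = dy - p\,dx .
% A curve x \mapsto (x, f(x), f'(x)) is a Legendrian lift:
% it annihilates \theta precisely because p = f'(x) along it.
```

In the neurogeometric model, the fibered coordinates (x, y, p) correspond to the position-orientation selectivity of cortical columns, which is what makes this structure implementable by "point-like" processors.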


    • Agrachev, A., Barilari, D., Boscain, U., A Comprehensive Introduction to Sub-Riemannian Geometry, Cambridge University Press, 2020.
    • Citti, G., Sarti, A., A cortical based model of perceptual completion in the roto-translation space, Journal of Mathematical Imaging and Vision, 24, 3 (2006) 307-326.
    • Petitot, J., Neurogéométrie de la vision. Modèles mathématiques et physiques des architectures fonctionnelles, Les Éditions de l'École Polytechnique, Distribution Ellipses, Paris, 2008.
• Petitot, J., “Landmarks for neurogeometry”, Neuromathematics of Vision (G. Citti, A. Sarti, eds), Springer, Berlin, Heidelberg, 1-85, 2014.
• Petitot, J., Elements of Neurogeometry. Functional Architectures of Vision, Lecture Notes in Morphogenesis, Springer, 2017.
• Prandi, D., Gauthier, J.-P., A Semidiscrete Version of the Petitot Model as a Plausible Model for Anthropomorphic Image Reconstruction and Pattern Recognition, 2017.

    Yvette Kosmann-Schwarzbach

Professeur des universités honoraire; former student of the École normale supérieure de Sèvres, 1960-1964; agrégation in mathematics, 1963; CNRS research associate, 1964-1969; doctorate in science, "Lie derivatives of spinors," University of Paris, 1970, under the supervision of André Lichnerowicz; lecturer, then professor, at the University of Lille (1970-1976 and 1982-1993), at Brooklyn College, New York (1979-1982), and at the École polytechnique (1993-2006)

    Title: Structures of Poisson Geometry: old and new

Abstract: How did the brackets that Siméon-Denis Poisson introduced in 1809 evolve into the Poisson geometry of the 1970s? What are Poisson groups and, more generally, Poisson groupoids? In what sense does Dirac geometry generalize Poisson geometry, and why is it relevant for applications? I shall sketch the definition of these structures and try to answer these questions.
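For reference (a standard definition, not a result of the talk): the bracket Poisson introduced assigns to two functions f, g on phase space, in canonical coordinates (q_i, p_i),

```latex
\{f, g\} \;=\; \sum_{i=1}^{n}
  \left(
    \frac{\partial f}{\partial q_i}\frac{\partial g}{\partial p_i}
    \;-\;
    \frac{\partial f}{\partial p_i}\frac{\partial g}{\partial q_i}
  \right).
% Bilinear, antisymmetric, satisfying the Jacobi identity and the
% Leibniz rule \{f, gh\} = \{f, g\}h + g\{f, h\}: precisely the axioms
% that define a general Poisson structure on a manifold.
```

The passage from this coordinate formula to brackets on arbitrary manifolds, groups and groupoids is the evolution the abstract alludes to.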


    • P. Libermann and C.-M. Marle, Symplectic Geometry and Analytical Mechanics, D. Reidel Publishing Company (1987).
    • J. E. Marsden and T. S. Ratiu, Introduction to Mechanics and Symmetry, Texts in Applied Mathematics 17, second edition, Springer (1998).
    • C. Laurent-Gengoux, A. Pichereau, and P. Vanhaecke, Poisson Structures, Grundlehren der mathematischen Wissenschaften 347, Springer (2013).
    • Y. Kosmann-Schwarzbach, Multiplicativity from Lie groups to generalized geometry, in Geometry of Jets and Fields (K. Grabowska et al., eds), Banach Center Publications 110, 2016.
    • Special volume of LMP on Poisson Geometry, guest editors, Anton Alekseev, Alberto Cattaneo, Y. Kosmann-Schwarzbach, and Tudor Ratiu, Letters in Mathematical Physics 90, 2009.
    • Y. Kosmann-Schwarzbach (éd.), Siméon-Denis Poisson : les Mathématiques au service de la science, Editions de l'Ecole Polytechnique (2013).
    • Y. Kosmann-Schwarzbach, The Noether Theorems: Invariance and Conservation Laws in the Twentieth Century, translated by B. E. Schwarzbach, Sources and Studies in the History of Mathematics and Physical Sciences, Springer (2011).

    Michel Broniatowski


    Sorbonne Université, Paris

    Title: Some insights on statistical divergences and choice of models

Abstract: Divergences between probability laws, or more generally between measures, define inferential criteria, or risk functions. Their estimation makes it possible to deal with the questions of model choice and statistical inference, in connection with the regularity of the models considered; depending on the nature of these models (parametric or semi-parametric), the nature of the criteria and their estimation methods vary. Representations of these divergences as large deviation rates for specific empirical measures allow their estimation in nonparametric or semi-parametric models, by making use of information theory results (Sanov's theorem and Gibbs principles) and by Monte Carlo methods. The question of the choice of divergence is wide open; an approach linking nonparametric Bayesian statistics and MAP estimators provides elements of understanding of the specificities of the various divergences in the Ali-Silvey-Csiszár-Arimoto class in relation to the specific choices of the prior distributions.
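As a small illustration (a generic sketch, not the estimators discussed in the talk): a divergence in the Ali-Silvey-Csiszár-Arimoto class is determined by a convex generator φ with φ(1) = 0. The hypothetical helper below evaluates it for discrete distributions; the choice φ(t) = t log t recovers the Kullback-Leibler divergence.

```python
import math

def f_divergence(p, q, phi):
    """Csiszar f-divergence D_phi(P||Q) = sum_i q_i * phi(p_i / q_i),
    for discrete distributions p, q with q_i > 0."""
    return sum(qi * phi(pi / qi) for pi, qi in zip(p, q))

def phi_kl(t):
    # Generator of the Kullback-Leibler divergence: phi(t) = t log t,
    # convex on (0, inf) with phi(1) = 0.
    return t * math.log(t) if t > 0 else 0.0

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
kl = f_divergence(p, q, phi_kl)       # strictly positive since p != q
kl_self = f_divergence(p, p, phi_kl)  # exactly 0 when the arguments coincide
```

By Jensen's inequality, any convex φ with φ(1) = 0 yields D_φ(P||Q) ≥ 0 with equality at P = Q; total variation and Hellinger distances arise from other choices of the generator.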


• Broniatowski, Michel; Stummer, Wolfgang. Some universal insights on divergences for statistics, machine learning and artificial intelligence. In Geometric Structures of Information; Signals Commun. Technol., Springer, Cham, pp. 149-211, 2019
    • Broniatowski, Michel. Minimum divergence estimators, Maximum Likelihood and the generalized bootstrap, to appear in "Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems" Entropy, 2020
    • Csiszár, Imre ; Gassiat, Elisabeth. MEM pixel correlated solutions for generalized moment and interpolation problems. IEEE Trans. Inform. Theory 45, no. 7, 2253–2270, 1999
    • Liese, Friedrich; Vajda, Igor. On divergences and informations in statistics and information theory. IEEE Trans. Inform. Theory 52, no. 10, 4394–4412, 2006

    Maurice de Gosson

    Professor, Senior Researcher at the University of Vienna
    Faculty of Mathematics, NuHAG group

    Title: Gaussian states from a symplectic geometry point of view

Abstract: Gaussian states play a ubiquitous role in quantum information theory and in quantum optics because they are easy to manufacture in the laboratory and have, in addition, important extremality properties. Of particular interest are their separability properties. Even though major advances have been made in their study in recent years, the topic is still largely open. In this talk we will discuss separability questions for Gaussian states from a rigorous point of view using symplectic geometry, and present some new results and properties.
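As background (a standard fact in this area, not a result of the talk): a Gaussian state is characterized by its covariance matrix Σ, and Σ corresponds to a bona fide quantum state exactly when it satisfies the uncertainty condition

```latex
\Sigma + \frac{i\hbar}{2}\, J \;\succeq\; 0,
\qquad
J = \begin{pmatrix} 0 & I_n \\ -I_n & 0 \end{pmatrix},
% where J is the standard symplectic matrix. The condition is
% invariant under the symplectic group Sp(2n, R), which is why
% symplectic geometry is the natural language for separability
% questions about Gaussian states.
```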


    • M. de Gosson, On the Disentanglement of Gaussian Quantum States by Symplectic Rotations. C.R. Acad. Sci. Paris Volume 358, issue 4, 459-462 (2020)
    • M. de Gosson, On Density Operators with Gaussian Weyl symbols, In Advances in Microlocal and Time-Frequency Analysis, Springer (2020)
    • M. de Gosson, Symplectic Coarse-Grained Classical and Semiclassical Evolution of Subsystems: New Theoretical Aspects, J. Math. Phys. no. 9, 092102 (2020)
• E. Cordero, M. de Gosson, and F. Nicola, On the Positivity of Trace Class Operators, Advances in Theoretical and Mathematical Physics 23(8), 2061-2091 (2019)
• E. Cordero, M. de Gosson, and F. Nicola, A characterization of modulation spaces by symplectic rotations, J. Funct. Anal. 278(11), 108474 (2020)

    Giuseppe LONGO
Centre Cavaillès, CNRS & ENS Paris, and School of Medicine, Tufts University, Boston

Title: Use and abuse of "digital information" in life sciences: is Geometry of Information a way out?

Abstract: Since WWII, the war of coding, and the elucidation of the structure of DNA (1953), the latter has been considered the digital encoding of the Aristotelian homunculus. To this day, DNA is viewed as the "information carrier" of ontogenesis and the main or unique player and pilot of phylogenesis. This has heavily affected our understanding of life and reinforced a mechanistic view of organisms and ecosystems, a component of our disruptive attitude towards ecosystemic dynamics. A different insight into DNA, as a major constraint on morphogenetic processes, brings in a possible "geometry of information" for biology, yet to be invented. One of the challenges is the need to move from a classical analysis of morphogenesis, in physical terms, to a "heterogenesis" more proper to the historicity of biology.


• Arezoo Islami, Giuseppe Longo. Marriages of Mathematics and Physics: a challenge for Biology, Invited Paper, in The Necessary Western Conjunction to the Eastern Philosophy of Exploring the Nature of Mind and Life (K. Matsuno et al., eds), Special Issue of Progress in Biophysics and Molecular Biology, Vol. 131, pages 179-192, December 2017. (DOI) (SpaceTimeIslamiLongo.pdf)
    • Giuseppe Longo. How Future Depends on Past Histories and Rare Events in Systems of Life, Foundations of Science, (DOI), 2017 (biolog-observ-history-future.pdf)
    • Giuseppe Longo. Information and Causality: Mathematical Reflections on Cancer Biology. In Organisms. Journal of Biological Sciences, vo. 2, n. 1, 2018. (BiologicalConseq-ofCompute.pdf)
    • Giuseppe Longo. Information at the Threshold of Interpretation, Science as Human Construction of Sense. In Bertolaso, M., Sterpetti, F. (Eds.) A Critical Reflection on Automated Science – Will Science Remain Human? Springer, Dordrecht, 2019 (Information-Interpretation.pdf)
    • Giuseppe Longo, Matteo Mossio. Geocentrism vs genocentrism: theories without metaphors, metaphors without theories. In Interdisciplinary Science Reviews, 45 (3), pp. 380-405, 2020. (Metaphors-geo-genocentrism.pdf)

posted in GSI2021
  • Geo-Sci-Info


Welcome to the "Geometric Science of Information" 2021 Conference

On behalf of both the organizing and the scientific committees, it is our great pleasure to welcome all delegates, representatives and participants from around the world to the fifth international SEE conference on "Geometric Science of Information" (GSI'21), scheduled for July 2021.

GSI'21 benefits from scientific and financial sponsors.

The 3-day conference is also organized within the framework of the relations set up between SEE and scientific institutions or academic laboratories such as Ecole Polytechnique, Ecole des Mines ParisTech, INRIA, CentraleSupélec, Institut Mathématique de Bordeaux and Sony Computer Science Laboratories.

The GSI conference cycle was initiated by the Brillouin Seminar Team as early as 2009. GSI'21 continues the first initiatives launched in 2013 at Mines ParisTech, consolidated in 2015 at Ecole Polytechnique, and opened to new communities in 2017 at Mines ParisTech and in 2019 at ENAC Toulouse. In 2011, we organized an Indo-French workshop on "Matrix Information Geometry" that yielded an edited book in 2013, and in 2017 we collaborated on the CIRM seminar TGSI'17 "Topological & Geometrical Structures of Information" in Luminy. The GSI'19 proceedings were published by Springer in its Lecture Notes series.

GSI satellite events were organized in 2019 and 2020: FGSI'19 "Foundation of Geometric Science of Information" in Montpellier, and the Les Houches seminar SPIGL'20 "Joint Structures and Common Foundations of Statistical Physics, Information Geometry and Inference for Learning".

The technical program of GSI'21 covers all the main topics and highlights in the domain of "Geometric Science of Information", including information geometry manifolds of structured data/information and their advanced applications. These proceedings consist solely of original research papers that have been carefully peer-reviewed by two or three experts and revised before acceptance.

    Historical background

As with GSI'13, GSI'15, GSI'17 and GSI'19, GSI'21 addresses inter-relations between different mathematical domains like shape spaces (geometric statistics on manifolds and Lie groups, deformations in shape space, ...), probability/optimization & algorithms on manifolds (structured matrix manifolds, structured data/information, ...), relational and discrete metric spaces (graph metrics, distance geometry, relational analysis, ...), computational and Hessian information geometry, geometric structures in thermodynamics and statistical physics, algebraic/infinite-dimensional/Banach information manifolds, divergence geometry, tensor-valued morphology, optimal transport theory, manifold & topology learning, ... and applications like geometries of audio processing, inverse problems and signal/image processing. GSI'21 topics were enriched with contributions from Lie Group Machine Learning, Harmonic Analysis on Lie Groups, Geometric Deep Learning, Geometry of Hamiltonian Monte Carlo, Geometric & (Poly)Symplectic Integrators, Contact Geometry & Hamiltonian Control, Geometric and structure preserving discretizations, Probability Density Estimation & Sampling in High Dimension, Geometry of Graphs and Networks, and Geometry in Neuroscience & Cognitive Sciences.

At the turn of the century, new and fruitful interactions were discovered between several branches of science: Information Science (information theory, digital communications, statistical signal processing, ...), Mathematics (group theory, geometry and topology, probability, statistics, sheaf theory, ...) and Physics (geometric mechanics, thermodynamics, statistical physics, quantum mechanics, ...). The GSI conference cycle is an attempt to discover joint mathematical structures common to all these disciplines by elaborating a "General Theory of Information" embracing physics, information science and cognitive science in a global scheme.

    Frank Nielsen, co-chair : Ecole Polytechnique, Palaiseau, France, Sony Computer Science Laboratories, Tokyo, Japan


    Frédéric Barbaresco, co-chair: President of SEE ISIC Club (Ingénierie des Systèmes d'Information et de Communications),
    Representative of KTD PCC (Key Technology Domain / Processing, Computing & Cognition ) Board, THALES LAND & AIR SYSTEMS, France

posted in GSI2021
  • Geo-Sci-Info



As for GSI'13, GSI'15, GSI'17 and GSI'19, the objective of this SEE GSI'21 conference, hosted in PARIS, is to bring together pure/applied mathematicians and engineers with a common interest in geometric tools and their applications for information analysis.
    It emphasizes an active participation of young researchers to discuss emerging areas of collaborative research on “Geometric Science of Information and their Applications”.
Current and ongoing uses of Information Geometry Manifolds in applied mathematics include: Advanced Signal/Image/Video Processing, Complex Data Modeling and Analysis, Information Ranking and Retrieval, Coding, Cognitive Systems, Optimal Control, Statistics on Manifolds, Topology/Machine/Deep Learning, Artificial Intelligence, Speech/Sound Recognition, Natural Language Processing, Big Data Analytics, Learning for Robotics, etc., all of which are substantially relevant for industry.
    The Conference will be therefore held in areas of topics of mutual interest with the aim to:
    • Provide an overview on the most recent state-of-the-art
    • Exchange mathematical information/knowledge/expertise in the area
    • Identify research areas/applications for future collaboration

    Provisional topics of interests:

    • Geometric Deep Learning (ELLIS session)
    • Probability on Riemannian Manifolds
• Optimization on Manifolds
    • Shape Space & Statistics on non-linear data
    • Lie Group Machine Learning
    • Harmonic Analysis on Lie Groups
    • Statistical Manifold & Hessian Information Geometry
    • Monotone Embedding in Information Geometry
    • Non-parametric Information Geometry
    • Computational Information Geometry
    • Distance and Divergence Geometries
    • Divergence Statistics
    • Optimal Transport & Learning
    • Geometry of Hamiltonian Monte Carlo
    • Statistics, Information & Topology
    • Graph Hyperbolic Embedding & Learning
    • Inverse problems: Bayesian and Machine Learning interaction
    • Integrable Systems & Information Geometry
    • Geometric structures in thermodynamics and statistical physics
    • Contact Geometry & Hamiltonian Control
    • Geometric and structure preserving discretizations
    • Geometric & Symplectic Methods for Quantum Systems
    • Geometry of Quantum States
    • Geodesic Methods with Constraints
    • Probability Density Estimation & Sampling in High Dimension
    • Geometry of Tensor-Valued Data
    • Geometric Mechanics
    • Geometric Robotics & Learning
    • Topological and geometrical structures in neurosciences

    A special session will deal with:

• Geometric Structures Coding & Learning Libraries (geomstats, pyRiemann, POT, ...)

    Advanced information on article submission and publication
As for previous editions, the GSI'21 proceedings will be published in Springer LNCS; see the GSI'19 proceedings.
An 8-page Springer LNCS format is required for initial paper submission.
    A detailed call for contributions will be published shortly.


posted in GSI2021
  • Geo-Sci-Info



    Ph.D. and Postdoc positions in Applied Mathematics

    Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany

    Application deadline: January 10th, 2021

    The Group

    The Chair of Applied Analysis – Alexander von Humboldt Professorship at the Department of Mathematics of the Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU), in Erlangen (Germany), led by Prof. Dr. Enrique Zuazua, is looking for outstanding candidates to fill several

    Ph.D. and Postdoctoral positions

In the broad area of Applied Mathematics, the Chair develops and applies methods of Analysis, Computational Mathematics and Data Sciences to model, understand and control the dynamics of various phenomena arising at the interface of Mathematics with Engineering, Physics, Biology and Social Sciences.

    We welcome applications by young and highly motivated scientists to contribute to this exciting joint AvH-FAU effort. Possible research projects include but are not limited to:

    • Analysis of Partial Differential Equations (PDE).
    • The interplay between Data Sciences, numerics of PDE and Control Systems.
    • Control of diffusion models arising in Biology and Social Sciences.
    • Modelling and control of multi-agent systems.
    • Hyperbolic models arising in traffic flow and energy transport.
    • Waves in networks and Markov chains.
    • Fractional PDE.
    • Optimal design in Material Sciences.
    • Micro-macro limit processes.
    • The interplay between discrete and continuous modelling in design and control.
    • The emergence of turnpike phenomena in long-time horizons.
    • Inversion and parameter identification.
    • Recommendation systems.
    • Development of new computation tools and software.

    We look for excellent candidates with expertise in the areas of applied mathematics, PDE analysis, control theory, numerical analysis, data sciences and computational mathematics who enjoy interdisciplinary work.

    The Chair contributes to the development of a new Center of Research at FAU, in the broad area of “Mathematics of Data”, conceived as a highly visible interdisciplinary research site, an incubator for future collaborative research grants and a turntable for the key research priorities of FAU. The recruited candidates will have the added opportunity to participate in this challenging endeavour.

    How to apply

Applications, including a cover/motivation letter, curriculum vitae, list of publications, statement of research, and the names of two or three experts for reference, should be submitted via e-mail as a single PDF file to secretary-aa[at] before January 10th, 2021.

    Any inquiries about the positions should be sent to Prof. Enrique Zuazua (positions-aa[at] Applications will be accepted until the positions are filled.

    FAU is a member of “The Family in Higher Education Institutions” best practice club and also aims to increase the number of women in scientific positions. Female candidates are therefore particularly encouraged to apply. In case of equal qualifications, candidates with disabilities will take precedence.

    For more detailed information about the Chair, please visit Chair of Applied Analysis – Alexander von Humboldt Professorship

posted in Jobs offers - Call for projects
  • Geo-Sci-Info

    GeomLoss : Geometric Loss functions between sampled measures, images and volumes

Find all the docs and tutorials for version 0.2.3 on the Read the Docs website.

N.B.: This is still an alpha release! Please send me your feedback: I will polish the user interface, implement Hausdorff divergences, add support for meshes, images and volumes, and clean the documentation over the summer of 2020.

    The GeomLoss library provides efficient GPU implementations for:

    • Kernel norms (also known as Maximum Mean Discrepancies).

    • Hausdorff divergences, which are positive definite generalizations of the ICP loss, analogous to log-likelihoods of Gaussian Mixture Models.

    • Unbiased Sinkhorn divergences, which are cheap yet positive definite approximations of Optimal Transport (Wasserstein) costs.

    These loss functions, defined between positive measures, are available through the custom PyTorch layers SamplesLoss, ImagesLoss and VolumesLoss which allow you to work with weighted point clouds (of any dimension), density maps and volumetric segmentation masks. Geometric losses come with three backends each:

    • A simple tensorized implementation, for small problems (< 5,000 samples).

    • A reference online implementation, with a linear (instead of quadratic) memory footprint, that can be used for finely sampled measures.

    • A very fast multiscale code, which uses an octree-like structure for large-scale problems in dimension <= 3.

    GeomLoss is a simple interface for cutting-edge Optimal Transport algorithms. It provides:

    • Support for batchwise computations.
    • Linear (instead of quadratic) memory footprint for large problems, relying on the KeOps library for map-reduce operations on the GPU.
    • Fast kernel truncation for small bandwidths, using an octree-based structure.
    • Log-domain stabilization of the Sinkhorn iterations, eliminating numeric overflows for small values of 𝜀
    • Efficient computation of the gradients, which bypasses the naive backpropagation algorithm.
    • Support for unbalanced Optimal Transport, with a softening of the marginal constraints through a maximum reach parameter.
    • Support for the ε-scaling heuristic in the Sinkhorn loop, with kernel truncation in dimensions 1, 2 and 3. On typical 3D problems, our implementation is 50-100 times faster than the standard SoftAssign/Sinkhorn algorithm.

    Note, however, that SamplesLoss does not implement the Fast Multipole or Fast Gauss transforms. If you are aware of a well-packaged implementation of these algorithms on the GPU, please contact me!

The divergences implemented here are all symmetric, positive definite and therefore suitable for measure-fitting applications. For positive input measures 𝛼 and 𝛽, our Loss functions are such that

Loss(𝛼,𝛽) = Loss(𝛽,𝛼),
0 = Loss(𝛼,𝛼) ⩽ Loss(𝛼,𝛽),
0 = Loss(𝛼,𝛽) ⟺ 𝛼=𝛽.
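These properties can be checked on a minimal NumPy sketch of the unbiased Sinkhorn divergence (an independent toy implementation for illustration, not GeomLoss's actual code, which adds the multiscale and truncation optimizations listed above):

```python
import numpy as np

def _lse(M, axis):
    # Numerically stable log-sum-exp reduction along one axis.
    m = M.max(axis=axis, keepdims=True)
    return (m + np.log(np.exp(M - m).sum(axis=axis, keepdims=True))).squeeze(axis)

def ot_eps(x, y, a, b, eps=0.5, iters=300):
    # Entropic OT cost between weighted point clouds (a, x) and (b, y),
    # via log-domain Sinkhorn iterations on the dual potentials f, g.
    C = 0.5 * ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)  # |x_i - y_j|^2 / 2
    f, g = np.zeros(len(a)), np.zeros(len(b))
    for _ in range(iters):
        f = -eps * _lse((g[None, :] - C) / eps + np.log(b)[None, :], axis=1)
        g = -eps * _lse((f[:, None] - C) / eps + np.log(a)[:, None], axis=0)
    return a @ f + b @ g  # dual objective at (approximate) convergence

def sinkhorn_divergence(x, y, a, b, eps=0.5):
    # Debiased ("unbiased") Sinkhorn divergence: subtracting the two
    # self-transport terms restores positivity and Loss(alpha, alpha) = 0.
    return (ot_eps(x, y, a, b, eps)
            - 0.5 * ot_eps(x, x, a, a, eps)
            - 0.5 * ot_eps(y, y, b, b, eps))

# Toy check of the three properties listed above:
rng = np.random.default_rng(0)
x, y = rng.random((6, 2)), rng.random((5, 2)) + 0.3
a, b = np.full(6, 1 / 6), np.full(5, 1 / 5)
s_xy = sinkhorn_divergence(x, y, a, b)
s_yx = sinkhorn_divergence(y, x, b, a)
s_xx = sinkhorn_divergence(x, x, a, a)
```

On two distinct clouds this returns a strictly positive value, it is symmetric in its arguments, and it vanishes when the two measures coincide.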

    GeomLoss can be used in a wide variety of settings, from shape analysis (LDDMM, optimal transport…) to machine learning (kernel methods, GANs…) and image processing. Details and examples are provided below:

    GeomLoss is licensed under the MIT license.

    Author and Contributors

    Feel free to contact us for any bug report or feature request:

    Related projects

    You may be interested by:

    • The KeOps library, which provides efficient CUDA routines for point cloud processing, with full PyTorch support.

    • Rémi Flamary and Nicolas Courty’s Python Optimal Transport library, which provides a reference implementation of OT-related methods for small problems.

    • Bernhard Schmitzer’s Optimal Transport toolbox, which provides a reference multiscale solver for the OT problem, on the CPU.

posted in GSI FORGE