
GeoSciInfo
Company presentation
Since 2002, Median Technologies has been expanding the boundaries of the identification, interpretation, analysis, and reporting of imaging data in the medical world. Our core activity is to develop advanced imaging software solutions and platforms for clinical drug development in oncology, diagnostic support, and cancer patient care. Our software solutions improve the management of cancer patients by helping to better identify pathologies and to develop and select patient-specific therapies (precision medicine). The company employs a highly qualified team and leverages its scientific, technical, medical, and regulatory expertise to develop innovative medical imaging analysis software based on artificial intelligence, cloud computing, and big data. We are driven by our core values. These values define who we are, what we do, the way we do it, and what we, as Median, aspire to:
• Leading innovation with purpose
• Committing to quality in all we do
• Supporting our customers in achieving their goals
• Always remembering to put the patient first
Today, we are a team of 130+ people. Most of us are based at our HQ in Sophia Antipolis (French Riviera), and we have a subsidiary in the US and another one in China. Our company is growing in a fulfilling international and multicultural environment.
Job description
In the context of our research and development in artificial intelligence applied to medical imaging, we are looking for: Data Science and Machine Learning Research Scientist (M/F). Integrated into a multidisciplinary research and development team within the iBiopsy® project, you will carry out research and development of innovative medical imaging solutions using machine learning and other AI methods.
Medical imaging is one of the fastest growing fields in machine learning. We are looking for an enthusiastic, dynamic, and organized Data Scientist with strong ML experience and excellent communication skills, who will thrive at the heart of technological innovation.
Assignments
o Position under the supervision of the Head of Data Science
o Responsibilities:

You will apply your AI/ML/Deep Learning knowledge to develop innovative and robust biomarkers using data coming from medical imaging systems such as MRI and CT scanners and other data sources.

Your work will involve research and development of novel machine learning algorithms and systems. As part of our front-end innovation organization, you will actively scout, track, evaluate, and leverage disruptive technologies and emerging industrial, academic, and technological trends.

You will work closely with iBiopsy’s software development team as well as with its clinical science team.

In addition, you will transfer technology, and share insights and best practices across innovation teams. You will generate intellectual property for the company. You will be expected to author peer-reviewed papers and present results at industry and scientific conferences.

We look to you to build breakthrough AI-enabled imaging solutions leveraging cloud computing, and to apply supervised and unsupervised machine learning techniques to create value from the imaging and clinical data repositories generated by our medical research and pharmaceutical industry partners. These AI-enabled systems and services go beyond image analysis to transform medical practice and drug development.
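The kind of supervised modelling described above can be illustrated with a minimal, self-contained sketch: a plain logistic regression fitted by gradient descent on synthetic "imaging-derived features". Everything here (data, dimensions, learning rate) is an illustrative assumption, not Median's actual pipeline or iBiopsy® code.

```python
import numpy as np

# Illustrative sketch only: supervised biomarker-style modelling on
# synthetic data standing in for features extracted from CT/MRI images.
rng = np.random.default_rng(0)

n, d = 200, 5                          # 200 synthetic lesions, 5 features each
X = rng.normal(size=(n, d))
true_w = np.array([1.5, -2.0, 0.5, 0.0, 1.0])   # hypothetical ground truth
y = (X @ true_w + 0.1 * rng.normal(size=n) > 0).astype(float)

w = np.zeros(d)
for _ in range(500):                   # plain gradient descent on the log-loss
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    w -= 0.5 * (X.T @ (p - y)) / n

pred = (1.0 / (1.0 + np.exp(-(X @ w))) > 0.5).astype(float)
accuracy = (pred == y).mean()
print(round(accuracy, 2))
```

In practice a deep network (e.g., in PyTorch, as the profile below mentions) would replace the linear model, but the train/evaluate loop has the same shape.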
Profile required
o Education: PhD in Mathematics, Computer Science, or related fields
o Main skills and experience required:
• Minimum 3 years of relevant work experience in (deep) machine learning
• Experience with medical imaging, CT/MRI, image signatures, large-scale visual information retrieval, feature selection
• Relevant experience with Python, DL frameworks (e.g., PyTorch), and standard packages such as scikit-learn, NumPy, SciPy, Pandas
• Semi-Supervised Learning, Self-Supervised Learning, Reinforcement Learning, Adversarial methods
• Multimodal feature extraction
• Author of related research publications / conference papers
• Strong experience with open-source technologies to accelerate innovation
Knowledge:
• In-depth technical knowledge of AI, deep learning, and computer vision
• Strong fundamental knowledge of statistical data processing, regression techniques, neural networks, decision trees, clustering, pattern recognition, probability theory, stochastic systems, Bayesian inference, statistical techniques, and dimensionality reduction
Additional qualities:
• Strong interpersonal, communication, and presentation skills, as well as the ability to work in a global team
• Fluent in written and oral English 

GeoSciInfo
Doctoral fellowships: CNRS and the University of Tokyo
The CNRS and the University of Tokyo will fund doctoral fellowships in the humanities and social sciences, artificial intelligence, quantum science, climate change, and molecular and cellular biology. Application deadline: April 22, 2021.
To apply: https://international.cnrs.fr/wpcontent/uploads/2021/02/GuidelinesPhDJointprogramCNRSUTokyo1.pdf

GeoSciInfo
We offer in total 8 funded PhD positions associated with the graduate school on "Computational Cognition" (https://www.comcocms.uniosnabrueck.de/en/open_positions.html), at the interface between Cognitive Science, Machine Learning, Computational Neuroscience, and AI.
The RTG Computational Cognition aims at reintegrating Cognitive Science and Artificial Intelligence. PhD students of the RTG will be educated in both subjects in order to combine the findings of these fields and thus get a better understanding of human and machine intelligence. Research fields involved in the RTG are Neuroinformatics, Neurobiopsychology, Bio-Inspired Computer Vision, Knowledge-Based Systems, Cognitive Natural Language Processing & Communication, Cognitive Modeling, Artificial Intelligence, Psycho-/Neurolinguistics, Computational Linguistics, and Cognitive Computing.
The RTG focuses on the integration of two research fields. Further information on the RTG is available at www.comco.uniosnabrueck.de. Detailed information on the core areas of the offered PhD projects can be obtained from the spokesmen of the RTG, Prof. Dr. Gordon Pipa (gpipa[at]uniosnabrueck.de) and Prof. Dr. Peter König (pkoenig[at]uniosnabrueck.de). The RTG is incorporated into the Cognitive Science PhD program founded in 2002. PhD students of the RTG will take advantage of an interdisciplinary environment, which nevertheless focuses on a common research topic and offers a broad range of methodological synergies between the projects.
Required Qualifications:
Applicants are expected to have an academic degree (Master/Diploma), experience in at least one of the domains listed above, proven experience in interdisciplinary work, as well as a good command of the English language.
Osnabrück University is committed to helping working/studying parents balance their family and working lives.
Osnabrück University seeks to guarantee equality of opportunity for women and men and strives to correct any gender imbalance in its schools and departments.
If two candidates are equally qualified, preference will be given to the candidate with disability status.
Applications with the usual documentation should be submitted by email in a single PDF file to the director of the Institute of Cognitive Science, Prof. Dr. Gunther Heidemann (gheidema[at]uniosnabrueck.de) with a cc to office[at]ikw.uniosnabrueck.de no later than April 19, 2021.
For additional information, for example on specific research projects, you can contact the coordinator Gabriela Pipa (gapipa[at]uos.de).

Professor and Chair of the Neuroinformatics Department
Dr. rer. nat. Gordon Pipa
Institute of Cognitive Science, Room 50/218
University of Osnabrueck
Wachsbleiche 27, 49090 Osnabrück, Germany
tel. +49 (0) 5419692277
fax (private). +49 (0) 5405 500 80 98
home office. +49 (0) 5405 500 90 95
email: gpipa[at]uos.de
webpage: http://www.ni.uos.de
research gate: https://www.researchgate.net/profile/Gordon_Pipa/?ev=prf_act
linkedin: https://de.linkedin.com/in/gordonpipa47771539
Personal Assistant and Secretary of the Neuroinformatics lab:
Anna Jungeilges
Tel. +49 (0)541 9692390
Fax +49 (0)541 9692246
Email: anna.jungeilges[at]uniosnabrueck.de
visit us on
http://www.facebook.com/CognitiveScienceOsnabruck
https://twitter.com/#!/CogSciUOS 
GeoSciInfo
Statistics, Information and Topology
Co-chairs of the session:
 Michel N'Guiffo Boyom: Université Toulouse
 Pierre Baudot: Median (link)
This session will focus on advances of information theory, probability, and statistics in Algebraic Topology (see [1-56] below). The field is currently undergoing an impressive development, both on the side of the categorical, homotopical, or topos foundations of probability theory and statistics, and on the side of the characterisation of information functions in cohomology and homotopy theory.
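As a one-equation illustration of the "information functions in cohomology" theme (following, e.g., the homological characterisation of entropy in [22] below), the chain rule of Shannon entropy can be read as a cocycle condition:

```latex
% Chain rule of Shannon entropy, read as a 1-cocycle condition
% (illustrative summary; see Baudot & Bennequin, ref. [22] below).
H(X, Y) \;=\; H(X) \;+\; \sum_{x} P(X = x)\, H\!\left(Y \mid X = x\right)
% i.e. F(XY) = F(X) + X.F(Y), with the action
% (X.F)(P) = \sum_x P(X=x)\, F(P|_{X=x});
% up to a multiplicative constant, entropy is the unique such 1-cocycle.
```

This is the sense in which entropy acquires a cohomological meaning in the references below.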
Bibliographical references: (to be completed)
[1] Cencov, N.N. Statistical Decision Rules and Optimal Inference. Translations of Mathematical Monographs. 1982.
[2] Ay, N. and Jost, J. and Lê, H.V. and Schwachhöfer, L. Information geometry and sufficient statistics. Probability Theory and Related Fields 2015 PDF
[3] Cathelineau, J. Sur l’homologie de sl2 à coefficients dans l’action adjointe, Math. Scand., 63, 51-86, 1988. PDF
[4] Kontsevich, M. The 1+1/2 logarithm. Unpublished note, reproduced in Elbaz-Vincent & Gangl 2002, 1995. PDF
[5] Elbaz-Vincent, P., Gangl, H. On poly(ana)logs I. Compositio Mathematica, 130(2), 161-214, 2002. PDF
[6] Tomasic, I., Independence, measure and pseudofinite fields. Selecta Mathematica, 12, 271-306. arXiv. 2006.
[7] Connes, A., Consani, C., Characteristic 1, entropy and the absolute point. preprint arXiv:0911.3537v1. 2009.
[8] Marcolli, M. & Thorngren, R. Thermodynamic Semirings, arXiv 10.4171 / JNCG/159, Vol. abs/1108.2874, 2011.
[9] Abramsky, S., Brandenburger, A., The sheaf-theoretic structure of non-locality and contextuality, New Journal of Physics, 13 (2011). PDF
[10] Gromov, M. In a Search for a Structure, Part 1: On Entropy, unpublished manuscript, 2013. PDF
[11] McMullen, C.T., Entropy and the clique polynomial, 2013. PDF
[12] Marcolli, M. & Tedeschi, R. Entropy algebras and Birkhoff factorization, arXiv, Vol. abs/1108.2874, 2014.
[13] Doering, A., Isham, C.J., Classical and Quantum Probabilities as Truth Values, arXiv:1102.2213, 2011 PDF
[14] Baez, J., Fritz, T. & Leinster, T. A Characterization of Entropy in Terms of Information Loss. Entropy, 13, 1945-1957, 2011. PDF
[15] Baez, J.C., Fritz, T. A Bayesian characterization of relative entropy. Theory and Applications of Categories, Vol. 29, No. 16, pp. 422-456, 2014. PDF
[16] Drummond-Cole, G.C., Park, J.S., Terilla, J., Homotopy probability theory I. J. Homotopy Relat. Struct., November 2013. PDF
[17] Drummond-Cole, G.C., Park, J.S., Terilla, J., Homotopy probability theory II. J. Homotopy Relat. Struct., April 2014. PDF
[18] Burgos Gil J.I., Philippon P., Sombra M., Arithmetic geometry of toric varieties. Metrics, measures and heights, Astérisque 360. 2014 . PDF
[19] Gromov, M. Symmetry, probability, entropy. Entropy 2015. PDF
[20] Gromov, M. Morse Spectra, Homology Measures, Spaces of Cycles and Parametric Packing Problems, april 2015. PDF
[21] Park, J.S., Homotopy theory of probability spaces I: classical independence and homotopy Lie algebras. arXiv, 2015.
[22] Baudot P., Bennequin D. The homological nature of entropy. Entropy, 17, 166; 2015. PDF
[23] Elbaz-Vincent, P., Gangl, H., Finite polylogarithms, their multiple analogues and the Shannon entropy. (2015) Vol. 9389, Lecture Notes in Computer Science, 277-285, arXiv.
[24] M. Marcolli, Information algebras and their applications. International Conference on Geometric Science of Information (2015), 271-276
[25] Abramsky S., Barbosa R.S., Lal K.K.R., Mansfield, S., Contextuality, Cohomology and Paradox. 2015. arXiv:1502.03097
[26] M. Nguiffo Boyom, Foliations-Webs-Hessian Geometry-Information Geometry-Entropy and Cohomology. Entropy 18(12): 433 (2016) PDF
[27] M. Nguiffo Boyom, A. Zeglaoui, Amari Functors and Dynamics in Gauge Structures. GSI 2017: 170-178
[28] G.C. Drummond-Cole, J. Terilla, Homotopy probability theory on a Riemannian manifold and the Euler equation, New York Journal of Mathematics, Volume 23 (2017), 1065-1085. PDF
[29] P. Forré, J.M. Mooij. Constraint-based Causal Discovery for Non-Linear Structural Causal Models with Cycles and Latent Confounders. In A. Globerson, & R. Silva (Eds.) (2018), pp. 269-278
[30] T. Fritz and P. Perrone, Bimonoidal Structure of Probability Monads. Proceedings of MFPS 34, ENTCS, (2018). PDF
[31] JaeSuk Park, Homotopical Computations in Quantum Fields Theory, (2018) arXiv:1810.09100 PDF
[32] G.C. Drummond-Cole, An operadic approach to operator-valued free cumulants. Higher Structures (2018) 2, 42–56. PDF
[33] G.C. Drummond-Cole, A non-crossing word cooperad for free homotopy probability theory. MATRIX Book (2018) Series 1, 77–99. PDF
[34] T. Fritz and P. Perrone, A Probability Monad as the Colimit of Spaces of Finite Samples, Theory and Applications of Categories 34, 2019. PDF.
[35] M. Esfahanian, A new quantum probability theory, quantum information functor and quantum gravity. (2019) PDF
[36] T. Leinster, Entropy modulo a prime, (2019) arXiv:1903.06961 PDF
[37] T. Leinster, E. Roff, The maximum entropy of a metric space, (2019) arXiv:1908.11184 PDF
[38] T. Maniero, Homological Tools for the Quantum Mechanic. arXiv 2019, arXiv:1901.02011. PDF
[39] M. Marcolli, Motivic information, Bollettino dell'Unione Matematica Italiana (2019) 12 (1-2), 19-41
[40] J.P. Vigneaux, Information theory with finite vector spaces, IEEE Transactions on Information Theory, vol. 65, no. 9, pp. 5674-5687, Sept. (2019)
[41] Baudot P., Tapia M., Bennequin D., Goaillard J.M., Topological Information Data Analysis. (2019), Entropy, 21(9), 869
[42] Baudot P., The Poincaré-Shannon Machine: Statistical Physics and Machine Learning Aspects of Information Cohomology. (2019), Entropy, 21(9)
[43] G. Sergeant-Perthuis, Bayesian/Graphoid intersection property for factorisation models, (2019), arXiv:1903.06026
[44] J.P. Vigneaux, Topology of Statistical Systems: A Cohomological Approach to Information Theory, PhD Thesis (2019).
[45] Forré, P., & Mooij, J. M. (2019). Causal Calculus in the Presence of Cycles, Latent Confounders and Selection Bias. In A. Globerson, & R. Silva (Eds.), Proceedings of the ThirtyFifth Conference on Uncertainty in Artificial Intelligence: UAI 2019, (2019)
[46] Y. Manin, M. Marcolli Homotopy Theoretic and Categorical Models of Neural Information Networks. arXiv (2020) preprint arXiv:2006.15136
[47] T. Leinster The categorical origins of Lebesgue integration (2020) arXiv:2011.00412 PDF
[48] T. Fritz, T. Gonda, P. Perrone, E. Fjeldgren Rischel, Representable Markov Categories and Comparison of Statistical Experiments in Categorical Probability. (2020) arXiv:2010.07416 PDF
[49] T. Fritz, E. Fjeldgren Rischel, Infinite products and zeroone laws in categorical probability (2020) arXiv:1912.02769 PDF
[50] T. Fritz, A synthetic approach to Markov kernels, conditional independence and theorems on sufficient statistics (2020) arXiv:1908.07021 PDF
[51] T. Fritz and P. Perrone, Stochastic Order on Metric Spaces and the Ordered Kantorovich Monad, Advances in Mathematics 366, 2020. PDF
[52] T. Fritz and P. Perrone, Monads, partial evaluations, and rewriting. Proceedings of MFPS 36, ENTCS, 2020. PDF.
[53] D. Bennequin, G. Sergeant-Perthuis, O. Peltre, and J.P. Vigneaux, Extra-fine sheaves and interaction decompositions, (2020) arXiv:2009.12646
[54] J.P. Vigneaux, Information structures and their cohomology, in Theory and Applications of Categories, Vol. 35, (2020), No. 38, pp 14761529.
[55] O. Peltre, Message-Passing Algorithms and Homology, PhD Thesis (2020), arXiv:2009.11631
[56] G. Sergeant-Perthuis, Interaction decomposition for presheafs, (2020) arXiv:2008.09029
[57] K. Hess, Topological adventures in neuroscience, in the Proceedings of the 2018 Abel Symposium: Topological Data Analysis, Springer Verlag, (2020).
[58] C. Curto, N. Youngs. Neural ring homomorphisms and maps between neural codes. Submitted. arXiv.org preprint.
[59] N.C. Combe, Y. Manin, F-manifolds and geometry of information, arXiv:2004.08808v2, (2020) Bull. London Math. Soc.
[60] Abramsky, S. , Classical logic, classical probability, and quantum mechanics 2020 arXiv:2010.13326 
GeoSciInfo
Topology and geometry in neuroscience
chairs of the sessions
Topics
This session will focus on advances of Algebraic Topology and geometrical methods in neuroscience (see [1-105] below, among many others). The field is currently undergoing an impressive development coming both:
_ from the theoretical neuroscience and machine learning fields, like Graph Neural Networks [30-42], Bayesian geometrical inference [27-29], message passing, probability and cohomology [92-95], Information Topology [53-54, 62-66, 96-105] or networks [83-85, 90-91], and higher-order n-body statistical interactions [67, 74, 94-95, 99, 101];
_ from topological data analysis applications to real neural recordings, ranging from subcellular [43, 51], genetic or omic expressions [81, 101], spiking dynamics and neural coding [1-25, 45-47, 50-52, 79], to cortical areas, fMRI, EEG [26, 67-72, 76-80, 84-89], linguistics [54-61] and consciousness [48, 53, 102].
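The graph-neural-network methods cited above can be made concrete with a minimal numpy-only sketch of one graph-convolution layer in the symmetric-normalisation style popularised by Kipf & Welling; the toy graph, feature sizes, and random weights are all illustrative assumptions, not code from any of the referenced works.

```python
import numpy as np

# One graph-convolution layer: H' = ReLU(Â H W), with
# Â = D^{-1/2} (A + I) D^{-1/2}. Toy 4-node path graph for illustration.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

A_hat = A + np.eye(4)                       # add self-loops
d = A_hat.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt    # symmetric normalisation

rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))                 # node features (e.g., per-region signals)
W = rng.normal(size=(3, 2))                 # learnable layer weights

H_next = np.maximum(A_norm @ H @ W, 0.0)    # ReLU activation
print(H_next.shape)
```

Stacking such layers, with the adjacency matrix built from a connectome or functional-connectivity graph, is the basic design behind the GNN references above.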
Bibliographical references: (to be completed)
Carina Curto, Nora Youngs and Vladimir Itskov and colleagues:
[1] C. Curto, N. Youngs. Neural ring homomorphisms and maps between neural codes. Submitted. arXiv.org preprint.
[2] C. Curto, J. Geneson, K. Morrison. Fixed points of competitive threshold-linear networks. Neural Computation, in press, 2019. arXiv.org preprint.
[3] C. Curto, A. Veliz-Cuba, N. Youngs. Analysis of combinatorial neural codes: an algebraic approach. Book chapter in Algebraic and Combinatorial Computational Biology. R. Robeva, M. Macaulay (Eds), 2018.
[4] C. Curto, V. Itskov. Combinatorial neural codes. Handbook of Discrete and Combinatorial Mathematics, Second Edition, edited by Kenneth H. Rosen, CRC Press, 2018. pdf
[5] C. Curto, E. Gross, J. Jeffries, K. Morrison, M. Omar, Z. Rosen, A. Shiu, N. Youngs. What makes a neural code convex? SIAM J. Appl. Algebra Geometry, vol. 1, pp. 222-238, 2017. pdf, SIAGA link, and arXiv.org preprint
[6] C. Curto. What can topology tell us about the neural code? Bulletin of the AMS, vol. 54, no. 1, pp. 63-78, 2017. pdf, Bulletin link.
[7] C. Curto, K. Morrison. Pattern completion in symmetric threshold-linear networks. Neural Computation, Vol 28, pp. 2825-2852, 2016. pdf, arXiv.org preprint.
[8] C. Giusti, E. Pastalkova, C. Curto, V. Itskov. Clique topology reveals intrinsic geometric structure in neural correlations. PNAS, vol. 112, no. 44, pp. 13455-13460, 2015. pdf, PNAS link.
[9] C. Curto, A. Degeratu, V. Itskov. Encoding binary neural codes in networks of threshold-linear neurons. Neural Computation, Vol 25, pp. 2858-2903, 2013. pdf, arXiv.org preprint.
[10] K. Morrison, C. Curto. Predicting neural network dynamics via graphical analysis. Book chapter in Algebraic and Combinatorial Computational Biology. R. Robeva, M. Macaulay (Eds), 2018. arXiv.org preprint,
[11] C. Curto, V. Itskov, A. Veliz-Cuba, N. Youngs. The neural ring: an algebraic tool for analyzing the intrinsic structure of neural codes. Bulletin of Mathematical Biology, Volume 75, Issue 9, pp. 1571-1611, 2013. arXiv.org preprint.
[12] C. Curto, V. Itskov, K. Morrison, Z. Roth, J.L. Walker. Combinatorial neural codes from a mathematical coding theory perspective. Neural Computation, Vol 25(7):1891-1925, 2013. arXiv.org preprint.
[13] C. Curto, A. Degeratu, V. Itskov. Flexible memory networks. Bulletin of Mathematical Biology, Vol 74(3):590-614, 2012. arXiv.org preprint.
[14] V. Itskov, C. Curto, E. Pastalkova, G. Buzsaki. Cell assembly sequences arising from spike threshold adaptation keep track of time in the hippocampus. Journal of Neuroscience, Vol. 31(8):2828-2834, 2011.
[15] K.D. Harris, P. Bartho, et al. How do neurons work together? Lessons from auditory cortex. Hearing Research, Vol. 271(1-2), 2011, pp. 37-53.
[16] P. Bartho, C. Curto, A. Luczak, S. Marguet, K.D. Harris. Population coding of tone stimuli in auditory cortex: dynamic rate vector analysis. European Journal of Neuroscience, Vol. 30(9), 2009, pp. 1767-1778.
[17] C. Curto, V. Itskov. Cell groups reveal structure of stimulus space. PLoS Computational Biology, Vol. 4(10): e1000205, 2008.
[18] E. Gross, N. K. Obatake, N. Youngs, Neural ideals and stimulus space visualization, Adv. Appl. Math., 95 (2018), pp. 65–95.
[19] C. Giusti, V. Itskov. A no-go theorem for one-layer feedforward networks. Neural Computation, 26 (11):2527-2540, 2014.
[20] V. Itskov, L.F. Abbott. Capacity of a Perceptron for Sparse Discrimination. Phys. Rev. Lett. 101(1), 2008.
[21] V. Itskov, E. Pastalkova, K. Mizuseki, G. Buzsaki, K.D. Harris. Theta-mediated dynamics of spatial information in hippocampus. Journal of Neuroscience, 28(23), 2008.
[22] V. Itskov, C. Curto, K.D. Harris. Valuations for spike train prediction. Neural Computation, 20(3), 644-667, 2008.
[23] E. Pastalkova, V. Itskov, A. Amarasingham, G. Buzsaki. Internally Generated Cell Assembly Sequences in the Rat Hippocampus. Science 321(5894):1322-1327, 2008.
[24] V. Itskov, A. Kunin, Z. Rosen. Hyperplane neural codes and the polar complex. To appear in the Abel Symposia proceedings, Vol. 15, 2019.
Alexander Ruys de Perez and colleagues:
[25] A. Ruys de Perez, L.F. Matusevich, A. Shiu, Neural codes and the factor complex, Advances in Applied Mathematics 114 (2020).
Sunghyon Kyeong and colleagues:
[26] Sunghyon Kyeong, Seonjeong Park, Keun-Ah Cheon, Jae-Jin Kim, Dong-Ho Song, and Eunjoo Kim, A New Approach to Investigate the Association between Brain Functional Connectivity and Disease Characteristics of Attention-Deficit/Hyperactivity Disorder: Topological Neuroimaging Data Analysis, PLOS ONE, 10 (9): e0137296, DOI: 10.1371/journal.pone.0137296 (2015)
Jonathan Pillow and colleagues:
[27] Aoi MC & Pillow JW (2017). Scalable Bayesian inference for high-dimensional neural receptive fields. bioRxiv 212217; doi: https://doi.org/10.1101/212217
[28] Aoi MC, Mante V, & Pillow JW. (2020). Prefrontal cortex exhibits multidimensional dynamic encoding during decision-making. Nat Neurosci.
[29] Calhoun AJ, Pillow JW, & Murthy M. (2019). Unsupervised identification of the internal states that shape natural behavior. Nature Neuroscience 22:2040-2049.
[30] Dong X, Thanou D, Toni L, et al., 2020, Graph Signal Processing for Machine Learning: A Review and New Perspectives, IEEE Signal Processing Magazine, Vol:37, ISSN:1053-5888, Pages:117-127
Michael Bronstein, Federico Monti, Giorgos Bouritsas and colleagues:
[31] G. Bouritsas, F. Frasca, S Zafeiriou, MM Bronstein, Improving graph neural network expressivity via subgraph isomorphism counting. arXiv (2020) preprint arXiv:2006.09252
[32] M. Bronstein , G. Pennycook, L. Buonomano, T.D. Cannon, Belief in fake news, responsiveness to cognitive conflict, and analytic reasoning engagement, Thinking and Reasoning (2020), ISSN: 13546783
[33] X. Dong, D. Thanou, L. Toni, M. Bronstein, P. Frossard, Graph Signal Processing for Machine Learning: A Review and New Perspectives, IEEE Signal Processing Magazine (2020), Vol: 37, Pages: 117127, ISSN: 10535888
[34] Y. Wang, Y. Sun, Z. Liu, S.E. Sarma, M. Bronstein, J.M. Solomon, Dynamic Graph CNN for Learning on Point Clouds, ACM Transactions on graphics (2020), Vol: 38, ISSN: 07300301
[35] M. Bronstein, J. Everaert, A. Castro, J. Joormann, T. D. Cannon, Pathways to paranoia: Analytic thinking and belief flexibility., Behav Res Ther (2019), Vol: 113, Pages: 1824
[36] G. Bouritsas, S. Bokhnyak, S. Ploumpis, M. Bronstein, S. Zafeiriou, Neural 3D Morphable Models: Spiral Convolutional Networks for 3D Shape Representation Learning and Generation, (2019) IEEE/CVF ICCV 2019, 7212
[37] O. Litany, A. Bronstein, M. Bronstein, A. Makadia et al., Deformable Shape Completion with Graph Convolutional Autoencoders (2018), Pages: 1886-1895, ISSN: 1063-6919
[38] R. Levie, F. Monti, X. Bresson, M. Bronstein, CayleyNets: Graph Convolutional Neural Networks with Complex Rational Spectral Filters, IEEE Transactions on Signal Processing (2018), Vol: 67, Pages: 97-109, ISSN: 1053-587X
[39] F. Monti, K. Otness, M. Bronstein, MotifNet: a motif-based graph convolutional network for directed graphs (2018), Pages: 225-228
[40] F. Monti, M. Bronstein, X. Bresson, Geometric matrix completion with recurrent multi-graph neural networks, Neural Information Processing Systems (2017), Pages: 3700-3710, ISSN: 1049-5258
[41] F. Monti, D. Boscaini, J. Masci, E. Rodola, J. Svoboda, M. Bronstein, Geometric deep learning on graphs and manifolds using mixture model CNNs, (2017) IEEE Conference on Computer Vision and Pattern Recognition, p. 33
[42] M. Bronstein, J. Bruna, Y. LeCun, A. Szlam, P. Vandergheynst et al., Geometric Deep Learning: Going beyond Euclidean data, IEEE Signal Processing Magazine (2017), Vol: 34, Pages: 18-42, ISSN: 1053-5888
Kathryn Hess and colleagues:
[43] L. Kanari, H. Dictus, W. Van Geit, A. Chalimourda, B. Coste, J. Shillcock, K. Hess, and H. Markram, Computational synthesis of cortical dendritic morphologies, bioRvix (2020) 10.1101/2020.04.15.040410, submitted.
[44] G. Tauzin, U. Lupo, L. Tunstall, J. Burella Pérez, M. Caorsi, A. Medina-Mardones, A. Dassatti, and K. Hess, giotto-tda: a topological data analysis toolkit for machine learning and data exploration, arXiv:2004.02551
[45] E. Mullier, J. Vohryzek, A. Griffa, Y. Alemàn-Gómez, C. Hacker, K. Hess, and P. Hagmann, Functional brain dynamics are shaped by connectome n-simplicial organization, (2020) submitted.
[46] M. Fournier, M. Scolamiero, et al., Topology predicts long-term functional outcome in early psychosis, Molecular Psychiatry (2020). https://doi.org/10.1038/s4138002008261.
[47] K. Hess, Topological adventures in neuroscience, in the Proceedings of the 2018 Abel Symposium: Topological Data Analysis, Springer Verlag, (2020).
[48] A. Doerig, A. Schurger, K. Hess, and M. H. Herzog, The unfolding argument: why IIT and other causal structure theories of consciousness are empirically untestable, Consciousness and Cognition 72 (2019) 49-59.
[49] L. Kanari, S. Ramaswamy, et al., Objective classification of neocortical pyramidal cells, Cerebral Cortex (2019) bhy339, https://doi.org/10.1093/cercor/bhy339.
[50] J.B. Bardin, G. Spreemann, K. Hess, Topological exploration of artificial neuronal network dynamics, Network Neuroscience (2019) https://doi.org/10.1162/netn_a_00080.
[51] L. Kanari, P. Dłotko, M. Scolamiero, R. Levi, J. C. Shillcock, K. Hess, and H. Markram, A topological representation of branching morphologies, Neuroinformatics (2017) doi: 10.1007/s1202101793411.
[52] M. W. Reimann, M. Nolte, et al., Cliques of neurons bound into cavities provide a missing link between structure and function, Front. Comput. Neurosci., 12 June (2017), doi: 10.3389/fncom.2017.00048.
Matilde Marcolli, Yuri Manin, and colleagues:
[53] Y. Manin, M. Marcolli Homotopy Theoretic and Categorical Models of Neural Information Networks. arXiv (2020) preprint arXiv:2006.15136
[54] M. Marcolli, Lumen Naturae: Visions of the Abstract in Art and Mathematics, MIT Press (2020)
[55] A. Port, T. Karidi, M. Marcolli, Topological Analysis of Syntactic Structures (2019) arXiv preprint arXiv:1903.05181
[56] M. Marcolli, Motivic information, Bollettino dell'Unione Matematica Italiana (2019) 12 (12), 1941
[57] A. Port, I. Gheorghita, D. Guth, J.M. Clark, C. Liang, S. Dasu, M. Marcolli, Persistent topology of syntax, Mathematics in Computer Science (2018) 12 (1), 33-50
[58] K. Shu, S. Aziz, V.L. Huynh, D. Warrick, M. Marcolli, Syntactic phylogenetic trees, Foundations of Mathematics and Physics One Century After Hilbert (2018), 417-441
[59] K. Shu, A. Ortegaray, R. Berwick, M. Marcolli, Phylogenetics of Indo-European language families via an algebro-geometric analysis of their syntactic structures. arXiv (2018) preprint arXiv:1712.01719
[60] K. Shu, M. Marcolli, Syntactic structures and code parameters, Mathematics in Computer Science (2018) 11 (1), 79-90
[61] K. Siva, J. Tao, M. Marcolli. Syntactic Parameters and Spin Glass Models of Language Change, Linguist. Anal. (2017) 41 (3-4), 559-608
[62] M. Marcolli, N. Tedeschi, Entropy algebras and Birkhoff factorization. Journal of Geometry and Physics (2015) 97, 243-265
[63] M. Marcolli, Information algebras and their applications. International Conference on Geometric Science of Information (2015), 271-276
[64] K. Siva, J. Tao, M. Marcolli, Spin glass models of syntax and language evolution, arXiv preprint (2015) arXiv:1508.00504
[65] Y. Manin, M. Marcolli, Kolmogorov complexity and the asymptotic bound for error-correcting codes, Journal of Differential Geometry (2014) 97 (1), 91-108
[66] M. Marcolli, R. Thorngren, Thermodynamic semirings, arXiv preprint (2011) arXiv:1108.2874
Bosa Tadić and colleagues:
[67] M. Andjelkovic, B. Tadic, R. Melnik, The topology of higherorder complexes associated with brainfunction hubs in human connectomes , available on arxiv.org/abs/2006.10357, published in Scientific Reports 10:17320 (2020)
[68] B. Tadic, M. Andjelkovic, M. Suvakov, G.J. Rodgers, Magnetisation Processes in Geometrically Frustrated Spin Networks with SelfAssembled Cliques, Entropy 22(3), 336 (2020)
[69] B. Tadic, M. Andjelkovic, R. Melnik, Functional Geometry of Human Connectomes, Nature: Scientific Reports 9:12060 (2019); previous version: Functional Geometry of Human Connectome and Robustness of Gender Differences, arXiv preprint arXiv:1904.03399, April 6, 2019
[70] B. Tadic, M. Andjelkovic, M. Suvakov, Origin of hyperbolicity in brain-to-brain coordination networks, Frontiers in Physics, vol. 6, article 10.3389/fphy.2018.00007 (2018) OA
[71] B. Tadic, M. Andjelkovic, Algebraic topology of multi-brain graphs: Methods to study the social impact and other factors onto functional brain connections, in Proceedings of BELBI (2016)
[72] B. Tadic, M. Andjelkovic, B.M. Boskoska, Z. Levnajic, Algebraic Topology of Multi-Brain Connectivity Networks Reveals Dissimilarity in Functional Patterns during Spoken Communications, PLOS ONE Vol 11(11), e0166787 (2016)
[73] M. Mitrovic and B. Tadic, Search for Weighted Subgraphs on Complex Networks with MLM, Lecture Notes in Computer Science, Vol. 5102, pp. 551-558 (2008)
Giovanni Petri, Francesco Vaccarino and collaborators:
[74] F. Battiston, G. Cencetti, et al., Networks beyond pairwise interactions: structure and dynamics, Physics Reports (2020), arXiv:2006.01764
[75] M. Guerra, A. De Gregorio, U. Fugacci, G. Petri, F. Vaccarino, Homological scaffold via minimal homology bases. arXiv (2020) preprint arXiv:2004.11606
[76] J. Billings, R. Tivadar, M.M. Murray, B. Franceschiello, G. Petri, Topological Features of Electroencephalography are Reference-Invariant, bioRxiv 2020
[77] J. Billings, M. Saggar, S. Keilholz, G. Petri, Topological Segmentation of Time-Varying Functional Connectivity Highlights the Role of Preferred Cortical Circuits, bioRxiv 2020
[78] E. Ibáñez-Marcelo, L. Campioni, et al., Topology highlights mesoscopic functional equivalence between imagery and perception: The case of hypnotizability. NeuroImage (2019) 200, 437-449
[79] P. Expert, L.D. Lord, M.L. Kringelbach, G. Petri. Topological neuroscience. Network Neuroscience (2019) 3 (3), 653-655
[80] C. Geniesse, O. Sporns, G. Petri, M. Saggar, Generating dynamical neuroimaging spatiotemporal representations (DyNeuSR) using topological data analysis. Network Neuroscience (2019) 3 (3), 763-778
[81] A. Patania, P. Selvaggi, M. Veronese, O. Dipasquale, P. Expert, G. Petri, Topological gene expression networks recapitulate brain anatomy and function. Network Neuroscience (2019) 3 (3), 744-762
[82] E. Ibáñez-Marcelo, L. Campioni, et al., Spectral and topological analyses of the cortical representation of the head position: Does hypnotizability matter? Brain and Behavior (2018) 9 (6), e01277
[83] G. Petri, A. Barrat, Simplicial activity driven model, Physical Review Letters 121 (22), 228301
[84] A. Phinyomark, E. Ibanez-Marcelo, G. Petri. Resting-state fMRI functional connectivity: Big data preprocessing pipelines and topological data analysis. IEEE Transactions on Big Data (2017) 3 (4), 415-428
[85] G. Petri, S. Musslick, B. Dey, K. Ozcimder, D. Turner, N.K. Ahmed, T. Willke. Topological limits to parallel processing capability of network architectures. arXiv preprint (2017) arXiv:1708.03263
[86] K. Ozcimder, B. Dey, S. Musslick, G. Petri, N.K. Ahmed, T.L. Willke, J.D. Cohen, A Formal Approach to Modeling the Cost of Cognitive Control, arXiv preprint (2017) arXiv:1706.00085
[87] L.D. Lord, P. Expert, et al., Insights into brain architectures from the homological scaffolds of functional connectivity networks, Frontiers in Systems Neuroscience (2016) 10, 85
[88] J. Binchi, E. Merelli, M. Rucco, G. Petri, F. Vaccarino. jHoles: A Tool for Understanding Biological Complex Networks via Clique Weight Rank Persistent Homology. Electron. Notes Theor. Comput. Sci. (2014) 306, 5-18
[89] G. Petri, P. Expert, F. Turkheimer, R. Carhart-Harris, D. Nutt, P.J. Hellyer et al., Homological scaffolds of brain functional networks. Journal of The Royal Society Interface (2014) 11 (101), 20140873
[90] G. Petri, M. Scolamiero, I. Donato, F. Vaccarino, Topological strata of weighted complex networks. PloS one (2013) 8 (6), e66506
[91] G. Petri, M. Scolamiero, I. Donato, et al., Networks and cycles: a persistent homology approach to complex networks. Proceedings of the European Conference on Complex Systems (2013), 93-99
Daniel Bennequin, Juan-Pablo Vigneaux, Olivier Peltre, Pierre Baudot and colleagues:
[92] D. Bennequin, G. Sergeant-Perthuis, O. Peltre, and J.P. Vigneaux, Extra-fine sheaves and interaction decompositions, (2020) arXiv:2009.12646
[93] O. Peltre, Message-Passing Algorithms and Homology, PhD Thesis (2020), arXiv:2009.11631
[94] G. Sergeant-Perthuis, Interaction decomposition for presheafs, (2020) arXiv:2008.09029
[95] G. Sergeant-Perthuis, Bayesian/Graphoid intersection property for factorisation models, (2019) arXiv:1903.06026
[96] J.P. Vigneaux, Topology of Statistical Systems: A Cohomological Approach to Information Theory, PhD Thesis (2019)
[97] J.P. Vigneaux, Information structures and their cohomology, Theory and Applications of Categories, Vol. 35 (2020), No. 38, pp. 1476-1529
[98] J.P. Vigneaux, Information theory with finite vector spaces, IEEE Transactions on Information Theory, vol. 65, no. 9, pp. 5674-5687, Sept. 2019
[99] Baudot P., Tapia M., Bennequin D., Goaillard J.M., Topological Information Data Analysis, Entropy (2019), 21(9), 869
[100] Baudot P., The Poincaré-Shannon Machine: Statistical Physics and Machine Learning Aspects of Information Cohomology, Entropy (2019), 21(9)
[101] Tapia M., Baudot P., et al., Neurotransmitter identity and electrophysiological phenotype are genetically coupled in midbrain dopaminergic neurons, Scientific Reports (2018), bioRxiv 168740
[102] Baudot P., Elements of qualitative cognition: an Information Topology Perspective, Physics of Life Reviews (2019), arXiv:1807.04520
[103] Baudot P., Bennequin D., The homological nature of entropy, Entropy (2015), 17, 166; doi:10.3390
[104] D. Bennequin, Remarks on Invariance in the Primary Visual Systems of Mammals, pp. 243-333, Neuromathematics of Vision, Lecture Notes in Morphogenesis, Springer, 2014
[105] Baudot P., Bennequin D., Information Topology I and II, Random models in Neuroscience (2012)
GeoSciInfo
Max WELLING
Informatics Institute, University of Amsterdam and Qualcomm Technologies
https://staff.fnwi.uva.nl/m.welling/
ELLIS Board Member (European Laboratory for Learning and Intelligent Systems: https://ellis.eu/)
Title: Exploring Quantum Statistics for Machine Learning
Abstract: Quantum mechanics represents a rather bizarre theory of statistics that is very different from the ordinary classical statistics that we are used to. In this talk I will explore if there are ways that we can leverage this theory in developing new machine learning tools: can we design better neural networks by thinking about entangled variables? Can we come up with better samplers by viewing them as observations in a quantum system? Can we generalize probability distributions? We hope to develop better algorithms that can be simulated efficiently on classical computers, but we will naturally also consider the possibility of much faster implementations on future quantum computers. Finally, I hope to discuss the role of symmetries in quantum theories.
Reference:
Roberto Bondesan, Max Welling, Quantum Deformed Neural Networks, arXiv:2010.11189v1 [quant-ph], 21 October 2020; https://arxiv.org/abs/2010.11189
Jean PETITOT
Directeur d'Études, Centre d'Analyse et de Mathématique Sociales, École des Hautes Études en Sciences Sociales, Paris.
Born in 1944, Jean Petitot is an applied mathematician and philosopher of science interested in dynamical modeling in neurocognitive sciences. He is the former director of the CREA (Applied Epistemology Research Center) at the École Polytechnique. http://jeanpetitot.com
Title: The primary visual cortex as a Cartan engine
Abstract: Cortical visual neurons detect very local geometric cues such as retinal positions, local contrasts, local orientations of boundaries, etc. One of the main theoretical problems of low-level vision is to understand how these local cues can be integrated so as to generate the global geometry of the perceived images, with all the well-known phenomena studied since Gestalt theory. It is empirically evident that the visual brain is able to perform a great many routines belonging to differential geometry. But how can such routines be neurally implemented? Neurons are "point-like" processors, and it seems impossible to do differential geometry with them. Since the 1990s, methods of in vivo optical imaging based on activity-dependent intrinsic signals have made it possible to visualize the highly specific connectivity of the primary visual areas, their "functional architectures." What we call "neurogeometry" is based on the discovery that these functional architectures implement structures such as the contact structure and the sub-Riemannian geometry of jet spaces of plane curves. For reasons of principle, it is the geometrical reformulation of differential calculus from Pfaff to Lie, Darboux, Frobenius, Cartan and Goursat that turns out to be suitable for neurogeometry.
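For readers unfamiliar with the jet-space structures mentioned above, here is the standard textbook formulation (general background, not a result specific to the talk):

```latex
% 1-jet space of plane curves y = f(x): coordinates (x, y, p) with p = f'(x).
% The functional architecture of V1 is modeled on J^1(\mathbb{R},\mathbb{R})
% equipped with its canonical contact form
\omega \;=\; dy - p\,dx ,
% whose kernel defines the field of contact planes
\mathcal{C}_{(x,y,p)} \;=\; \ker\omega \;=\;
  \operatorname{span}\{\, \partial_x + p\,\partial_y ,\; \partial_p \,\} .
% A lifted curve x \mapsto (x, f(x), f'(x)) is everywhere tangent to these
% planes (it is Legendrian), which encodes the integrability condition
% p = dy/dx; a sub-Riemannian metric on \ker\omega then yields the geodesic
% models used, e.g., for illusory contours.
```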
References:
- Agrachev, A., Barilari, D., Boscain, U., A Comprehensive Introduction to Sub-Riemannian Geometry, Cambridge University Press, 2020.
- Citti, G., Sarti, A., A cortical based model of perceptual completion in the roto-translation space, Journal of Mathematical Imaging and Vision, 24, 3 (2006) 307-326.
- Petitot, J., Neurogéométrie de la vision. Modèles mathématiques et physiques des architectures fonctionnelles, Les Éditions de l'École Polytechnique, Distribution Ellipses, Paris, 2008.
- Petitot, J., "Landmarks for neurogeometry", Neuromathematics of Vision (G. Citti, A. Sarti, eds), Springer, Berlin, Heidelberg, 1-85,
- Petitot, J., Elements of Neurogeometry. Functional Architectures of Vision, Lecture Notes in Morphogenesis, Springer, 2017.
- Prandi, D., Gauthier, J.P., A Semidiscrete Version of the Petitot Model as a Plausible Model for Anthropomorphic Image Reconstruction and Pattern Recognition, https://arxiv.org/abs/1704.03069v1, 2017.
Yvette KosmannSchwarzbach
Professeur des universités honoraire; former student of the École normale supérieure de Sèvres, 1960-1964; agrégation in mathematics, 1963; CNRS research associate, 1964-1969; doctorate in science, Lie derivatives of spinors, University of Paris, 1970, under the supervision of André Lichnerowicz; lecturer then professor at the University of Lille (1970-1976 and 1982-1993), at Brooklyn College, New York (1979-1982), and at the École Polytechnique (1993-2006)
Title: Structures of Poisson Geometry: old and new
Abstract: How did the brackets that Siméon-Denis Poisson introduced in 1809 evolve into the Poisson geometry of the 1970s? What are Poisson groups and, more generally, Poisson groupoids? In what sense does Dirac geometry generalize Poisson geometry, and why is it relevant for applications? I shall sketch the definition of these structures and try to answer these questions.
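For reference, the canonical bracket Poisson introduced and the axioms defining a general Poisson structure (standard definitions, not taken from the talk):

```latex
% Canonical Poisson bracket on \mathbb{R}^{2n} with coordinates (q_i, p_i):
\{f, g\} \;=\; \sum_{i=1}^{n}
  \left( \frac{\partial f}{\partial q_i}\frac{\partial g}{\partial p_i}
       - \frac{\partial f}{\partial p_i}\frac{\partial g}{\partial q_i} \right).
% A Poisson structure on a manifold M is a Lie bracket \{\cdot,\cdot\} on
% C^\infty(M) that is also a derivation in each argument (Leibniz rule):
\{f, gh\} \;=\; \{f, g\}\, h \;+\; g\, \{f, h\},
% together with the Jacobi identity
\{f,\{g,h\}\} + \{g,\{h,f\}\} + \{h,\{f,g\}\} \;=\; 0 .
```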
References
- P. Libermann and C.-M. Marle, Symplectic Geometry and Analytical Mechanics, D. Reidel Publishing Company (1987).
- J.E. Marsden and T.S. Ratiu, Introduction to Mechanics and Symmetry, Texts in Applied Mathematics 17, second edition, Springer (1998).
- C. Laurent-Gengoux, A. Pichereau, and P. Vanhaecke, Poisson Structures, Grundlehren der mathematischen Wissenschaften 347, Springer (2013).
- Y. Kosmann-Schwarzbach, Multiplicativity from Lie groups to generalized geometry, in Geometry of Jets and Fields (K. Grabowska et al., eds), Banach Center Publications 110, 2016.
- Special volume of LMP on Poisson Geometry (guest editors: Anton Alekseev, Alberto Cattaneo, Y. Kosmann-Schwarzbach, and Tudor Ratiu), Letters in Mathematical Physics 90, 2009.
- Y. Kosmann-Schwarzbach (ed.), Siméon-Denis Poisson : les Mathématiques au service de la science, Éditions de l'École Polytechnique (2013).
- Y. Kosmann-Schwarzbach, The Noether Theorems: Invariance and Conservation Laws in the Twentieth Century, translated by B.E. Schwarzbach, Sources and Studies in the History of Mathematics and Physical Sciences, Springer (2011).
Michel Broniatowski
Sorbonne Université, Paris
Title: Some insights on statistical divergences and choice of models
Abstract: Divergences between probability laws, or more generally between measures, define inferential criteria or risk functions. Their estimation makes it possible to address questions of model choice and statistical inference, in connection with the regularity of the models considered; depending on the nature of these models (parametric or semiparametric), the nature of the criteria and their estimation methods vary. Representations of these divergences as large deviation rates for specific empirical measures allow their estimation in nonparametric or semiparametric models, making use of information-theoretic results (Sanov's theorem and Gibbs principles) and of Monte Carlo methods. The question of the choice of divergence is wide open; an approach linking nonparametric Bayesian statistics and MAP estimators provides elements of understanding of the specificities of the various divergences in the Ali-Silvey-Csiszár-Arimoto class in relation to specific choices of the prior distributions.
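As a minimal illustration of the divergence class discussed above (a generic sketch with function names of our own choosing, not the speaker's code): every member of the Ali-Silvey-Csiszár-Arimoto class is generated by a convex function f with f(1) = 0:

```python
import math

def f_divergence(p, q, f):
    """Csiszar f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i)
    between strictly positive discrete distributions p and q."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Two classical generators, each convex with f(1) = 0:
kl = lambda t: t * math.log(t)       # Kullback-Leibler divergence
tv = lambda t: 0.5 * abs(t - 1.0)    # total variation distance

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

d_kl = f_divergence(p, q, kl)   # positive, and zero iff p == q
d_tv = f_divergence(p, q, tv)   # equals 0.5 * sum_i |p_i - q_i| = 0.1
```

Swapping the generator f changes the geometry induced on the space of models, which is precisely where the choice-of-divergence question of the abstract lives.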
References:
- Broniatowski, Michel; Stummer, Wolfgang, Some universal insights on divergences for statistics, machine learning and artificial intelligence, in Geometric Structures of Information, Signals Commun. Technol., Springer, Cham, pp. 149-211, 2019
- Broniatowski, Michel, Minimum divergence estimators, Maximum Likelihood and the generalized bootstrap, to appear in "Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems", Entropy, 2020
- Csiszár, Imre; Gassiat, Elisabeth, MEM pixel correlated solutions for generalized moment and interpolation problems, IEEE Trans. Inform. Theory 45, no. 7, 2253–2270, 1999
- Liese, Friedrich; Vajda, Igor, On divergences and informations in statistics and information theory, IEEE Trans. Inform. Theory 52, no. 10, 4394–4412, 2006
Maurice de Gosson
Professor, Senior Researcher at the University of Vienna https://homepage.univie.ac.at/maurice.de.gosson
Faculty of Mathematics, NuHAG group
Title: Gaussian states from a symplectic geometry point of view
Abstract: Gaussian states play a ubiquitous role in quantum information theory and in quantum optics because they are easy to manufacture in the laboratory and have, in addition, important extremality properties. Of particular interest are their separability properties. Even though major advances have been made in their study in recent years, the topic is still largely open. In this talk we will discuss separability questions for Gaussian states from a rigorous point of view using symplectic geometry, and present some new results and properties.
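For background, a standard formulation of the objects involved (textbook material, not a result from the talk): a centered Gaussian state is determined by its covariance matrix, and the quantum constraint on that matrix is symplectic in nature:

```latex
% Wigner function of a centered Gaussian state with covariance matrix \Sigma:
W(z) \;=\; \frac{1}{(2\pi)^n \sqrt{\det\Sigma}}\,
           e^{-\frac{1}{2} z^{T} \Sigma^{-1} z},
\qquad z = (x, p) \in \mathbb{R}^{2n}.
% \Sigma corresponds to a bona fide quantum state if and only if it satisfies
% the uncertainty principle in the form
\Sigma + \frac{i\hbar}{2}\, J \;\geq\; 0,
\qquad J = \begin{pmatrix} 0 & I_n \\ -I_n & 0 \end{pmatrix},
% a condition invariant under the symplectic group \mathrm{Sp}(2n,\mathbb{R}),
% which is why symplectic rotations are natural tools for separability.
```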
References:
- M. de Gosson, On the Disentanglement of Gaussian Quantum States by Symplectic Rotations, C.R. Acad. Sci. Paris, Volume 358, issue 4, 459-462 (2020)
- M. de Gosson, On Density Operators with Gaussian Weyl Symbols, in Advances in Microlocal and Time-Frequency Analysis, Springer (2020)
- M. de Gosson, Symplectic Coarse-Grained Classical and Semiclassical Evolution of Subsystems: New Theoretical Aspects, J. Math. Phys. no. 9, 092102 (2020)
- E. Cordero, M. de Gosson, and F. Nicola, On the Positivity of Trace Class Operators, Advances in Theoretical and Mathematical Physics 23(8), 2061–2091 (2019)
- E. Cordero, M. de Gosson, and F. Nicola, A characterization of modulation spaces by symplectic rotations, J. Funct. Anal. 278(11), 108474 (2020)
Giuseppe LONGO
Centre Cavaillès, CNRS & ENS Paris and School of Medicine, Tufts University, Boston http://www.di.ens.fr/users/longo/
Title: Use and abuse of "digital information" in life sciences: is Geometry of Information a way out?
Abstract: Since WWII, the war of coding, and the elucidation of the structure of DNA (1953), DNA has been considered as the digital encoding of the Aristotelian homunculus. To this day, DNA is viewed as the "information carrier" of ontogenesis, the main or unique player and pilot of phylogenesis. This has heavily affected our understanding of life and reinforced a mechanistic view of organisms and ecosystems, a component of our disruptive attitude towards ecosystemic dynamics. A different insight into DNA, as a major constraint on morphogenetic processes, brings in a possible "geometry of information" for biology, yet to be invented. One of the challenges lies in the need to move from a classical analysis of morphogenesis, in physical terms, to a "heterogenesis" more proper to the historicity of biology.
References
- Arezoo Islami, Giuseppe Longo, Marriages of Mathematics and Physics: a challenge for Biology, Invited Paper, in The Necessary Western Conjunction to the Eastern Philosophy of Exploring the Nature of Mind and Life (K. Matsuno et al., eds), Special Issue of Progress in Biophysics and Molecular Biology, Vol. 131, pp. 179-192, December 2017.
- Giuseppe Longo, How Future Depends on Past Histories and Rare Events in Systems of Life, Foundations of Science, 2017.
- Giuseppe Longo, Information and Causality: Mathematical Reflections on Cancer Biology, Organisms. Journal of Biological Sciences, vol. 2, n. 1, 2018.
- Giuseppe Longo, Information at the Threshold of Interpretation, Science as Human Construction of Sense, in Bertolaso, M., Sterpetti, F. (Eds.), A Critical Reflection on Automated Science – Will Science Remain Human?, Springer, Dordrecht, 2019.
- Giuseppe Longo, Matteo Mossio, Geocentrism vs genocentrism: theories without metaphors, metaphors without theories, Interdisciplinary Science Reviews, 45 (3), pp. 380-405, 2020.

Welcome to the “Geometric Science of Information” 2021 Conference
On behalf of both the organizing and the scientific committees, it is our great pleasure to welcome all delegates, representatives and participants from around the world to the fifth International SEE conference on “Geometric Science of Information” (GSI’21), scheduled for July 2021.
GSI’21 benefits from scientific and financial sponsors.
The 3day conference is also organized in the frame of the relations set up between SEE and scientific institutions or academic laboratories such as Ecole Polytechnique, Ecole des Mines ParisTech, INRIA, CentraleSupélec, Institut Mathématique de Bordeaux, Sony Computer Science Laboratories.
The GSI conference cycle was initiated by the Brillouin Seminar Team as early as 2009. The GSI’21 event follows on from the first initiative launched in 2013 (https://www.see.asso.fr/gsi2013) at Mines ParisTech, consolidated in 2015 (https://www.see.asso.fr/gsi2015) at Ecole Polytechnique, and opened to new communities in 2017 (https://www.see.asso.fr/gsi2017) at Mines ParisTech and in 2019 (https://www.see.asso.fr/gsi2019) at ENAC Toulouse. We also mention that in 2011 we organized an Indo-French workshop on “Matrix Information Geometry” that yielded an edited book in 2013, and that in 2017 we collaborated on the CIRM seminar in Luminy, TGSI’17 “Topological & Geometrical Structures of Information” (http://forum.csdc.org/category/94/tgsi2017). The GSI’19 proceedings were published by Springer in its Lecture Notes series (https://www.springer.com/gp/book/9783030269791).
GSI satellite events were organized in 2019 and 2020: FGSI’19 “Foundation of Geometric Science of Information” in Montpellier, and the Les Houches seminar SPIGL’20 “Joint Structures and Common Foundations of Statistical Physics, Information Geometry and Inference for Learning”.
The technical program of GSI’21 covers all the main topics and highlights in the domain of “Geometric Science of Information”, including information geometry manifolds of structured data/information and their advanced applications. These proceedings consist solely of original research papers that have been carefully peer-reviewed by two or three experts and revised before acceptance.
Historical background
As for GSI’13, GSI’15, GSI’17, and GSI’19, GSI’21 addresses interrelations between different mathematical domains like shape spaces (geometric statistics on manifolds and Lie groups, deformations in shape space, ...), probability/optimization & algorithms on manifolds (structured matrix manifolds, structured data/information, ...), relational and discrete metric spaces (graph metrics, distance geometry, relational analysis, ...), computational and Hessian information geometry, geometric structures in thermodynamics and statistical physics, algebraic/infinite-dimensional/Banach information manifolds, divergence geometry, tensor-valued morphology, optimal transport theory, manifold & topology learning, ..., and applications like geometries of audio processing, inverse problems and signal/image processing. GSI’21 topics were enriched with contributions from Lie Group Machine Learning, Harmonic Analysis on Lie Groups, Geometric Deep Learning, Geometry of Hamiltonian Monte Carlo, Geometric & (Poly)Symplectic Integrators, Contact Geometry & Hamiltonian Control, Geometric and structure-preserving discretizations, Probability Density Estimation & Sampling in High Dimension, Geometry of Graphs and Networks, and Geometry in Neuroscience & Cognitive Sciences.
At the turn of the century, new and fruitful interactions were discovered between several branches of science: Information Science (information theory, digital communications, statistical signal processing, ...), Mathematics (group theory, geometry and topology, probability, statistics, sheaf theory, ...) and Physics (geometric mechanics, thermodynamics, statistical physics, quantum mechanics, ...). The GSI conference cycle is an attempt to discover joint mathematical structures common to all these disciplines by elaborating a “General Theory of Information” embracing physics, information science, and cognitive science in a global scheme.
Frank Nielsen, co-chair: Ecole Polytechnique, Palaiseau, France, Sony Computer Science Laboratories, Tokyo, Japan
Frédéric Barbaresco, co-chair: President of SEE ISIC Club (Ingénierie des Systèmes d'Information et de Communications),
Representative of KTD PCC (Key Technology Domain / Processing, Computing & Cognition ) Board, THALES LAND & AIR SYSTEMS, France

As for GSI’13, GSI’15, GSI’17 and GSI’19, the objective of this SEE GSI’21 conference, hosted in Paris, is to bring together pure and applied mathematicians and engineers with a common interest in geometric tools and their applications for information analysis.
It emphasizes an active participation of young researchers to discuss emerging areas of collaborative research on “Geometric Science of Information and their Applications”.
Current and ongoing uses of Information Geometry Manifolds in applied mathematics are the following: Advanced Signal/Image/Video Processing, Complex Data Modeling and Analysis, Information Ranking and Retrieval, Coding, Cognitive Systems, Optimal Control, Statistics on Manifolds, Topology/Machine/Deep Learning, Artificial Intelligence, Speech/sound recognition, natural language treatment, Big Data Analytics, Learning for Robotics, etc., which are substantially relevant for industry.
The conference will therefore be held on topics of mutual interest, with the aim to:
• Provide an overview of the most recent state of the art
• Exchange mathematical information/knowledge/expertise in the area
• Identify research areas/applications for future collaboration
Provisional topics of interest:
- Geometric Deep Learning (ELLIS session)
- Probability on Riemannian Manifolds
- Optimization on Manifolds
- Shape Space & Statistics on nonlinear data
- Lie Group Machine Learning
- Harmonic Analysis on Lie Groups
- Statistical Manifold & Hessian Information Geometry
- Monotone Embedding in Information Geometry
- Nonparametric Information Geometry
- Computational Information Geometry
- Distance and Divergence Geometries
- Divergence Statistics
- Optimal Transport & Learning
- Geometry of Hamiltonian Monte Carlo
- Statistics, Information & Topology
- Graph Hyperbolic Embedding & Learning
- Inverse problems: Bayesian and Machine Learning interaction
- Integrable Systems & Information Geometry
- Geometric structures in thermodynamics and statistical physics
- Contact Geometry & Hamiltonian Control
- Geometric and structure-preserving discretizations
- Geometric & Symplectic Methods for Quantum Systems
- Geometry of Quantum States
- Geodesic Methods with Constraints
- Probability Density Estimation & Sampling in High Dimension
- Geometry of Tensor-Valued Data
- Geometric Mechanics
- Geometric Robotics & Learning
- Topological and geometrical structures in neurosciences
A special session will deal with:
- Geometric Structures Coding & Learning Libraries (geomstats, pyRiemann, POT, ...)
Advanced information on article submission and publication
As for previous editions, the GSI’21 proceedings will be published by Springer in the LNCS series (see the GSI’19 proceedings).
An 8-page Springer LNCS format is required for initial paper submission.
A detailed call for contributions will be published shortly.
CALL FOR PAPERS 
Ph.D. and Postdoc positions in Applied Mathematics
FriedrichAlexanderUniversität ErlangenNürnberg, Erlangen, Germany
Application deadline: January 10th, 2021
The Group
The Chair of Applied Analysis – Alexander von Humboldt Professorship at the Department of Mathematics of the FriedrichAlexanderUniversität ErlangenNürnberg (FAU), in Erlangen (Germany), led by Prof. Dr. Enrique Zuazua, is looking for outstanding candidates to fill several
Ph.D. and Postdoctoral positions
In the broad area of Applied Mathematics, the Chair develops and applies methods of Analysis, Computational Mathematics and Data Sciences to model, understand and control the dynamics of various phenomena arising at the interface of Mathematics with Engineering, Physics, Biology and Social Sciences.
We welcome applications from young and highly motivated scientists who wish to contribute to this exciting joint AvH-FAU effort. Possible research projects include but are not limited to:
- Analysis of Partial Differential Equations (PDE).
- The interplay between Data Sciences, numerics of PDE and Control Systems.
- Control of diffusion models arising in Biology and Social Sciences.
- Modelling and control of multi-agent systems.
- Hyperbolic models arising in traffic flow and energy transport.
- Waves in networks and Markov chains.
- Fractional PDE.
- Optimal design in Material Sciences.
- Micro-macro limit processes.
- The interplay between discrete and continuous modelling in design and control.
- The emergence of turnpike phenomena in long-time horizons.
- Inversion and parameter identification.
- Recommendation systems.
- Development of new computation tools and software.
We look for excellent candidates with expertise in the areas of applied mathematics, PDE analysis, control theory, numerical analysis, data sciences and computational mathematics who enjoy interdisciplinary work.
The Chair contributes to the development of a new Center of Research at FAU, in the broad area of “Mathematics of Data”, conceived as a highly visible interdisciplinary research site, an incubator for future collaborative research grants and a turntable for the key research priorities of FAU. The recruited candidates will have the added opportunity to participate in this challenging endeavour.
How to apply
Applications, including a cover/motivation letter, curriculum vitae, list of publications, statement of research, and the names of two or three experts for reference, should be submitted via email as a single PDF file to secretaryaa[at]math.fau.de before January 10th, 2021.
Any inquiries about the positions should be sent to Prof. Enrique Zuazua (positionsaa[at]math.fau.de). Applications will be accepted until the positions are filled.
FAU is a member of “The Family in Higher Education Institutions” best practice club and also aims to increase the number of women in scientific positions. Female candidates are therefore particularly encouraged to apply. In case of equal qualifications, candidates with disabilities will take precedence.
For more detailed information about the Chair, please visit Chair of Applied Analysis – Alexander von Humboldt Professorship

GeomLoss: Geometric Loss functions between sampled measures, images and volumes
Find all the docs and tutorials for version 0.2.3 on the Read the Docs website:
N.B.: This is still an alpha release! Please send me your feedback: I will polish the user interface, implement Hausdorff divergences, add support for meshes, images and volumes, and clean the documentation over the summer of 2020.
The GeomLoss library provides efficient GPU implementations for:
- Kernel norms (also known as Maximum Mean Discrepancies).
- Hausdorff divergences, which are positive definite generalizations of the ICP loss, analogous to log-likelihoods of Gaussian Mixture Models.
- Unbiased Sinkhorn divergences, which are cheap yet positive definite approximations of Optimal Transport (Wasserstein) costs.
These loss functions, defined between positive measures, are available through the custom PyTorch layers SamplesLoss, ImagesLoss and VolumesLoss which allow you to work with weighted point clouds (of any dimension), density maps and volumetric segmentation masks. Geometric losses come with three backends each:

- A simple tensorized implementation, for small problems (< 5,000 samples).
- A reference online implementation, with a linear (instead of quadratic) memory footprint, that can be used for finely sampled measures.
- A very fast multiscale code, which uses an octree-like structure for large-scale problems in dimension <= 3.
GeomLoss is a simple interface for cutting-edge Optimal Transport algorithms. It provides:
- Support for batchwise computations.
- Linear (instead of quadratic) memory footprint for large problems, relying on the KeOps library for map-reduce operations on the GPU.
- Fast kernel truncation for small bandwidths, using an octree-based structure.
- Log-domain stabilization of the Sinkhorn iterations, eliminating numeric overflows for small values of 𝜀.
- Efficient computation of the gradients, which bypasses the naive backpropagation algorithm.
- Support for unbalanced Optimal Transport, with a softening of the marginal constraints through a maximum reach parameter.
- Support for the ε-scaling heuristic in the Sinkhorn loop, with kernel truncation in dimensions 1, 2 and 3. On typical 3D problems, our implementation is 50-100 times faster than the standard SoftAssign/Sinkhorn algorithm.
Note, however, that SamplesLoss does not implement the Fast Multipole or Fast Gauss transforms. If you are aware of a well-packaged implementation of these algorithms on the GPU, please contact me!
The divergences implemented here are all symmetric, positive definite and therefore suitable for measure-fitting applications. For positive input measures 𝛼 and 𝛽, our Loss functions are such that
Loss(𝛼,𝛽) = Loss(𝛽,𝛼),
0 = Loss(𝛼,𝛼) ⩽ Loss(𝛼,𝛽),
0 = Loss(𝛼,𝛽) ⟺ 𝛼=𝛽.
GeomLoss can be used in a wide variety of settings, from shape analysis (LDDMM, optimal transport…) to machine learning (kernel methods, GANs…) and image processing. Details and examples are provided below:
GeomLoss is licensed under the MIT license.
Author and Contributors
Feel free to contact us for any bug report or feature request:
- Jean Feydy
- Pierre Roussillon (extensions to brain tractograms and normal cycles)
Related projects
You may be interested in:
- The KeOps library, which provides efficient CUDA routines for point cloud processing, with full PyTorch support.
- Rémi Flamary and Nicolas Courty’s Python Optimal Transport library, which provides a reference implementation of OT-related methods for small problems.
- Bernhard Schmitzer’s Optimal Transport toolbox, which provides a reference multiscale solver for the OT problem, on the CPU.
