
GeoSciInfo
"Energy and entropy concepts to characterize and understand brain activity"
Organizers: Alain Destexhe, Jennifer Goldman, Mavi Sanchez-Vives, Pier Stanislao Paolucci, Chiara De Luca, Cristiano Capone, Trang-Anh Nghiem
Abstract: Brain circuits in vivo or in vitro display a wide variety of activity states, and they may also respond to external inputs differently in each state. The goal of this workshop is to evaluate and compare tools for understanding both the activity state and its responsiveness, based on energy and entropy concepts commonly used in physics. We will review different ways of defining relevant measures of energy and entropy, how they apply to the analysis of brain activity, including brain states and cognitive tasks, and what useful information can be extracted to better understand the system. We will also evaluate the opportunity of writing a collaborative paper reviewing the different techniques and their usefulness.
List of confirmed speakers:
- Pierre Baudot (U Marseille)
- Alessandra Camassa (U Barcelona)
- Cristiano Capone (INFN, Rome)
- Chiara Cirelli (U Wisconsin)
- Chiara De Luca (INFN, Rome)
- Athena Demertzi (Université de Liège)
- Alain Destexhe (CNRS)
- Adrienne Fairhall (U Washington)
- Karl Friston (UCL)
- Jennifer Goldman (CNRS)
- Viktor Jirsa (AMU)
- Daniele Marinazzo (Ghent U)
- Olivier Marre (INSERM)
- Maurizio Mattia (ISN Roma)
- Thierry Mora (ENS)
- Trang-Anh Nghiem (ENS)
- Pier Stanislao Paolucci (U Roma)
- Antonio Pazienti (ISN Roma)
- Mavi Sanchez-Vives (U Barcelona)
- Simone Sarasso (U Milan)
- Elad Schneidman (Weizmann Inst)
Updated program:
Day 1 - May 5
Session 1: Theoretical aspects of energy and entropy I
- 13h: general intro (Alain) and tribute to Paolo Del Giudice
- 13h15-13h40: Alain Destexhe (CNRS, Paris-Saclay U): "Energy- and Entropy-based methods to characterize brain states" VIDEO
- 13h40-14h05: Chiara De Luca (INFN, Rome): "Deep-Sleep Memory Density-based Clustering: Theoretical Framework and Energetic/Entropic Effects on Awake Activity"
- 14h05-14h30: Thierry Mora (ENS, Paris): "Sensory adaptation and entropy production"
- 14h30-14h40: general discussion
- 14h40-15h: coffee break
Session 2: Applications to neural data I
- 15h-15h25: Simone Sarasso (U Milan): "Assessing brain states through complexity: converging empirical evidence"
- 15h25-15h50: Athena Demertzi (U Liège): "Cerebral configuration in states of reduced reportability"
- 15h50-16h15: Mavi Sanchez-Vives (IDIBAPS, Barcelona): "Cellular and synaptic contributions to cortical complexity in different brain states"
- 16h15-16h40: Chiara Cirelli (U Wisconsin): "Sleep and synaptic down-selection" VIDEO
- 16h40-16h50: general discussion
- 16h50-17h10: coffee break
Session 3: Theoretical aspects of energy and entropy II
- 17h10-17h35: Daniele Marinazzo (Ghent U): "Beyond pairwise interactions: expanding the transfer entropy and the mutual information" VIDEO
- 17h35-18h: Trang-Anh Nghiem (ENS, Paris): "Unveiling the correlation structure supporting brain states and neural codes with maximum entropy models" VIDEO
- 18h-18h25: Jennifer Goldman (CNRS, Paris-Saclay U): "A scale-integrated approach to brain states: from single-neuron biophysics to macroscopic neural dynamics" VIDEO
- 18h30-18h40: general discussion
Day 2 - May 6
Session 4: Theoretical aspects of energy and entropy III
- 13h-13h25: Elad Schneidman (Weizmann Inst): "Learning the code of large neural populations by sparse nonlinear random projections"
- 13h25-13h50: Pierre Baudot (AMU Marseille): "Topological information and higher-order statistical structures: mathematical foundations of information and deep learning" VIDEO
- 13h50-14h15: Maurizio Mattia & Antonio Pazienti (ISN Roma): "Insights in the transition from slow-wave activity to wakefulness using an entropy-based index" VIDEO
- 14h15-14h25: general discussion
- 14h25-14h45: coffee break
Session 5: Applications to neural data II
- 14h45-15h10: Olivier Marre (INSERM, Paris): "Maximum entropy models in the retina"
- 15h10-15h35: Cristiano Capone (INFN, Rome): "Simulations approaching data: cortical slow waves in inferred models of the whole hemisphere of the mouse" VIDEO
- 15h35-16h: Alessandra Camassa (IDIBAPS, Barcelona): "Energy-based hierarchical clustering of cortical slow waves in multi-electrode recordings" VIDEO
- 16h-16h10: general discussion
- 16h10-16h30: coffee break
Session 6: Theoretical aspects of energy and entropy IV

Topological Data Analysis and Information Theory (online)
Lecture Series
OFFICIAL WEBSITE
It is our pleasure to invite you to attend the lecture series on Topological Data Analysis and Information Theory, organised by IAS fellow Fernando Nobrega Santos and Rick Quax.
REGISTRATION
Event details of Topological Data Analysis and Information Theory (online)
Date: Tuesday 29th June 2021 and Monday 5th July 2021
Time: 14:00-17:30
Organised by Fernando Nobrega Santos and Rick Quax
High-order interactions are interactions that go beyond a sequence of pairwise interactions. Multiple approaches exist that aim to detect and quantify qualitatively different high-order interactions. Two of the most prominent approaches are topological data analysis (TDA) and information theory (IT). Central questions addressed in this lecture series are: what do these two approaches have in common? How can they complement each other? And what could they bring to application domains, especially in neuroscience?
Programme
Tuesday 29 June 2021
- 14:00-14:10 Opening by IAS
- 14:10-15:10 Lecture by Herbert Edelsbrunner: TDA for information-theoretically motivated distances
- 15:10-15:20 Break
- 15:20-16:20 Lecture by Giovanni Petri: Social contagion and norm emergence on simplicial complexes and hypergraphs
- 16:20-16:30 Break
- 16:30-17:30 Lecture by Chad Giusti: A brief introduction to topological neuroscience
Monday 5 July 2021
- 14:00-14:10 Opening by IAS
- 14:10-15:10 Lecture by Rick Quax (UvA - IAS): Brief introduction to information theory and the concept(s) of synergy
- 15:10-15:20 Break
- 15:20-16:20 Lecture by Fernando Rosas (Imperial College, UK): Towards a deeper understanding of high-order interdependencies in complex systems
- 16:20-16:30 Break
- 16:30-17:30 Lecture by Pierre Baudot (Median Technologies, France): Information is Topology
Each lecture will be 50 min, followed by Q&A. To participate, register below.
Tuesday 29 June 2021
- First lecture. Title: TDA for information-theoretically motivated distances
Speaker: Herbert Edelsbrunner (IST Austria) VIDEO
Abstract: Given a finite set in a metric space, topological data analysis assesses its multi-scale connectivity, quantified in terms of a 1-parameter family of homology groups. Going beyond metrics, we show that the basic tools of topological data analysis also apply when we measure dissimilarity with Bregman divergences. A particularly interesting case is the relative entropy, whose infinitesimal version is known as the Fisher information metric. It relates to the Euclidean metric on the sphere and, perhaps surprisingly, the discrete Morse properties of random data behave the same as in Euclidean space.
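As a toy illustration of the idea in this abstract, the sketch below replaces the full persistence machinery with the simplest 0-dimensional case: points are probability vectors, dissimilarity is relative entropy (a Bregman divergence), and we count the connected components of the graph linking pairs within divergence eps. All function and variable names are illustrative assumptions, not the speaker's code.

```python
import numpy as np

def kl(p, q):
    """Relative entropy D(p||q) in bits; assumes q > 0 wherever p > 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def components_at_scale(points, eps):
    """Connected components when i and j are linked iff KL(p_i||p_j) <= eps."""
    n = len(points)
    parent = list(range(n))

    def find(i):  # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        for j in range(n):
            if i != j and kl(points[i], points[j]) <= eps:
                parent[find(i)] = find(j)
    return len({find(i) for i in range(n)})

# Two tight clusters of Bernoulli distributions:
points = [np.array(v) for v in
          ([0.9, 0.1], [0.85, 0.15], [0.1, 0.9], [0.15, 0.85])]
```

At a small scale (eps = 0.1) the two clusters remain separate components; at a large scale (eps = 3) everything merges into one, which is the 0-dimensional shadow of a persistence computation.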
Short bio: Herbert Edelsbrunner graduated in 1982 from the Graz University of Technology. He worked in Austria from 1982 to 1985, in Illinois from 1985 to 1999, and in North Carolina from 1999 to 2012, before joining IST Austria in 2009. He received the Waterman Award from the NSF in 1991 and the Wittgenstein Prize from the FWF in 2018. He is a member of the Academies of Sciences in the US, Germany, and Austria. His primary research area is computational geometry and topology. http://pub.ist.ac.at/~edels/
- Second lecture. Title: Social contagion and norm emergence on simplicial complexes and hypergraphs
Speaker: Giovanni Petri (ISI Foundation, Italy)
Abstract: Complex networks have been successfully used to describe dynamical processes of social and biological importance. Two classic examples are the spread of diseases and the emergence of shared norms in populations of networked interacting individuals. However, pairwise interactions are often not enough to fully characterize contagion or coordination processes, where influence and reinforcement are at work. Here we present recent results on the higher-order generalization of the SIS process and of the naming game. First, we numerically show that a higher-order contagion model displays novel phenomena, such as a discontinuous transition induced by higher-order interactions. We show analytically that the transition is discontinuous and that a bistable region appears where healthy and endemic states coexist. Our results help explain why critical masses are required to initiate social changes and contribute to the understanding of higher-order interactions in complex systems. We then turn to the naming game as a prototypical example of norm emergence and show that higher-order interactions can create interesting novel phenomenologies; for example, they can explain how, when communication among agents is inefficient, even very small committed minorities are able to bring the system to a tipping point and flip the majority. We conclude with an outlook on higher-order models, posing new questions and paving the way for modeling dynamical processes on these networks.
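A minimal sketch of the higher-order contagion mechanism described above (the update rule, parameter names, and substrate are simplifying assumptions for illustration, not the talk's actual model): infection spreads along edges with probability beta1 per contact, and along 2-simplices with probability beta2 when both other vertices of the triangle are infected.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

def simplicial_sis(n, edges, triangles, beta1, beta2, mu, steps=50, rho0=0.4):
    """One stochastic run; returns the final fraction of infected nodes."""
    infected = rng.random(n) < rho0
    for _ in range(steps):
        new = infected.copy()
        for i, j in edges:                     # pairwise (edge) contagion
            if infected[i] and rng.random() < beta1:
                new[j] = True
            if infected[j] and rng.random() < beta1:
                new[i] = True
        for i, j, k in triangles:              # 2-simplex contagion: a node can
            for a, b, c in ((i, j, k), (j, k, i), (k, i, j)):
                if infected[b] and infected[c] and rng.random() < beta2:
                    new[a] = True              # be infected when BOTH partners are
        # recovery (a modeling choice: applied last, to nodes infected at step start)
        new[infected & (rng.random(n) < mu)] = False
        infected = new
    return float(infected.mean())

# Small complete simplicial complex as a toy substrate
n = 15
edges = list(combinations(range(n), 2))
triangles = list(combinations(range(n), 3))
rho = simplicial_sis(n, edges, triangles, beta1=0.05, beta2=0.05, mu=0.2)
```

Sweeping beta2 at fixed beta1 in such a sketch is how one would probe the discontinuous transition and bistable region the abstract refers to.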
Short Bio: Giovanni Petri is a Senior Research Scientist at ISI Foundation in Italy, working on topological approaches to complex networks and their underlying geometry, with special attention to the topology of brain structure and dynamics.
- Third lecture. Title: A brief introduction to topological neuroscience
Speaker: Chad Giusti (University of Delaware, USA)
Abstract: Algebraic topology has the potential to become a fundamental tool in theoretical neuroscience, building on the foundations laid by network neuroscience, natively incorporating higher-order structure, and offering a rich mathematical toolkit for describing qualitative structure in systems. In this talk I will briefly survey how topological methods have been applied to problems in neuroscience, then discuss current directions and a few major challenges I see for the field.
Monday 5 July 2021
- First lecture. Title: Brief introduction to information theory and the concept(s) of synergy
Speaker: Rick Quax (UvA - IAS) VIDEO
Abstract: Quantifying synergy among stochastic variables is an important open problem in information theory. Information synergy occurs when multiple sources together predict an outcome variable better than the sum of single-source predictions. It is an essential phenomenon in biology, such as in neuronal networks and cellular regulatory processes, where different information flows integrate to produce a single response, but also in social cooperation processes as well as in statistical inference tasks in machine learning. Here we propose a metric of synergistic entropy and synergistic information from first principles. Rick will start with a brief, need-to-know introduction of key information-theoretic notions (entropy, mutual information) and then move on to introducing the concept of synergistic information. He will highlight a few intuitive ways of trying to quantify synergistic information that exist today, including PID, a geometric approach, and his own proposed method.
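The XOR gate is the textbook example of the synergy described above: each input alone carries zero information about the output, yet the two inputs together determine it completely. A self-contained sketch (a plain-Python estimator written for this digest, not the speaker's code):

```python
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy in bits of a distribution {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def mutual_info(joint):
    """I(X;Y) from a joint distribution keyed by (x, y) pairs."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return entropy(px) + entropy(py) - entropy(joint)

# XOR gate with uniform inputs: y = x1 ^ x2
joint = {((x1, x2), x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}
i_both = mutual_info(joint)       # both inputs together: 1 bit about y

# Marginalize out x2 to get the single-source joint over (x1, y)
single = {}
for ((x1, x2), y), p in joint.items():
    single[(x1, y)] = single.get((x1, y), 0.0) + p
i_one = mutual_info(single)       # one input alone: 0 bits about y
```

The whole (1 bit) strictly exceeds the sum of the parts (0 + 0 bits), which is exactly the signature that measures like PID try to decompose.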
Short bio: Rick’s ambition is to study Complex Adaptive Systems with a focus on emergent information processing in dynamical systems. He is trying to span the spectrum from theoretical foundations to application domains, ensuring that new theory insights have direct impact on applicationoriented research and vice versa. He is currently Assistant Professor in the Computational Science Lab at the University of Amsterdam and member of IAS.
- Second lecture. Title: Towards a deeper understanding of high-order interdependencies in complex systems
Speaker: Fernando Rosas (Imperial College, UK) VIDEO
Abstract: We live in an increasingly interconnected world and, unfortunately, our understanding of interdependency is still rather limited. As a matter of fact, while bivariate relationships are at the core of most of our data analysis methods, there is still no principled theory to account for the different types of interactions that can occur between three or more variables. This talk explores the vast and largely unexplored territory of multivariate complexity, and discusses information-theoretic approaches that have been recently introduced to fill this important knowledge gap.
The first part of the talk is devoted to synergistic phenomena, which correspond to statistical regularities that affect the whole but not the parts. We explain how synergy can be effectively captured by information-theoretic measures inspired by the nature of higher brain functions, and how these measures allow us to map complex interdependencies into hypergraphs. The second part of the talk focuses on a new theory of what constitutes causal emergence and how it can be measured from time series data. This theory enables a formal, quantitative account of downward causation, and introduces "causal decoupling" as a complementary modality of emergence. Importantly, this not only establishes conceptual tools to frame conjectures about emergence rigorously, but also provides practical procedures to test them on data. We illustrate the considered analysis tools on different case studies, including cellular automata, baroque music, flocking models, and neuroimaging datasets.
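One concrete measure in the spirit of this abstract is the O-information introduced by Rosas and collaborators: it is negative when a system is synergy-dominated and positive when it is redundancy-dominated. The sketch below is an illustrative implementation for small discrete distributions (names and test cases are ours, not the speaker's code), using the identity Omega = (n-2) H(X) + sum_i [H(X_i) - H(X_{-i})].

```python
from itertools import product
from math import log2

def H(dist):
    """Shannon entropy in bits of a distribution {outcome_tuple: prob}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, keep):
    """Marginalize a joint distribution onto the index positions in `keep`."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

def o_information(joint, n):
    """Omega = (n-2) H(X) + sum_i [H(X_i) - H(X_{-i})]."""
    omega = (n - 2) * H(joint)
    for i in range(n):
        rest = tuple(j for j in range(n) if j != i)
        omega += H(marginal(joint, (i,))) - H(marginal(joint, rest))
    return omega

# XOR triple (purely synergistic): Omega = -1 bit
xor = {(a, b, a ^ b): 0.25 for a, b in product((0, 1), repeat=2)}
# Triple copy (purely redundant): Omega = +1 bit
copy3 = {(a, a, a): 0.5 for a in (0, 1)}
```

The two toy systems sit at the opposite ends of the synergy-redundancy axis that the talk's hypergraph mappings are built on.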
Short Bio: Fernando Rosas received the B.A. degree in music composition and a minor degree in philosophy (2002), the B.Sc. degree in mathematics (2006), and the M.S. and Ph.D. degrees in engineering sciences from the Pontificia Universidad Católica de Chile (PUC, 2012). He worked as a postdoctoral researcher at KU Leuven (Belgium), National Taiwan University (Taiwan), and Imperial College London (UK). He received the "Academic Award" given by the Department of Mathematics of the PUC for the best academic performance of his cohort, and was the recipient of a CONICYT Doctoral Fellowship from the Chilean Ministry of Education (2008), an "F+" Scholarship from KU Leuven (2014), and a Marie Skłodowska-Curie Individual Fellowship from the European Union (2017). He is currently working as a Postdoctoral Researcher at the Data Science Institute and the Centre for Psychedelic Research at Imperial College London. His research interests lie at the interface between data science & AI, complexity science, cognitive science, and neuroscience.
- Third lecture. Title: Information is Topology
Speaker: Pierre Baudot (Median Technologies, France) VIDEO
Abstract: Information theory, probability and statistical dependencies, and algebraic topology provide different views of a unified theory still in development, where uncertainty goes as deep as Galois's theory of ambiguity, toposes and motives. I will review some foundations, led notably by Bennequin and Vigneaux, that uniquely characterize entropy as the first group of cohomology on random-variable complexes and probability laws. This framework allows one to retrieve most of the usual information functions, like the KL divergence, cross entropy, Tsallis entropies, and differential entropy, in settings of varying generality. Multivariate interaction/mutual information terms (I_k and J_k) appear as coboundaries, and their negative minima, also called synergy, correspond to homotopical link configurations which, like Borromean links, illustrate what purely collective interactions or emergence can be. These functions refine and characterize statistical independence in the multivariate case, in the sense that (X1,...,Xn) are independent iff all the I_k = 0 (with 1 < k < n+1, whereas for the total correlations G_k it is sufficient that G_n = 0), generalizing the correlation coefficient. Concerning data analysis, restricting to the simplicial random-variable subcase, applying the formalism to genetic transcription or to some classical benchmark datasets using the open-access infotopo library reveals that higher statistical interactions are not only omnipresent but also constitutive of biologically relevant assemblies. On the machine learning side, information cohomology provides a topological and combinatorial formalization of deep networks' supervised and unsupervised learning, where the depth of the layers is the simplicial dimension and derivation-propagation is forward (cohomological).
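The independence criterion and the "Borromean" interpretation quoted above can be checked directly in small discrete cases. The sketch below (illustrative code written for this digest, not the infotopo library) computes the interaction information I_3 by inclusion-exclusion over entropies: it is negative for the XOR triple, which is pairwise independent yet jointly dependent, and zero for a fully independent triple.

```python
from itertools import product
from math import log2

def H(dist):
    """Shannon entropy in bits of a distribution {outcome_tuple: prob}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, keep):
    """Marginalize a joint distribution onto the index positions in `keep`."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

def interaction_info3(joint):
    """I_3 = sum_i H(X_i) - sum_{i<j} H(X_i, X_j) + H(X_1, X_2, X_3)."""
    singles = sum(H(marginal(joint, (i,))) for i in range(3))
    pairs = sum(H(marginal(joint, pr)) for pr in ((0, 1), (0, 2), (1, 2)))
    return singles - pairs + H(joint)

# Borromean-like XOR triple: pairwise independent, jointly dependent.
xor = {(a, b, a ^ b): 0.25 for a, b in product((0, 1), repeat=2)}
# Fully independent uniform triple: all interaction terms vanish.
indep = {(a, b, c): 0.125 for a, b, c in product((0, 1), repeat=3)}
```

I_3(xor) = -1 bit is the "negative minimum" the abstract identifies with synergy, while I_3 = 0 for the independent triple matches the independence characterization.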
Short bio: Pierre Baudot graduated in 1998 from the École Normale Supérieure (Ulm) magister of biology, and completed his PhD in the electrophysiology of visual perception, studying learning and information coding in natural conditions. He began developing information topology with Daniel Bennequin at the Complex Systems Institute and the Mathematical Institute of Jussieu from 2006 to 2013, and then at the Max Planck Institute for Mathematics in the Sciences in Leipzig. He then joined Inserm in Marseille to develop data applications, notably to transcriptomics. Since 2018, he has worked at Median Technologies, a medical imaging AI company, on detecting and predicting cancers from CT scans. He received the K2 trophy (mathematics and applications, 2017) and the best entropy paper prize 2019 for his contributions to topological information data analysis.

Toposes online
ORIGINAL WEBPAGE  OFFICIAL WEBSITE
Topos theory can be regarded as a unifying subject within mathematics; in the words of Grothendieck, who invented the concept of topos: "It is the theme of toposes which is this 'bed', or this 'deep river', in which come to be married geometry and algebra, topology and arithmetic, mathematical logic and category theory, the world of the continuous and that of 'discontinuous' or 'discrete' structures. It is what I have conceived of as most broad, capable of perceiving with finesse, through the same language rich in geometric resonances, an 'essence' common to situations most distant from each other."
The event "Toposes online" represents the third edition of the main international conference on topos theory, following the previous ones "Topos à l’IHES" and "Toposes in Como".
The format of the event is the same as that of the other two editions: it will consist of a three-day school, offering introductory courses for the benefit of students and mathematicians who are not already familiar with topos theory, followed by a three-day congress featuring both invited and contributed presentations on new theoretical advances in the subject as well as applications of toposes in different fields such as algebra, topology, number theory, algebraic geometry, logic, homotopy theory, functional analysis, and computer science.
The main aim of this conference series is to celebrate the unifying power and interdisciplinary applications of toposes and encourage further developments in this spirit, by promoting exchanges amongst researchers in different branches of mathematics who use toposes in their work and by introducing a new generation of scholars to the subject.
Because of the pandemic, this edition of the conference will take place entirely online. The participants may take advantage of the associated forum to discuss with each other (please register to it if you wish to post messages).
School lecturers
- Olivia Caramello (University of Insubria and IHES)
- Laurent Lafforgue (IHES)
- Charles Rezk (University of Illinois)
Invited speakers
- Samson Abramsky (University of Oxford)
- Jean-Claude Belfiore (Huawei)
- Daniel Bennequin (University of Paris 7)
- Dustin Clausen (University of Copenhagen)
- Jens Hemelaer (University of Antwerp)
- Luca Prelli (University of Padua)
- Peter Scholze (University of Bonn)
- Ivan Tomasic (Queen Mary University of London)
Scientific and Organizing Committee
- Olivia Caramello
- Alain Connes
- Laurent Lafforgue
Sponsors
We gratefully acknowledge IHES and the University of Insubria for their support; in particular, the videos of "Toposes online" will be made available through the IHES YouTube channel.
Programme
Minicourses:
- Olivia Caramello: "Introduction to sheaves, stacks and relative toposes" VIDEO 1, VIDEO 2, VIDEO 3, VIDEO 4
Abstract: This course provides a geometric introduction to (relative) topos theory.
The first part of the course will describe the basic theory of sheaves on a site, the main structural properties of Grothendieck toposes and the way in which morphisms between toposes are induced by suitable kinds of functors between sites.
The second part, based on joint work with Riccardo Zanfa, will present an approach to relative topos theory (i.e. topos theory over an arbitrary base topos) based on stacks and a suitable notion of relative site.
- Laurent Lafforgue: "Classifying toposes of geometric theories" VIDEO 1, VIDEO 2, VIDEO 3, VIDEO 4
Abstract: The purpose of these lectures will be to present the theory of classifying toposes of geometric theories. This theory was developed in the 1970s by Lawvere, Makkai, Reyes, Joyal and other category theorists, systematising some constructions of Grothendieck and his student Monique Hakim, but it still deserves to be much better known than it actually is.
The last part of the lectures will present new developments due to Olivia Caramello which, based on her principle of "toposes as bridges", make the theory of classifying toposes more applicable to concrete mathematical situations: in particular, the equivalence between geometric provability and computing on Grothendieck topologies, and general criteria for a theory to be of presheaf type.
- Charles Rezk: "Higher Topos Theory" VIDEO 1, VIDEO 2, VIDEO 3, VIDEO 4
Abstract: In this series of lectures I will give an introduction to the concept of "infinity topoi", an analog of the notion of a "Grothendieck topos" which is not an ordinary category but rather an "infinity category".
No prior knowledge of higher category theory will be assumed.
Invited talks:
- Samson Abramsky: "The sheaf-theoretic structure of contextuality and nonlocality" VIDEO
Abstract: Quantum mechanics implies a fundamentally nonclassical picture of the physical world. This nonclassicality is expressed in its sharpest form in the phenomena of nonlocality and contextuality, articulated in the Bell and Kochen-Specker theorems. Apart from the foundational significance of these ideas, they play a central role in the emerging field of quantum computing and information processing, where these nonclassical features of quantum mechanics are used to obtain quantum advantage over classical computational models. The mathematical structure of contextuality, with nonlocality as a special case, is fundamentally sheaf-theoretic. The non-existence of classical explanations for quantum phenomena corresponds precisely to the non-existence of certain global sections. This leads to both logical and topological descriptions of these phenomena, very much in the spirit of topos theory.
This allows the standard constructions which witness these results, such as Kochen-Specker paradoxes, the GHZ construction, Hardy paradoxes, etc., to be visualised as discrete bundles. The nonclassicality appears as a logical twisting of these bundles, related to classical logical paradoxes, and witnessed by the non-vanishing of cohomological sheaf invariants. In this setting, a general account can be given of Bell inequalities in terms of logical consistency conditions. A notion of simulation between different experimental situations yields a category of empirical models, which can be used to classify the expressive power of contextuality as a resource. Both quantitative and qualitative, and discrete and continuous, features arise naturally.
- Jean-Claude Belfiore: "Beyond the statistical perspective on deep learning, the toposic point of view: Invariance and semantic information" (joint work with Daniel Bennequin) VIDEO
Abstract: The last decade has witnessed an experimental revolution in data science and machine learning, essentially based on two ingredients: representation (or feature) learning and backpropagation. Moreover, the analysis of the behavior of deep learning is essentially done through the prism of probabilities. As long as artificial neural networks only capture statistical correlations between data and the tasks/questions that have to be performed/answered, this analysis may be enough. Unfortunately, when we aim at designing neural networks that behave more like animal or even human brains, statistics is not enough and we need to perform another type of analysis. By introducing languages and theories into this framework, we will show that the problem of learning is, first, a problem of adequacy between data and the theories that are expressed. This adequacy will be rephrased in terms of toposes. We will unveil the relation between so-called "generalization" and a stack that models this adequacy between data and the tasks.
Finally, a five-level perspective of learning with neural networks will be given, based on the architecture (base site), a presemantic (fibration), languages, theories and the notion of semantic information.
- Daniel Bennequin: "Topos, stacks, semantic information and artificial neural networks" (joint work with Jean-Claude Belfiore) VIDEO
Abstract: Every known artificial deep neural network (DNN) corresponds to an object in a canonical Grothendieck topos; its learning dynamic corresponds to a flow of morphisms in this topos. Invariance structures in the layers (like CNNs or LSTMs) correspond to Giraud's stacks. This invariance is supposed to be responsible for the generalization property, that is, extrapolation from learning data under constraints. The fibers represent presemantic categories (Culioli, Thom), over which artificial languages are defined, with internal logics: intuitionist, classical or linear (Girard). The semantic functioning of a network is its ability to express theories in such a language for answering questions in output about input data. Quantities and spaces of semantic information are defined by analogy with the homological interpretation of Shannon's entropy (P. Baudot and D.B.). They generalize the measures found by Carnap and Bar-Hillel (1952). Amazingly, the above semantical structures are classified by geometric fibrant objects in a closed model category of Quillen; they then give rise to homotopical invariants of DNNs and of their semantic functioning. Intensional type theories (Martin-Löf) organize these objects and the fibrations between them. Information contents and exchanges are analyzed by Grothendieck's derivators.
- Dustin Clausen: "Toposes generated by compact projectives, and the example of condensed sets" VIDEO
Abstract: The simplest kind of Grothendieck topology is the one with only trivial covering sieves, where the associated topos is equal to the presheaf topos. The next simplest topology has coverings given by finite disjoint unions. From an intrinsic perspective, the toposes which arise from such a topology are exactly those which, as a category, have the useful property that they are generated by compact projective objects. I will discuss some general aspects of this situation, then specialize to a specific example, that of condensed sets. This is joint work with Peter Scholze.
- Jens Hemelaer: "Toposes of presheaves on monoids as generalized topological spaces" VIDEO
Abstract: Various ideas from topology have been generalized to toposes, for example surjections and inclusions, local homeomorphisms, or the fundamental group. Another interesting concept, less well known, is the notion of a complete spread, brought from topology to topos theory by Bunge and Funk. We will discuss these concepts in the special case of toposes of presheaves on monoids. The aim is to gain geometric intuition about things that are usually thought of as algebraic.
Special attention will go to the underlying topos of the Arithmetic Site of Connes and Consani, corresponding to the monoid of nonzero natural numbers under multiplication. The topological concepts mentioned earlier will be illustrated using this topos and some of its generalizations corresponding to maximal orders. The talk will be based on joint work with Morgan Rogers and joint work with Aurélien Sagnier.
- Luca Prelli: "Sheaves on T-topologies" VIDEO
Abstract: Let T be a suitable family of open subsets of a topological space X stable under unions and intersections. Starting from T we construct a (Grothendieck) topology on X and we consider the associated category of sheaves. This gives a unifying description of various constructions in different fields of mathematics.
- Peter Scholze: "Liquid vector spaces" VIDEO
Abstract: (joint with Dustin Clausen) Based on the condensed formalism, we propose new foundations for real functional analysis, replacing complete locally convex vector spaces with a variant of so-called p-liquid condensed real vector spaces, with excellent categorical properties; in particular, they form an abelian category stable under extensions. It is a classical phenomenon that local convexity is not stable under extensions, so one has to allow non-convex spaces in the theory, and p-liquidity is related to p-convexity, where 0 < p <= 1 is an auxiliary parameter. Strangely, the proof that the theory of p-liquid vector spaces has the desired good properties proceeds by proving a generalization over a ring of arithmetic Laurent series.
- Ivan Tomasic: "A topos-theoretic view of difference algebra"
Abstract: Difference algebra was founded by Ritt in the 1930s as the study of rings and modules with distinguished endomorphisms thought of as 'difference operators'. Aiming to introduce cohomological methods into the subject, we view difference algebra as the study of algebraic objects in the topos BN of difference sets, i.e., actions of the additive monoid N of natural numbers. Guided by the general principle that G-equivariant algebraic geometry (where G is a group, monoid, groupoid or a category) should correspond to relative algebraic geometry over the base topos BG, we develop difference algebraic geometry as relative algebraic geometry over the base topos BN. We extend the framework of Hakim's 1970s monograph to include the theories of the fundamental group and the étale cohomology of relative schemes over a general base topos, and derive consequences in the difference case.
Contributed talks:
- Peter Arndt: "Ranges of functors and geometric classes via topos theory" VIDEO
- Georg Biedermann (joint work with Mathieu Anel, Eric Finster, and André Joyal): "Higher Sheaves" VIDEO
- Ivan Di Liberti: "Towards higher topology" VIDEO
- Francesco Genovese (joint work with Julia Ramos González): "A derived Gabriel-Popescu Theorem for t-structures via derived injectives" VIDEO
- Matthias Hutzler: "Gluing classifying toposes" VIDEO
- Ming Ng (joint work with Steve Vickers): "Adelic Geometry via Topos Theory" VIDEO
- Nima Rasekh: "Every Elementary Higher Topos has a Natural Number Object"
- Axel Osmond (joint work with Olivia Caramello): "The over-topos at a model" VIDEO

Company presentation
Since 2002, Median Technologies has been expanding the boundaries of the identification, interpretation, analysis and reporting of imaging data in the medical world. Our core activity is to develop advanced imaging software solutions and platforms for clinical drug development in oncology, diagnostic support, and cancer patient care. Our software solutions improve the management of cancer patients by helping to better identify pathologies and to develop and select patient-specific therapies (precision medicine).
The company employs a highly qualified team and leverages its scientific, technical, medical, and regulatory expertise to develop innovative medical imaging analysis software based on artificial intelligence, cloud computing and big data. We are driven by our core values. These values define who we are, what we do, the way we do it, and what we, as Median, aspire to:
• Leading innovation with purpose
• Committing to quality in all we do
• Supporting our customers in achieving their goals
• Always remembering to put the patient first
Today, we are a team of 130+ people. Most of us are based at our HQ in Sophia Antipolis (French Riviera), and we have a subsidiary in the US and another in China. Our company is growing in a fulfilling international and multicultural environment.
Job description
In the context of our research and development in artificial intelligence applied to medical imaging, we are looking for a Data Science and Machine Learning Research Scientist (M/F). Integrated into a multidisciplinary research and development team within the iBiopsy® project, you will carry out research and development on innovative medical imaging solutions using machine learning and other AI methods.
Medical imaging is one of the fastest-growing fields in machine learning. We are looking for an enthusiastic, dynamic, and organized Data Scientist with strong ML experience and excellent communication skills, who will thrive at the heart of technological innovation.
Assignments
o Position under the supervision of the Head of Data Science
o Responsibilities:

You will apply your AI/ML/Deep Learning knowledge to develop innovative and robust biomarkers using data coming from medical imaging systems such as MRI and CT scanners and other data sources.

Your work will involve research and development of novel machine learning algorithms and systems. Being part of our frontend innovation organization, you will actively scout, keep track of, evaluate, and leverage disruptive technologies, and emerging industrial, academic and technological trends.

You will work closely with iBiopsy’s software development team as well as clinical science team.

In addition, you will transfer technology, and share insights and best practices across innovation teams. You will generate intellectual property for the company. You will be expected to author peer reviewed papers, present results at industry/scientific conferences.

We look to you to build breakthrough AI-enabled imaging solutions leveraging cloud computing, and to apply supervised and unsupervised Machine Learning techniques to create value from the imaging and clinical data repositories generated by our medical research and pharmaceutical industry partners. These AI-enabled systems and services go beyond image analysis to transform medical practice and drug development.
Profile required
o Education: PhD in Mathematics, Computer Science, or related fields
o Main skills and experience required:
• Minimum 3 years of relevant work experience in (deep) machine learning
• Experience with medical imaging, CT/MRI, image signatures, large-scale visual information retrieval, and feature selection
• Relevant experience with Python, DL frameworks (e.g., PyTorch), and standard packages such as Scikit-learn, NumPy, SciPy, and Pandas
• Semi-supervised learning, self-supervised learning, reinforcement learning, adversarial methods
• Multimodal feature extraction
• Author on related research publications / conference papers
• Strong experience with open-source technologies to accelerate innovation
Knowledge:
• In-depth technical knowledge of AI, deep learning, and computer vision
• Strong fundamental knowledge of statistical data processing, regression techniques, neural networks, decision trees, clustering, pattern recognition, probability theory, stochastic systems, Bayesian inference, statistical techniques, and dimensionality reduction
Additional qualities:
• Strong interpersonal, communication, and presentation skills, as well as the ability to work in a global team
• Fluent in written and oral English
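The combination asked for above — Python, NumPy, dimensionality reduction, and classification of image-derived feature vectors — can be illustrated with a minimal, purely hypothetical sketch. The synthetic data below stands in for features extracted from CT/MRI scans; this is not Median's actual pipeline, just the generic pattern (reduce dimensionality, then classify):

```python
import numpy as np

def pca(X, k):
    """Project X (n_samples x n_features) onto its top-k principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data; rows of Vt are the principal directions
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def nearest_centroid_fit(X, y):
    """Compute one centroid per class label."""
    labels = np.unique(y)
    return labels, np.stack([X[y == c].mean(axis=0) for c in labels])

def nearest_centroid_predict(X, labels, centroids):
    """Assign each sample to the class of its nearest centroid."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return labels[d.argmin(axis=1)]

# Synthetic stand-in for image-derived feature vectors (two classes)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 20)), rng.normal(3, 1, (50, 20))])
y = np.array([0] * 50 + [1] * 50)

Z = pca(X, 2)                                   # reduce 20 features to 2
labels, cents = nearest_centroid_fit(Z, y)
pred = nearest_centroid_predict(Z, labels, cents)
accuracy = (pred == y).mean()
```

In practice the PCA and classifier would come from Scikit-learn (`PCA`, `NearestCentroid`) rather than being hand-rolled, but the NumPy version keeps the sketch self-contained.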

GeoSciInfo
Doctoral fellowships: CNRS and the University of Tokyo
The CNRS and the University of Tokyo will fund doctoral fellowships in the humanities and social sciences, artificial intelligence, quantum science, climate change, and molecular and cellular biology. Application deadline: April 22, 2021.
To apply: https://international.cnrs.fr/wpcontent/uploads/2021/02/GuidelinesPhDJointprogramCNRSUTokyo1.pdf

GeoSciInfo
We offer in total 8 funded PhD positions associated with the graduate school on "Computational Cognition" (https://www.comcocms.uniosnabrueck.de/en/open_positions.html), at the interface between Cognitive Science, Machine Learning, Computational Neuroscience, and AI.
The RTG Computational Cognition aims at reintegrating Cognitive Science and Artificial Intelligence. PhD students of the RTG will be educated in both subjects in order to combine the findings of these fields and thus to get a better understanding of human and machine intelligence. Research fields involved in the RTG are Neuroinformatics, NeuroBioPsychology, BioInspired Computer Vision, KnowledgeBased Systems, Cognitive Natural Language Processing & Communication, Cognitive Modeling, Artificial Intelligence, Psycho/Neurolinguistics, Computational Linguistics and Cognitive Computing.
The RTG focuses on the integration of two research fields. Further information on the RTG is available at www.comco.uniosnabrueck.de. Detailed information on the core areas of the offered PhD projects can be obtained from the spokesmen of the RTG, Prof. Dr. Gordon Pipa (gpipa[at]uniosnabrueck.de) and Prof. Dr. Peter König (pkoenig[at]uniosnabrueck.de). The RTG is incorporated into the Cognitive Science PhD program founded in 2002. PhD students of the RTG will take advantage of an interdisciplinary environment, which nevertheless focuses on a common research topic and offers a broad range of methodological synergies between the projects.
Required Qualifications:
Applicants are expected to have an academic degree (Master/Diploma), experience in at least one of the domains listed above, proven experience in interdisciplinary work, as well as a good command of the English language.
Osnabrück University is committed to helping working/studying parents balance their family and working lives.
Osnabrück University seeks to guarantee equality of opportunity for women and men and strives to correct any gender imbalance in its schools and departments.
If two candidates are equally qualified, preference will be given to the candidate with disability status.
Applications with the usual documentation should be submitted by email in a single PDF file to the director of the Institute of Cognitive Science, Prof. Dr. Gunther Heidemann (gheidema[at]uniosnabrueck.de) with a cc to office[at]ikw.uniosnabrueck.de no later than April 19, 2021.
For additional information, for example on specific research projects, you can contact the coordinator Gabriela Pipa (gapipa[at]uos.de).

Professor and Chair of the Neuroinformatics Department
Dr. rer. nat. Gordon Pipa
Institute of Cognitive Science, Room 50/218
University of Osnabrueck
Wachsbleiche 27, 49090 Osnabrück, Germany
tel. +49 (0) 5419692277
fax (private). +49 (0) 5405 500 80 98
home office. +49 (0) 5405 500 90 95
email: gpipa[at]uos.de
webpage: http://www.ni.uos.de
research gate: https://www.researchgate.net/profile/Gordon_Pipa/?ev=prf_act
linkedin: https://de.linkedin.com/in/gordonpipa47771539
Personal Assistant and Secretary of the Neuroinformatics lab:
Anna Jungeilges
Tel. +49 (0)541 9692390
Fax +49 (0)541 9692246
Email: anna.jungeilges[at]uniosnabrueck.de
visit us on
http://www.facebook.com/CognitiveScienceOsnabruck
https://twitter.com/#!/CogSciUOS 
GeoSciInfo
Statistics, Information and Topology
Cochairs of the session:
 Michel N'Guiffo Boyom: Université Toulouse
 Pierre Baudot: Median (link)
This session will focus on advances of information theory, probability, and statistics in algebraic topology (see [1–56] below). The field is currently undergoing an impressive development, both on the side of the categorical, homotopical, or topos foundations of probability theory and statistics, and on the side of the characterisation of information functions in cohomology and homotopy theory.
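The basic information functions whose cohomological characterisation the session discusses — Shannon entropy and the mutual information derived from it — can be computed directly; a minimal numerical sketch (illustrative only, not tied to any of the cited constructions):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits; the 0*log(0) terms are treated as 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Joint distribution of two binary variables X, Y: two independent fair coins
pxy = np.array([[0.25, 0.25],
                [0.25, 0.25]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)   # marginals

# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y);
# it vanishes exactly when X and Y are independent.
mi = entropy(px) + entropy(py) - entropy(pxy.ravel())
```

The identity used here, H(X,Y) = H(X) + H(Y) − I(X;Y), is one of the cocycle-type relations that the cohomological approaches in the references below axiomatise.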
Bibliographical references: (to be completed)
[1] Cencov, N.N. Statistical Decision Rules and Optimal Inference. Translations of Mathematical Monographs. 1982.
[2] Ay, N. and Jost, J. and Lê, H.V. and Schwachhöfer, L. Information geometry and sufficient statistics. Probability Theory and Related Fields 2015 PDF
[3] Cathelineau, J. Sur l'homologie de sl2 à coefficients dans l'action adjointe, Math. Scand., 63, 51–86, 1988. PDF
[4] Kontsevich, M. The 1+1/2 logarithm. Unpublished note, 1995; reproduced in Elbaz-Vincent & Gangl, 2002. PDF
[5] Elbaz-Vincent, P., Gangl, H. On poly(ana)logs I. Compositio Mathematica, 130(2), 161–214. 2002. PDF
[6] Tomasic, I., Independence, measure and pseudofinite fields. Selecta Mathematica, 12, 271–306. arXiv. 2006.
[7] Connes, A., Consani, C., Characteristic 1, entropy and the absolute point. preprint arXiv:0911.3537v1. 2009.
[8] Marcolli, M. & Thorngren, R. Thermodynamic Semirings, arXiv:1108.2874, J. Noncommut. Geom. (doi:10.4171/JNCG/159), 2011.
[9] Abramsky, S., Brandenburger, A., The sheaf-theoretic structure of non-locality and contextuality, New Journal of Physics, 13 (2011). PDF
[10] Gromov, M. In a Search for a Structure, Part 1: On Entropy, unpublished manuscript, 2013. PDF
[11] McMullen, C.T., Entropy and the clique polynomial, 2013. PDF
[12] Marcolli, M. & Tedeschi, R. Entropy algebras and Birkhoff factorization, arXiv, Vol. abs/1108.2874, 2014.
[13] Doering, A., Isham, C.J., Classical and Quantum Probabilities as Truth Values, arXiv:1102.2213, 2011 PDF
[14] Baez, J.; Fritz, T. & Leinster, T. A Characterization of Entropy in Terms of Information Loss. Entropy, 13, 1945–1957, 2011. PDF
[15] Baez, J.C.; Fritz, T. A Bayesian characterization of relative entropy. Theory and Applications of Categories, Vol. 29, No. 16, pp. 422–456. 2014. PDF
[16] Drummond-Cole, G.C., Park, J.S., Terilla, J., Homotopy probability theory I. J. Homotopy Relat. Struct. November 2013. PDF
[17] Drummond-Cole, G.C., Park, J.S., Terilla, J., Homotopy probability theory II. J. Homotopy Relat. Struct. April 2014. PDF
[18] Burgos Gil J.I., Philippon P., Sombra M., Arithmetic geometry of toric varieties. Metrics, measures and heights, Astérisque 360. 2014. PDF
[19] Gromov, M. Symmetry, probability, entropy. Entropy 2015. PDF
[20] Gromov, M. Morse Spectra, Homology Measures, Spaces of Cycles and Parametric Packing Problems, April 2015. PDF
[21] Park, J.S., Homotopy theory of probability spaces I: classical independence and homotopy Lie algebras. arXiv. 2015
[22] Baudot P., Bennequin D. The homological nature of entropy. Entropy, 17, 166; 2015. PDF
[23] Elbaz-Vincent, P., Gangl, H., Finite polylogarithms, their multiple analogues and the Shannon entropy. (2015) Vol. 9389, Lecture Notes in Computer Science, 277–285, arXiv.
[24] M. Marcolli, Information algebras and their applications. International Conference on Geometric Science of Information (2015), 271–276
[25] Abramsky S., Barbosa R.S., Lal K.K.R., Mansfield, S., Contextuality, Cohomology and Paradox. 2015. arXiv:1502.03097
[26] M. Nguiffo Boyom, Foliations-Webs-Hessian Geometry-Information Geometry-Entropy and Cohomology. Entropy 18(12): 433 (2016) PDF
[27] M. Nguiffo Boyom, A. Zeglaoui, Amari Functors and Dynamics in Gauge Structures. GSI 2017: 170–178
[28] G.C. Drummond-Cole, J. Terilla, Homotopy probability theory on a Riemannian manifold and the Euler equation, New York Journal of Mathematics, Volume 23 (2017) 1065–1085. PDF
[29] P. Forré, J.M. Mooij. Constraint-based Causal Discovery for Non-Linear Structural Causal Models with Cycles and Latent Confounders. In A. Globerson, & R. Silva (Eds.) (2018), pp. 269–278
[30] T. Fritz and P. Perrone, Bimonoidal Structure of Probability Monads. Proceedings of MFPS 34, ENTCS, (2018). PDF
[31] Jae-Suk Park, Homotopical Computations in Quantum Field Theory, (2018) arXiv:1810.09100 PDF
[32] G.C. DrummondCole, An operadic approach to operatorvalued free cumulants. Higher Structures (2018) 2, 42–56. PDF
[33] G.C. DrummondCole, A noncrossing word cooperad for free homotopy probability theory. MATRIX Book (2018) Series 1, 77–99. PDF
[34] T. Fritz and P. Perrone, A Probability Monad as the Colimit of Spaces of Finite Samples, Theory and Applications of Categories 34, 2019. PDF.
[35] M. Esfahanian, A new quantum probability theory, quantum information functor and quantum gravity. (2019) PDF
[36] T. Leinster, Entropy modulo a prime, (2019) arXiv:1903.06961 PDF
[37] T. Leinster, E. Roff, The maximum entropy of a metric space, (2019) arXiv:1908.11184 PDF
[38] T. Mainiero, Homological Tools for the Quantum Mechanic. arXiv 2019, arXiv:1901.02011. PDF
[39] M. Marcolli, Motivic information, Bollettino dell'Unione Matematica Italiana (2019) 12 (1–2), 19–41
[40] J.P. Vigneaux, Information theory with finite vector spaces, IEEE Transactions on Information Theory, vol. 65, no. 9, pp. 5674–5687, Sept. (2019)
[41] Baudot P., Tapia M., Bennequin, D. , Goaillard J.M., Topological Information Data Analysis. (2019), Entropy, 21(9), 869
[42] Baudot P., The PoincaréShannon Machine: Statistical Physics and Machine Learning aspects of Information Cohomology. (2019), Entropy , 21(9),
[43] G. SergeantPerthuis, Bayesian/Graphoid intersection property for factorisation models, (2019), arXiv:1903.06026
[44] J.P. Vigneaux, Topology of Statistical Systems: A Cohomological Approach to Information Theory, PhD Thesis (2019).
[45] Forré, P., & Mooij, J. M. (2019). Causal Calculus in the Presence of Cycles, Latent Confounders and Selection Bias. In A. Globerson, & R. Silva (Eds.), Proceedings of the Thirty-Fifth Conference on Uncertainty in Artificial Intelligence: UAI 2019, (2019)
[46] Y. Manin, M. Marcolli Homotopy Theoretic and Categorical Models of Neural Information Networks. arXiv (2020) preprint arXiv:2006.15136
[47] T. Leinster The categorical origins of Lebesgue integration (2020) arXiv:2011.00412 PDF
[48] T. Fritz, T. Gonda, P. Perrone, E. Fjeldgren Rischel, Representable Markov Categories and Comparison of Statistical Experiments in Categorical Probability. (2020) arXiv:2010.07416 PDF
[49] T. Fritz, E. Fjeldgren Rischel, Infinite products and zero-one laws in categorical probability (2020) arXiv:1912.02769 PDF
[50] T. Fritz, A synthetic approach to Markov kernels, conditional independence and theorems on sufficient statistics (2020) arXiv:1908.07021 PDF
[51] T. Fritz and P. Perrone, Stochastic Order on Metric Spaces and the Ordered Kantorovich Monad, Advances in Mathematics 366, 2020. PDF
[52] T. Fritz and P. Perrone, Monads, partial evaluations, and rewriting. Proceedings of MFPS 36, ENTCS, 2020. PDF.
[53] D. Bennequin, G. Sergeant-Perthuis, O. Peltre, and J.P. Vigneaux, Extra-fine sheaves and interaction decompositions, (2020) arXiv:2009.12646
[54] J.P. Vigneaux, Information structures and their cohomology, Theory and Applications of Categories, Vol. 35, (2020), No. 38, pp. 1476–1529.
[55] O. Peltre, MessagePassing Algorithms and Homology, PhD Thesis (2020), arXiv:2009.11631
[56] G. SergeantPerthuis, Interaction decomposition for presheafs, (2020) arXiv:2008.09029
[57] K. Hess, Topological adventures in neuroscience, in the Proceedings of the 2018 Abel Symposium: Topological Data Analysis, Springer Verlag, (2020).
[58] C. Curto, N. Youngs. Neural ring homomorphisms and maps between neural codes. Submitted. arXiv.org preprint.
[59] N.C. Combe, Y. Manin, F-manifolds and geometry of information, arXiv:2004.08808v2, (2020) Bull. London MS.
[60] Abramsky, S., Classical logic, classical probability, and quantum mechanics, 2020, arXiv:2010.13326
GeoSciInfo
Topology and geometry in neuroscience
Chairs of the session:
Topics
This session will focus on advances in algebraic topology and geometrical methods in neuroscience (see [1–105] below, among many others). The field is currently undergoing an impressive development, coming both:
– from theoretical neuroscience and machine learning, including Graph Neural Networks [30–42], Bayesian geometrical inference [27–29], message passing, probability, and cohomology [92–95], Information Topology [53–54, 62–66, 96–105], networks [83–85, 90–91], and higher-order n-body statistical interactions [67, 74, 94–95, 99, 101];
– from topological data analysis applied to real neural recordings, ranging from subcellular scales [43, 51], genetic or omic expressions [81, 101], and spiking dynamics and neural coding [1–25, 45–47, 50–52, 79], to cortical areas (fMRI, EEG) [26, 67–72, 76–80, 84–89], linguistics [54–61], and consciousness [48, 53, 102].
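The clique-topology idea behind several of the cited works (e.g., building a graph from thresholded pairwise correlations and reading off its simplices and components, in the spirit of [8] below) can be sketched in a few lines. This is a toy illustration with an invented 5-unit correlation matrix, not any of the actual pipelines:

```python
import numpy as np
from itertools import combinations

def threshold_graph(C, t):
    """Edges between units whose |correlation| exceeds t (self-loops ignored)."""
    n = C.shape[0]
    return {(i, j) for i, j in combinations(range(n), 2) if abs(C[i, j]) > t}

def components(n, edges):
    """Number of connected components (Betti-0) via union-find."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x
    for i, j in edges:
        parent[find(i)] = find(j)
    return len({find(i) for i in range(n)})

def triangles(n, edges):
    """Count 2-simplices: triples whose three edges are all present."""
    return sum(1 for a, b, c in combinations(range(n), 3)
               if {(a, b), (a, c), (b, c)} <= edges)

# Toy correlation matrix: a strongly correlated triple {0,1,2} and a pair {3,4}
C = np.eye(5)
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4)]:
    C[i, j] = C[j, i] = 0.9

E = threshold_graph(C, 0.5)
b0 = components(5, E)    # two connected components
tri = triangles(5, E)    # one filled triangle, on {0, 1, 2}
```

Real analyses track how such counts change as the threshold varies (persistent homology), typically with dedicated libraries such as giotto-tda [44].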
Bibliographical references: (to be completed)
Carina Curto, Nora Youngs and Vladimir Itskov and colleagues:
[1] C. Curto, N. Youngs. Neural ring homomorphisms and maps between neural codes. Submitted. arXiv.org preprint.
[2] C. Curto, J. Geneson, K. Morrison. Fixed points of competitive threshold-linear networks. Neural Computation, in press, 2019. arXiv.org preprint.
[3] C. Curto, A. Veliz-Cuba, N. Youngs. Analysis of combinatorial neural codes: an algebraic approach. Book chapter in Algebraic and Combinatorial Computational Biology. R. Robeva, M. Macaulay (Eds), 2018.
[4] C. Curto, V. Itskov. Combinatorial neural codes. Handbook of Discrete and Combinatorial Mathematics, Second Edition, edited by Kenneth H. Rosen, CRC Press, 2018. pdf
[5] C. Curto, E. Gross, J. Jeffries, K. Morrison, M. Omar, Z. Rosen, A. Shiu, N. Youngs. What makes a neural code convex? SIAM J. Appl. Algebra Geometry, vol. 1, pp. 222–238, 2017. pdf, SIAGA link, and arXiv.org preprint
[6] C. Curto. What can topology tell us about the neural code? Bulletin of the AMS, vol. 54, no. 1, pp. 63–78, 2017. pdf, Bulletin link.
[7] C. Curto, K. Morrison. Pattern completion in symmetric threshold-linear networks. Neural Computation, Vol 28, pp. 2825–2852, 2016. pdf, arXiv.org preprint.
[8] C. Giusti, E. Pastalkova, C. Curto, V. Itskov. Clique topology reveals intrinsic geometric structure in neural correlations. PNAS, vol. 112, no. 44, pp. 13455–13460, 2015. pdf, PNAS link.
[9] C. Curto, A. Degeratu, V. Itskov. Encoding binary neural codes in networks of threshold-linear neurons. Neural Computation, Vol 25, pp. 2858–2903, 2013. pdf, arXiv.org preprint.
[10] K. Morrison, C. Curto. Predicting neural network dynamics via graphical analysis. Book chapter in Algebraic and Combinatorial Computational Biology. R. Robeva, M. Macaulay (Eds), 2018. arXiv.org preprint.
[11] C. Curto, V. Itskov, A. Veliz-Cuba, N. Youngs. The neural ring: an algebraic tool for analyzing the intrinsic structure of neural codes. Bulletin of Mathematical Biology, Volume 75, Issue 9, pp. 1571–1611, 2013. arXiv.org preprint.
[12] C. Curto, V. Itskov, K. Morrison, Z. Roth, J.L. Walker. Combinatorial neural codes from a mathematical coding theory perspective. Neural Computation, Vol 25(7):1891–1925, 2013. arXiv.org preprint.
[13] C. Curto, A. Degeratu, V. Itskov. Flexible memory networks. Bulletin of Mathematical Biology, Vol 74(3):590–614, 2012. arXiv.org preprint.
[14] V. Itskov, C. Curto, E. Pastalkova, G. Buzsaki. Cell assembly sequences arising from spike threshold adaptation keep track of time in the hippocampus. Journal of Neuroscience, Vol. 31(8):2828–2834, 2011.
[15] K.D. Harris, P. Bartho, et al. How do neurons work together? Lessons from auditory cortex. Hearing Research, Vol. 271(1–2), 2011, pp. 37–53.
[16] P. Bartho, C. Curto, A. Luczak, S. Marguet, K.D. Harris. Population coding of tone stimuli in auditory cortex: dynamic rate vector analysis. European Journal of Neuroscience, Vol. 30(9), 2009, pp. 1767–1778.
[17] C. Curto, V. Itskov. Cell groups reveal structure of stimulus space. PLoS Computational Biology, Vol. 4(10): e1000205, 2008.
[18] E. Gross, N.K. Obatake, N. Youngs, Neural ideals and stimulus space visualization, Adv. Appl. Math., 95 (2018), pp. 65–95.
[19] C. Giusti, V. Itskov. A no-go theorem for one-layer feedforward networks. Neural Computation, 26 (11):2527–2540, 2014.
[20] V. Itskov, L.F. Abbott. Capacity of a Perceptron for Sparse Discrimination. Phys. Rev. Lett. 101(1), 2008.
[21] V. Itskov, E. Pastalkova, K. Mizuseki, G. Buzsaki, K.D. Harris. Theta-mediated dynamics of spatial information in hippocampus. Journal of Neuroscience, 28(23), 2008.
[22] V. Itskov, C. Curto, K.D. Harris. Valuations for spike train prediction. Neural Computation, 20(3), 644–667, 2008.
[23] E. Pastalkova, V. Itskov, A. Amarasingham, G. Buzsaki. Internally Generated Cell Assembly Sequences in the Rat Hippocampus. Science 321(5894):1322–1327, 2008.
[24] V. Itskov, A. Kunin, Z. Rosen. Hyperplane neural codes and the polar complex. To appear in the Abel Symposia proceedings, Vol. 15, 2019.
Alexander Ruys de Perez and colleagues:
[25] A. Ruys de Perez, L.F. Matusevich, A. Shiu, Neural codes and the factor complex, Advances in Applied Mathematics 114 (2020).
Sunghyon Kyeong and colleagues:
[26] Sunghyon Kyeong, Seonjeong Park, KeunAh Cheon, JaeJin Kim, DongHo Song, and Eunjoo Kim, A New Approach to Investigate the Association between Brain Functional Connectivity and Disease Characteristics of AttentionDeficit/Hyperactivity Disorder: Topological Neuroimaging Data Analysis, PLOS ONE, 10 (9): e0137296, DOI: 10.1371/journal.pone.0137296 (2015)
Jonathan Pillow and colleagues:
[27] Aoi MC & Pillow JW (2017). Scalable Bayesian inference for highdimensional neural receptive fields. bioRxiv 212217; doi: https://doi.org/10.1101/212217
[28] Aoi MC, Mante V, & Pillow JW. (2020). Prefrontal cortex exhibits multidimensional dynamic encoding during decisionmaking. Nat Neurosci.
[29] Calhoun AJ, Pillow JW, & Murthy M. (2019). Unsupervised identification of the internal states that shape natural behavior. Nature Neuroscience 22:2040–2049.
[30] Dong X, Thanou D, Toni L, et al., 2020, Graph Signal Processing for Machine Learning: A Review and New Perspectives, IEEE Signal Processing Magazine, Vol: 37, ISSN: 1053-5888, Pages: 117–127
Michael Bronstein, Federico Monti, Giorgos Bouritsas and colleagues:
[31] G. Bouritsas, F. Frasca, S Zafeiriou, MM Bronstein, Improving graph neural network expressivity via subgraph isomorphism counting. arXiv (2020) preprint arXiv:2006.09252
[32] M. Bronstein, G. Pennycook, L. Buonomano, T.D. Cannon, Belief in fake news, responsiveness to cognitive conflict, and analytic reasoning engagement, Thinking and Reasoning (2020), ISSN: 1354-6783
[33] X. Dong, D. Thanou, L. Toni, M. Bronstein, P. Frossard, Graph Signal Processing for Machine Learning: A Review and New Perspectives, IEEE Signal Processing Magazine (2020), Vol: 37, Pages: 117–127, ISSN: 1053-5888
[34] Y. Wang, Y. Sun, Z. Liu, S.E. Sarma, M. Bronstein, J.M. Solomon, Dynamic Graph CNN for Learning on Point Clouds, ACM Transactions on Graphics (2020), Vol: 38, ISSN: 0730-0301
[35] M. Bronstein, J. Everaert, A. Castro, J. Joormann, T.D. Cannon, Pathways to paranoia: Analytic thinking and belief flexibility. Behav Res Ther (2019), Vol: 113, Pages: 18–24
[36] G. Bouritsas, S. Bokhnyak, S. Ploumpis, M. Bronstein, S. Zafeiriou, Neural 3D Morphable Models: Spiral Convolutional Networks for 3D Shape Representation Learning and Generation, (2019) IEEE/CVF ICCV 2019, 7212
[37] O. Litany, A. Bronstein, M. Bronstein, A. Makadia et al., Deformable Shape Completion with Graph Convolutional Autoencoders (2018), Pages: 1886–1895, ISSN: 1063-6919
[38] R. Levie, F. Monti, X. Bresson, M. Bronstein, CayleyNets: Graph Convolutional Neural Networks with Complex Rational Spectral Filters, IEEE Transactions on Signal Processing (2018), Vol: 67, Pages: 97–109, ISSN: 1053-587X
[39] F. Monti, K. Otness, M. Bronstein, MotifNet: a motif-based graph convolutional network for directed graphs (2018), Pages: 225–228
[40] F. Monti, M. Bronstein, X. Bresson, Geometric matrix completion with recurrent multi-graph neural networks, Neural Information Processing Systems (2017), Pages: 3700–3710, ISSN: 1049-5258
[41] F. Monti, D. Boscaini, J. Masci, E. Rodola, J. Svoboda, M. Bronstein, Geometric deep learning on graphs and manifolds using mixture model CNNs, (2017) IEEE Conference on Computer Vision and Pattern Recognition, p: 33
[42] M. Bronstein, J. Bruna, Y. LeCun, A. Szlam, P. Vandergheynst et al., Geometric Deep Learning: Going beyond Euclidean data, IEEE Signal Processing Magazine (2017), Vol: 34, Pages: 18–42, ISSN: 1053-5888
Kathryn Hess and colleagues:
[43] L. Kanari, H. Dictus, W. Van Geit, A. Chalimourda, B. Coste, J. Shillcock, K. Hess, and H. Markram, Computational synthesis of cortical dendritic morphologies, bioRxiv (2020) 10.1101/2020.04.15.040410, submitted.
[44] G. Tauzin, U. Lupo, L. Tunstall, J. Burella Pérez, M. Caorsi, A. Medina-Mardones, A. Dassatti, and K. Hess, giotto-tda: a topological data analysis toolkit for machine learning and data exploration, arXiv:2004.02551
[45] E. Mullier, J. Vohryzek, A. Griffa, Y. Alemàn-Gómez, C. Hacker, K. Hess, and P. Hagmann, Functional brain dynamics are shaped by connectome n-simplicial organization, (2020) submitted.
[46] M. Fournier, M. Scolamiero, et al., Topology predicts long-term functional outcome in early psychosis, Molecular Psychiatry (2020). https://doi.org/10.1038/s4138002008261.
[47] K. Hess, Topological adventures in neuroscience, in the Proceedings of the 2018 Abel Symposium: Topological Data Analysis, Springer Verlag, (2020).
[48] A. Doerig, A. Schurger, K. Hess, and M. H. Herzog, The unfolding argument: why IIT and other causal structure theories of consciousness are empirically untestable, Consciousness and Cognition 72 (2019) 4959.
[49] L. Kanari, S. Ramaswamy, et al., Objective classification of neocortical pyramidal cells, Cerebral Cortex (2019) bhy339, https://doi.org/10.1093/cercor/bhy339.
[50] J.B. Bardin, G. Spreemann, K. Hess, Topological exploration of artificial neuronal network dynamics, Network Neuroscience (2019) https://doi.org/10.1162/netn_a_00080.
[51] L. Kanari, P. Dłotko, M. Scolamiero, R. Levi, J. C. Shillcock, K. Hess, and H. Markram, A topological representation of branching morphologies, Neuroinformatics (2017) doi: 10.1007/s1202101793411.
[52] M. W. Reimann, M. Nolte, et al., Cliques of neurons bound into cavities provide a missing link between structure and function, Front. Comput. Neurosci., 12 June (2017), doi: 10.3389/fncom.2017.00048.
Matilde Marcolli, Yuri Manin, and colleagues:
[53] Y. Manin, M. Marcolli Homotopy Theoretic and Categorical Models of Neural Information Networks. arXiv (2020) preprint arXiv:2006.15136
[54] M. Marcolli, Lumen Naturae: Visions of the Abstract in Art and Mathematics, MIT Press (2020)
[55] A. Port, T. Karidi, M. Marcolli, Topological Analysis of Syntactic Structures (2019) arXiv preprint arXiv:1903.05181
[56] M. Marcolli, Motivic information, Bollettino dell'Unione Matematica Italiana (2019) 12 (12), 1941
[57] A. Port, I. Gheorghita, D. Guth, J.M. Clark, C. Liang, S. Dasu, M. Marcolli, Persistent topology of syntax, Mathematics in Computer Science (2018) 12 (1), 33–50
[58] K. Shu, S. Aziz, V.L. Huynh, D. Warrick, M. Marcolli, Syntactic phylogenetic trees, Foundations of Mathematics and Physics One Century After Hilbert (2018), 417–441
[59] K. Shu, A. Ortegaray, R. Berwick, M. Marcolli, Phylogenetics of Indo-European language families via an algebro-geometric analysis of their syntactic structures. arXiv (2018) preprint arXiv:1712.01719
[60] K. Shu, M. Marcolli, Syntactic structures and code parameters. Mathematics in Computer Science (2018) 11 (1), 79–90
[61] K. Siva, J. Tao, M. Marcolli. Syntactic Parameters and Spin Glass Models of Language Change. Linguist. Anal. (2017) 41 (3–4), 559–608
[62] M. Marcolli, N. Tedeschi, Entropy algebras and Birkhoff factorization. Journal of Geometry and Physics (2015) 97, 243–265
[63] M. Marcolli, Information algebras and their applications. International Conference on Geometric Science of Information (2015), 271–276
[64] K. Siva, J. Tao, M. Marcolli, Spin glass models of syntax and language evolution, arXiv preprint (2015) arXiv:1508.00504
[65] Y. Manin, M. Marcolli, Kolmogorov complexity and the asymptotic bound for error-correcting codes. Journal of Differential Geometry (2014) 97 (1), 91–108
[66] M. Marcolli, R. Thorngren, Thermodynamic semirings, arXiv preprint (2011) arXiv:1108.2874
Bosiljka Tadić and colleagues:
[67] M. Andjelkovic, B. Tadic, R. Melnik, The topology of higherorder complexes associated with brainfunction hubs in human connectomes , available on arxiv.org/abs/2006.10357, published in Scientific Reports 10:17320 (2020)
[68] B. Tadic, M. Andjelkovic, M. Suvakov, G.J. Rodgers, Magnetisation Processes in Geometrically Frustrated Spin Networks with Self-Assembled Cliques, Entropy 22(3), 336 (2020)
[69] B. Tadic, M. Andjelkovic, R. Melnik, Functional Geometry of Human Connectomes, Scientific Reports 9:12060 (2019); previous version: Functional Geometry of Human Connectome and Robustness of Gender Differences, arXiv:1904.03399, April 6, 2019
[70] B. Tadic, M. Andjelkovic, M. Suvakov, Origin of hyperbolicity in braintobrain coordination networks, FRONTIERS in PHYSICS vol.6, ARTICLE{10.3389/fphy.2018.00007}, (2018) OA
[71] B. Tadic, M. Andjelkovic, Algebraic topology of multibrain graphs: Methods to study the social impact and other factors onto functional brain connections, in Proceedings of BELBI (2016)
[72] B. Tadic, M. Andjelkovic, B.M. Boskoska, Z. Levnajic, Algebraic Topology of MultiBrain Connectivity Networks Reveals Dissimilarity in Functional Patterns during Spoken Communications, PLOS ONE Vol 11(11), e0166787 (2016)
[73] M. Mitrovic and B. Tadic, Search for Weighted Subgraphs on Complex Networks with MLM, Lecture Notes in Computer Science, Vol. 5102, pp. 551–558 (2008)
Giovanni Petri, Francesco Vaccarino and collaborators:
[74] F. Battiston, G. Cencetti, et al., Networks beyond pairwise interactions: structure and dynamics, Physics Reports (2020), arXiv:2006.01764
[75] M. Guerra, A. De Gregorio, U. Fugacci, G. Petri, F. Vaccarino, Homological scaffold via minimal homology bases. arXiv (2020) preprint arXiv:2004.11606
[76] J. Billings, R. Tivadar, M.M. Murray, B. Franceschiello, G. Petri, Topological Features of Electroencephalography are ReferenceInvariant, bioRxiv 2020
[77] J. Billings, M. Saggar, S. Keilholz, G. Petri, Topological Segmentation of TimeVarying Functional Connectivity Highlights the Role of Preferred Cortical Circuits, bioRxiv 2020
[78] E. Ibáñez-Marcelo, L. Campioni, et al., Topology highlights mesoscopic functional equivalence between imagery and perception: The case of hypnotizability. NeuroImage (2019) 200, 437–449
[79] P. Expert, L.D. Lord, M.L. Kringelbach, G. Petri. Topological neuroscience. Network Neuroscience (2019) 3 (3), 653–655
[80] C. Geniesse, O. Sporns, G. Petri, M. Saggar, Generating dynamical neuroimaging spatiotemporal representations (DyNeuSR) using topological data analysis. Network Neuroscience (2019) 3 (3), 763–778
[81] A. Patania, P. Selvaggi, M. Veronese, O. Dipasquale, P. Expert, G. Petri, Topological gene expression networks recapitulate brain anatomy and function. Network Neuroscience (2019) 3 (3), 744–762
[82] E. Ibáñez-Marcelo, L. Campioni, et al., Spectral and topological analyses of the cortical representation of the head position: Does hypnotizability matter? Brain and Behavior (2018) 9 (6), e01277
[83] G. Petri, A. Barrat, Simplicial activity driven model, Physical review letters 121 (22), 228301
[84] A. Phinyomark, E. Ibanez-Marcelo, G. Petri. Resting-state fMRI functional connectivity: Big data preprocessing pipelines and topological data analysis. IEEE Transactions on Big Data (2017) 3 (4), 415–428
[85] G. Petri, S. Musslick, B. Dey, K. Ozcimder, D. Turner, N.K. Ahmed, T. Willke. Topological limits to parallel processing capability of network architectures. arXiv preprint (2017) arXiv:1708.03263
[86] K. Ozcimder, B. Dey, S. Musslick, G. Petri, N.K. Ahmed, T.L. Willke, J.D. Cohen, A Formal Approach to Modeling the Cost of Cognitive Control, arXiv preprint (2017) arXiv:1706.00085
[87] L.D. Lord, P. Expert, et al. , Insights into brain architectures from the homological scaffolds of functional connectivity networks, Frontiers in systems neuroscience (2016) 10, 85
[88] J. Binchi, E. Merelli, M. Rucco, G. Petri, F. Vaccarino. jHoles: A Tool for Understanding Biological Complex Networks via Clique Weight Rank Persistent Homology. Electron. Notes Theor. Comput. Sci. (2014) 306, 5–18
[89] G. Petri, P. Expert, F. Turkheimer, R. CarhartHarris, D. Nutt, P.J. Hellyer et al., Homological scaffolds of brain functional networks. Journal of The Royal Society Interface (2014) 11 (101), 20140873
[90] G. Petri, M. Scolamiero, I. Donato, F. Vaccarino, Topological strata of weighted complex networks. PloS one (2013) 8 (6), e66506
[91] G. Petri, M. Scolamiero, I. Donato, et al., Networks and cycles: a persistent homology approach to complex networks. Proceedings of the European Conference on Complex Systems (2013), 93-99
Daniel Bennequin, Juan-Pablo Vigneaux, Olivier Peltre, Pierre Baudot and colleagues:
[92] D. Bennequin, G. Sergeant-Perthuis, O. Peltre, J.P. Vigneaux, Extra-fine sheaves and interaction decompositions, (2020) arXiv:2009.12646
[93] O. Peltre, Message-Passing Algorithms and Homology, PhD Thesis (2020), arXiv:2009.11631
[94] G. Sergeant-Perthuis, Interaction decomposition for presheafs, (2020) arXiv:2008.09029
[95] G. Sergeant-Perthuis, Bayesian/Graphoid intersection property for factorisation models, (2019) arXiv:1903.06026
[96] J.P. Vigneaux, Topology of Statistical Systems: A Cohomological Approach to Information Theory, PhD Thesis (2019).
[97] J.P. Vigneaux, Information structures and their cohomology. Theory and Applications of Categories (2020), Vol. 35, No. 38, pp. 1476-1529
[98] J.P. Vigneaux, Information theory with finite vector spaces. IEEE Transactions on Information Theory (2019), vol. 65, no. 9, pp. 5674-5687
[99] P. Baudot, M. Tapia, D. Bennequin, J.M. Goaillard, Topological Information Data Analysis. Entropy (2019), 21(9), 869
[100] P. Baudot, The Poincaré-Shannon Machine: Statistical Physics and Machine Learning Aspects of Information Cohomology. Entropy (2019), 21(9)
[101] M. Tapia, P. Baudot, et al., Neurotransmitter identity and electrophysiological phenotype are genetically coupled in midbrain dopaminergic neurons. Scientific Reports (2018); bioRxiv 168740
[102] P. Baudot, Elements of qualitative cognition: an Information Topology Perspective. Physics of Life Reviews (2019); arXiv:1807.04520
[103] P. Baudot, D. Bennequin, The homological nature of entropy. Entropy (2015), 17, 166; doi:10.3390
[104] D. Bennequin, Remarks on Invariance in the Primary Visual Systems of Mammals. In Neuromathematics of Vision, Lecture Notes in Morphogenesis, Springer (2014), pp. 243-333
[105] P. Baudot, D. Bennequin, Information Topology I and II. Random Models in Neuroscience (2012)
Max WELLING
Informatics Institute, University of Amsterdam and Qualcomm Technologies
https://staff.fnwi.uva.nl/m.welling/
ELLIS Board Member (European Laboratory for Learning and Intelligent Systems: https://ellis.eu/)
Title: Exploring Quantum Statistics for Machine Learning
Abstract: Quantum mechanics represents a rather bizarre theory of statistics that is very different from the ordinary classical statistics that we are used to. In this talk I will explore if there are ways that we can leverage this theory in developing new machine learning tools: can we design better neural networks by thinking about entangled variables? Can we come up with better samplers by viewing them as observations in a quantum system? Can we generalize probability distributions? We hope to develop better algorithms that can be simulated efficiently on classical computers, but we will naturally also consider the possibility of much faster implementations on future quantum computers. Finally, I hope to discuss the role of symmetries in quantum theories.
Reference:
Roberto Bondesan, Max Welling, Quantum Deformed Neural Networks, arXiv:2010.11189v1 [quant-ph], 21 October 2020; https://arxiv.org/abs/2010.11189
Jean PETITOT
Directeur d'Études, Centre d'Analyse et de Mathématiques Sociales, École des Hautes Études en Sciences Sociales, Paris.
Born in 1944, Jean Petitot is an applied mathematician interested in dynamical modeling in the neurocognitive sciences, and a philosopher of science. He is the former director of the CREA (Applied Epistemology Research Center) at the École Polytechnique. http://jeanpetitot.com
Title: The primary visual cortex as a Cartan engine
Abstract: Cortical visual neurons detect very local geometric cues: retinal positions, local contrasts, local orientations of boundaries, etc. One of the main theoretical problems of low-level vision is to understand how these local cues can be integrated so as to generate the global geometry of the perceived images, with all the well-known phenomena studied since Gestalt theory. It is empirically evident that the visual brain is able to perform many routines belonging to differential geometry. But how can such routines be neurally implemented? Neurons are "point-like" processors, and it seems impossible to do differential geometry with them. Since the 1990s, methods of in vivo optical imaging based on activity-dependent intrinsic signals have made it possible to visualize the extremely specific connectivity of the primary visual areas, their "functional architectures." What we call "neurogeometry" is based on the discovery that these functional architectures implement structures such as the contact structure and the sub-Riemannian geometry of jet spaces of plane curves. For reasons of principle, it is the geometrical reformulation of differential calculus from Pfaff to Lie, Darboux, Frobenius, Cartan, and Goursat that turns out to be suitable for neurogeometry.
References:
- Agrachev, A., Barilari, D., Boscain, U., A Comprehensive Introduction to Sub-Riemannian Geometry, Cambridge University Press, 2020.
- Citti, G., Sarti, A., A cortical based model of perceptual completion in the roto-translation space, Journal of Mathematical Imaging and Vision, 24, 3 (2006), 307-326.
- Petitot, J., Neurogéométrie de la vision. Modèles mathématiques et physiques des architectures fonctionnelles, Les Éditions de l'École Polytechnique, Distribution Ellipses, Paris, 2008.
- Petitot, J., "Landmarks for neurogeometry", Neuromathematics of Vision (G. Citti, A. Sarti, eds), Springer, Berlin, Heidelberg, 1-85, 2014.
- Petitot, J., Elements of Neurogeometry. Functional Architectures of Vision, Lecture Notes in Morphogenesis, Springer, 2017.
- Prandi, D., Gauthier, J.P., A Semidiscrete Version of the Petitot Model as a Plausible Model for Anthropomorphic Image Reconstruction and Pattern Recognition, https://arxiv.org/abs/1704.03069v1, 2017.
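As background for the structures the abstract names (a standard textbook formulation, not material from the talk itself), the contact structure on the space of 1-jets of plane curves, which neurogeometry takes the functional architecture of V1 to implement, can be written in coordinates as:

```latex
% 1-jet space J^1(\mathbb{R},\mathbb{R}) with coordinates (x, y, p),
% where p encodes the local orientation dy/dx detected by a neuron.
\theta \;=\; dy - p\,dx, \qquad
\ker\theta \;=\; \operatorname{span}\{\, X_1 = \partial_x + p\,\partial_y,\;
                                        X_2 = \partial_p \,\}.
% Perceived contours are modeled as horizontal curves (\theta = 0);
% a sub-Riemannian metric on \ker\theta measures their length, and its
% geodesics model illusory-contour completion (cf. Citti-Sarti above).
```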
Yvette Kosmann-Schwarzbach
Professeur des universités honoraire; former student of the École normale supérieure de Sèvres, 1960-1964; agrégation in mathematics, 1963; CNRS research associate, 1964-1969; doctorate in science, Lie derivatives of spinors, University of Paris, 1970, under the supervision of André Lichnerowicz; lecturer, then professor, at the University of Lille (1970-1976 and 1982-1993), at Brooklyn College, New York (1979-1982), and at the École Polytechnique (1993-2006).
Title: Structures of Poisson Geometry: old and new
Abstract: How did the brackets that Siméon-Denis Poisson introduced in 1809 evolve into the Poisson geometry of the 1970s? What are Poisson groups and, more generally, Poisson groupoids? In what sense does Dirac geometry generalize Poisson geometry, and why is it relevant for applications? I shall sketch the definition of these structures and try to answer these questions.
References
- P. Libermann and C.-M. Marle, Symplectic Geometry and Analytical Mechanics, D. Reidel Publishing Company (1987).
- J.E. Marsden and T.S. Ratiu, Introduction to Mechanics and Symmetry, Texts in Applied Mathematics 17, second edition, Springer (1998).
- C. Laurent-Gengoux, A. Pichereau, and P. Vanhaecke, Poisson Structures, Grundlehren der mathematischen Wissenschaften 347, Springer (2013).
- Y. Kosmann-Schwarzbach, Multiplicativity, from Lie groups to generalized geometry, in Geometry of Jets and Fields (K. Grabowska et al., eds), Banach Center Publications 110, 2016.
- Special volume on Poisson Geometry (guest editors: Anton Alekseev, Alberto Cattaneo, Y. Kosmann-Schwarzbach, and Tudor Ratiu), Letters in Mathematical Physics 90, 2009.
- Y. Kosmann-Schwarzbach (ed.), Siméon-Denis Poisson : les Mathématiques au service de la science, Éditions de l'École Polytechnique (2013).
- Y. Kosmann-Schwarzbach, The Noether Theorems: Invariance and Conservation Laws in the Twentieth Century, translated by B.E. Schwarzbach, Sources and Studies in the History of Mathematics and Physical Sciences, Springer (2011).
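For readers unfamiliar with the starting point of the abstract, the bracket Poisson introduced in 1809 takes the following canonical form (standard material, found e.g. in Marsden-Ratiu above, not a result of the talk):

```latex
% Canonical Poisson bracket of f, g : \mathbb{R}^{2n} \to \mathbb{R}
% in position/momentum coordinates (q_1,\dots,q_n, p_1,\dots,p_n):
\{f, g\} \;=\; \sum_{i=1}^{n}
  \left( \frac{\partial f}{\partial q_i}\frac{\partial g}{\partial p_i}
       - \frac{\partial f}{\partial p_i}\frac{\partial g}{\partial q_i} \right).
% Poisson geometry abstracts this: a bracket on C^\infty(M) that is
% bilinear, antisymmetric, satisfies the Jacobi identity, and obeys the
% Leibniz rule \{f, gh\} = \{f, g\}\,h + g\,\{f, h\}.
```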
Michel Broniatowski
Sorbonne Université, Paris
Title: Some insights on statistical divergences and choice of models
Abstract: Divergences between probability laws, or more generally between measures, define inferential criteria or risk functions. Their estimation makes it possible to address questions of model choice and statistical inference, in connection with the regularity of the models considered; depending on the nature of these models (parametric or semiparametric), the nature of the criteria and their estimation methods vary. Representations of these divergences as large-deviation rates for specific empirical measures allow their estimation in nonparametric or semiparametric models by Monte Carlo methods, making use of results from information theory (Sanov's theorem and Gibbs principles). The question of the choice of divergence is wide open; an approach linking nonparametric Bayesian statistics and MAP estimators provides elements for understanding the specificities of the various divergences in the Ali-Silvey-Csiszár-Arimoto class in relation to specific choices of prior distributions.
References:
- Broniatowski, Michel; Stummer, Wolfgang. Some universal insights on divergences for statistics, machine learning and artificial intelligence. In Geometric Structures of Information; Signals Commun. Technol., Springer, Cham, pp. 149-211, 2019.
- Broniatowski, Michel. Minimum divergence estimators, maximum likelihood and the generalized bootstrap. To appear in "Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems", Entropy, 2020.
- Csiszár, Imre; Gassiat, Elisabeth. MEM pixel correlated solutions for generalized moment and interpolation problems. IEEE Trans. Inform. Theory 45, no. 7, 2253-2270, 1999.
- Liese, Friedrich; Vajda, Igor. On divergences and informations in statistics and information theory. IEEE Trans. Inform. Theory 52, no. 10, 4394-4412, 2006.
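As a concrete illustration of the divergence class the abstract refers to (a minimal sketch under simplifying assumptions, not code from the talk), a Csiszár f-divergence on finite distributions is determined by a single convex generator f with f(1) = 0; the function names below are illustrative:

```python
import math

def f_divergence(p, q, f):
    """Csiszar f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i)
    for discrete distributions p, q (assumes q_i > 0 for all i)."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Two members of the Ali-Silvey-Csiszar-Arimoto class:
def kl(t):
    # Kullback-Leibler generator f(t) = t log t (with 0 log 0 := 0)
    return t * math.log(t) if t > 0 else 0.0

def tv(t):
    # total-variation generator f(t) = |t - 1| / 2
    return 0.5 * abs(t - 1.0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(f_divergence(p, q, kl))   # KL(P||Q) ≈ 0.511
print(f_divergence(p, q, tv))   # total variation distance = 0.4
```

Different choices of f (Hellinger, chi-square, alpha-divergences, ...) plug into the same formula, which is one way to see why the choice of divergence is a genuine modeling decision.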
Maurice de Gosson
Professor, Senior Researcher at the University of Vienna, Faculty of Mathematics, NuHAG group. https://homepage.univie.ac.at/maurice.de.gosson
Title: Gaussian states from a symplectic geometry point of view
Abstract: Gaussian states play a ubiquitous role in quantum information theory and in quantum optics because they are easy to produce in the laboratory and have, in addition, important extremality properties. Of particular interest are their separability properties. Even though major advances have been made in their study in recent years, the topic is still largely open. In this talk we will discuss separability questions for Gaussian states from a rigorous point of view using symplectic geometry, and present some new results and properties.
References:
- M. de Gosson, On the Disentanglement of Gaussian Quantum States by Symplectic Rotations. C. R. Acad. Sci. Paris, Volume 358, issue 4, 459-462 (2020)
- M. de Gosson, On Density Operators with Gaussian Weyl Symbols. In Advances in Microlocal and Time-Frequency Analysis, Springer (2020)
- M. de Gosson, Symplectic Coarse-Grained Classical and Semiclassical Evolution of Subsystems: New Theoretical Aspects. J. Math. Phys. no. 9, 092102 (2020)
- E. Cordero, M. de Gosson, and F. Nicola, On the Positivity of Trace Class Operators. Advances in Theoretical and Mathematical Physics 23(8), 2061-2091 (2019)
- E. Cordero, M. de Gosson, and F. Nicola, A characterization of modulation spaces by symplectic rotations. J. Funct. Anal. 278(11), 108474 (2020)
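As a minimal one-mode illustration of the quantum admissibility condition underlying the separability questions in the abstract (a sketch using the standard scalar reduction of the Robertson-Schrödinger condition, not code from the talk; the function name is hypothetical):

```python
# For a one-mode Gaussian state with 2x2 covariance matrix [[a, b], [b, c]],
# the Robertson-Schrodinger condition  Sigma + (i*hbar/2) J >= 0  reduces to
# det(Sigma) >= (hbar/2)^2  together with positive definiteness of Sigma.
# Not every classical Gaussian is a quantum state: this bound rules out
# covariances that are "too concentrated" in phase space.

def is_quantum_covariance(a, b, c, hbar=1.0):
    """Return True if [[a, b], [b, c]] is the covariance matrix of a
    bona fide one-mode Gaussian quantum state (units with hbar = 1)."""
    det = a * c - b * b
    positive_definite = a > 0 and det > 0
    return positive_definite and det >= (hbar / 2) ** 2

# Coherent state: Sigma = (1/2) I saturates the uncertainty bound.
print(is_quantum_covariance(0.5, 0.0, 0.5))   # True
# Over-squeezed Gaussian: fine classically, forbidden quantum-mechanically.
print(is_quantum_covariance(0.1, 0.0, 0.1))   # False
```

For n modes the same condition is checked on the 2n x 2n matrix Sigma + (i*hbar/2)J with J the standard symplectic form, which is where the symplectic-geometry viewpoint of the talk enters.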
Giuseppe LONGO
Centre Cavaillès, CNRS & ENS Paris, and School of Medicine, Tufts University, Boston. http://www.di.ens.fr/users/longo/
Title: Use and abuse of "digital information" in life sciences: is Geometry of Information a way out?
Abstract: Since WWII, the war of coding, and the understanding of the structure of DNA (1953), DNA has been considered the digital encoding of the Aristotelian homunculus. To this day, DNA is viewed as the "information carrier" of ontogenesis, the main or unique player in and pilot of phylogenesis. This has heavily affected our understanding of life and reinforced a mechanistic view of organisms and ecosystems, one component of our disruptive attitude towards ecosystemic dynamics. A different insight into DNA, as a major constraint on morphogenetic processes, brings in a possible "geometry of information" for biology, yet to be invented. One of the challenges is the need to move from a classical analysis of morphogenesis, in physical terms, to a "heterogenesis" more proper to the historicity of biology.
References
- Arezoo Islami, Giuseppe Longo. Marriages of Mathematics and Physics: a challenge for Biology. Invited paper, in The Necessary Western Conjunction to the Eastern Philosophy of Exploring the Nature of Mind and Life (K. Matsuno et al., eds), Special Issue of Progress in Biophysics and Molecular Biology, Vol. 131, pages 179-192, December 2017.
- Giuseppe Longo. How Future Depends on Past Histories and Rare Events in Systems of Life. Foundations of Science, 2017.
- Giuseppe Longo. Information and Causality: Mathematical Reflections on Cancer Biology. Organisms. Journal of Biological Sciences, vol. 2, n. 1, 2018.
- Giuseppe Longo. Information at the Threshold of Interpretation: Science as Human Construction of Sense. In Bertolaso, M., Sterpetti, F. (eds.), A Critical Reflection on Automated Science – Will Science Remain Human?, Springer, Dordrecht, 2019.
- Giuseppe Longo, Matteo Mossio. Geocentrism vs genocentrism: theories without metaphors, metaphors without theories. Interdisciplinary Science Reviews, 45 (3), pp. 380-405, 2020.

Welcome to the “Geometric Science of Information” 2021 Conference
On behalf of both the organizing and the scientific committees, it is our great pleasure to welcome all delegates, representatives and participants from around the world to the fifth international SEE conference on “Geometric Science of Information” (GSI’21), scheduled for July 2021.
GSI’21 benefits from scientific and financial sponsors.
The three-day conference is also organized in the framework of the relations established between SEE and scientific institutions and academic laboratories such as École Polytechnique, École des Mines ParisTech, INRIA, CentraleSupélec, Institut de Mathématiques de Bordeaux, and Sony Computer Science Laboratories.
The GSI conference cycle was initiated by the Brillouin Seminar Team as early as 2009. GSI’21 continues the first initiatives launched in 2013 (https://www.see.asso.fr/gsi2013) at Mines ParisTech, consolidated in 2015 (https://www.see.asso.fr/gsi2015) at École Polytechnique, and opened to new communities in 2017 (https://www.see.asso.fr/gsi2017) at Mines ParisTech and in 2019 (https://www.see.asso.fr/gsi2019) at ENAC Toulouse. In 2011 we organized an Indo-French workshop on “Matrix Information Geometry” that yielded an edited book in 2013, and in 2017 we collaborated on the CIRM seminar TGSI’17 “Topological & Geometrical Structures of Information” in Luminy (http://forum.csdc.org/category/94/tgsi2017). The GSI’19 proceedings were published by Springer in Lecture Notes (https://www.springer.com/gp/book/9783030269791).
GSI satellite events were organized in 2019 and 2020: FGSI’19 “Foundation of Geometric Science of Information” in Montpellier and the Les Houches seminar SPIGL’20 “Joint Structures and Common Foundations of Statistical Physics, Information Geometry and Inference for Learning”.
The technical program of GSI’21 covers all the main topics and highlights in the domain of “Geometric Science of Information,” including information-geometry manifolds of structured data/information and their advanced applications. These proceedings consist solely of original research papers, each carefully peer-reviewed by two or three experts and revised before acceptance.
Historical background
As with GSI’13, GSI’15, GSI’17, and GSI’19, GSI’21 addresses inter-relations between different mathematical domains like shape spaces (geometric statistics on manifolds and Lie groups, deformations in shape space, ...), probability/optimization & algorithms on manifolds (structured matrix manifolds, structured data/information, ...), relational and discrete metric spaces (graph metrics, distance geometry, relational analysis, ...), computational and Hessian information geometry, geometric structures in thermodynamics and statistical physics, algebraic/infinite-dimensional/Banach information manifolds, divergence geometry, tensor-valued morphology, optimal transport theory, manifold & topology learning, ... and applications like geometries of audio processing, inverse problems and signal/image processing. GSI’21 topics were enriched with contributions from Lie Group Machine Learning, Harmonic Analysis on Lie Groups, Geometric Deep Learning, Geometry of Hamiltonian Monte Carlo, Geometric & (Poly)Symplectic Integrators, Contact Geometry & Hamiltonian Control, Geometric and Structure-Preserving Discretizations, Probability Density Estimation & Sampling in High Dimension, Geometry of Graphs and Networks, and Geometry in Neuroscience & Cognitive Sciences.
At the turn of the century, new and fruitful interactions were discovered between several branches of science: information science (information theory, digital communications, statistical signal processing, ...), mathematics (group theory, geometry and topology, probability, statistics, sheaf theory, ...) and physics (geometric mechanics, thermodynamics, statistical physics, quantum mechanics, ...). The GSI conference cycle is an attempt to discover joint mathematical structures common to all these disciplines by elaborating a “General Theory of Information” embracing physics, information science, and cognitive science in a global scheme.
Frank Nielsen, co-chair: École Polytechnique, Palaiseau, France; Sony Computer Science Laboratories, Tokyo, Japan
Frédéric Barbaresco, co-chair: President of SEE ISIC Club (Ingénierie des Systèmes d'Information et de Communications),
Representative of KTD PCC (Key Technology Domain / Processing, Computing & Cognition) Board, THALES LAND & AIR SYSTEMS, France