Topological Data Analysis and Information Theory (online)
Lecture Series
It is our pleasure to invite you to attend the lecture series on Topological Data Analysis and Information Theory, organised by IAS fellow Fernando Nobrega Santos and Rick Quax.
Event details
Date: Tuesday 29 June 2021 and Monday 5 July 2021
Time: 14:00-17:30
Organised by: Fernando Nobrega Santos and Rick Quax
Higher-order interactions are interactions that go beyond a sequence of pairwise interactions. Multiple, qualitatively different approaches exist that aim to detect and quantify such interactions. Two of the most prominent are topological data analysis (TDA) and information theory (IT). Central questions addressed in this lecture series are: what do these two approaches have in common? How can they complement each other? And what could they bring to application domains, especially in neuroscience?
Programme
Tuesday 29 June 2021
- 14:00-14:10 Opening by IAS
- 14:10-15:10 Lecture by Herbert Edelsbrunner: TDA for information theoretically motivated distances
- 15:10-15:20 Break
- 15:20-16:20 Lecture by Giovanni Petri: Social contagion and norm emergence on simplicial complexes and hypergraphs
- 16:20-16:30 Break
- 16:30-17:30 Lecture by Chad Giusti: A brief introduction to topological neuroscience
Monday 5 July 2021
- 14:00-14:10 Opening by IAS
- 14:10-15:10 Lecture by Rick Quax (UvA - IAS): Brief introduction to information theory and the concept(s) of synergy
- 15:10-15:20 Break
- 15:20-16:20 Lecture by Fernando Rosas (Imperial College London, UK): Towards a deeper understanding of high-order interdependencies in complex systems
- 16:20-16:30 Break
- 16:30-17:30 Lecture by Pierre Baudot (Median Technologies, France): Information is Topology
Each lecture will be 50 minutes, followed by a Q&A. To participate, register below.
Tuesday 29 June 2021
- First lecture: TDA for information theoretically motivated distances
Speaker: Herbert Edelsbrunner (IST Austria)
Abstract: Given a finite set in a metric space, topological data analysis assesses its multi-scale connectivity, quantified in terms of a 1-parameter family of homology groups. Going beyond metrics, we show that the basic tools of topological data analysis also apply when we measure dissimilarity with Bregman divergences. A particularly interesting case is the relative entropy, whose infinitesimal version is known as the Fisher information metric. It relates to the Euclidean metric on the sphere and, perhaps surprisingly, the discrete Morse properties of random data behave the same as in Euclidean space.
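As a concrete illustration of the link the abstract mentions (our own sketch, not part of the talk materials): the Bregman divergence generated by the negative Shannon entropy is exactly the relative entropy. A minimal numerical check in Python, using natural logarithms:

```python
import numpy as np

def bregman_divergence(F, grad_F, p, q):
    """Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

# Negative Shannon entropy as the generating convex function (in nats).
neg_entropy = lambda p: np.sum(p * np.log(p))
grad_neg_entropy = lambda p: np.log(p) + 1.0

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])

# Both lines print ~0.2332: on the probability simplex, the Bregman
# divergence of negative entropy coincides with the KL divergence.
print(bregman_divergence(neg_entropy, grad_neg_entropy, p, q))
print(np.sum(p * np.log(p / q)))
```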
Short bio: Herbert Edelsbrunner graduated in 1982 from the Graz University of Technology. He worked in Austria from 1982 to 1985, in Illinois from 1985 to 1999, and in North Carolina from 1999 to 2012; since 2009 he has been a professor at IST Austria. He received the Waterman Award from the NSF in 1991 and the Wittgenstein Prize from the FWF in 2018. He is a member of the Academies of Sciences in the US, Germany, and Austria. His primary research area is computational geometry and topology. http://pub.ist.ac.at/~edels/
- Second lecture: Social contagion and norm emergence on simplicial complexes and hypergraphs
Speaker: Giovanni Petri (ISI Italy)
Abstract: Complex networks have been successfully used to describe dynamical processes of social and biological importance. Two classic examples are the spread of diseases and the emergence of shared norms in populations of networked interacting individuals. However, pairwise interactions are often not enough to fully characterize contagion or coordination processes, where influence and reinforcement are at work. Here we present recent results on the higher-order generalization of the SIS process and of the naming game. First, we numerically show that a higher-order contagion model displays novel phenomena, such as a discontinuous transition induced by higher-order interactions. We show analytically that the transition is discontinuous and that a bistable region appears where healthy and endemic states co-exist. Our results help explain why critical masses are required to initiate social changes and contribute to the understanding of higher-order interactions in complex systems. We then turn to the naming game as a prototypical example of norm emergence and show that higher-order interactions can create interesting novel phenomenologies; for example, they can explain how, when communication among agents is inefficient, even very small committed minorities are able to bring the system to a tipping point and flip the majority in the system. We conclude with an outlook on higher-order models, posing new questions and paving the way for modeling dynamical processes on these networks.
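To make the bistability claim concrete, here is a minimal sketch (our illustration, not the speaker's code) of the mean-field equation used in simplicial contagion models of this kind: rho is the infected density, beta1 the effective pairwise (link) infection pressure, beta2 the three-body (2-simplex) infection pressure, and mu the recovery rate. With beta1 below the ordinary SIS threshold, a strong enough beta2 makes healthy and endemic states co-exist:

```python
def simplicial_sis_meanfield(rho0, beta1, beta2, mu, dt=0.01, steps=20000):
    """Euler-integrate d(rho)/dt = -mu*rho + beta1*rho*(1-rho) + beta2*rho**2*(1-rho).

    beta1 and beta2 are effective rates that absorb the mean numbers of
    links and 2-simplices incident to a node.
    """
    rho = rho0
    for _ in range(steps):
        rho += dt * (-mu*rho + beta1*rho*(1 - rho) + beta2*rho**2*(1 - rho))
    return rho

mu, beta1, beta2 = 1.0, 0.8, 4.0  # pairwise pressure below the SIS threshold
for rho0 in (0.01, 0.30):
    rho_inf = simplicial_sis_meanfield(rho0, beta1, beta2, mu)
    print(f"rho0 = {rho0:.2f} -> stationary rho ~ {rho_inf:.3f}")
# Prints ~0.000 and ~0.732: two stable stationary states for the same
# parameters, i.e. a bistable region where healthy and endemic states
# co-exist, so the transition is discontinuous.
```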
Short Bio: Giovanni Petri is a Senior Research Scientist at ISI Foundation in Italy, working on topological approaches to complex networks and their underlying geometry, with special attention to the topology of brain structure and dynamics.
- Third lecture: A brief introduction to topological neuroscience
Speaker: Chad Giusti (University of Delaware - USA)
Abstract: Algebraic topology has the potential to become a fundamental tool in theoretical neuroscience, building on the foundations laid by network neuroscience, natively incorporating higher-order structure and a rich mathematical tool kit for describing qualitative structure in systems. In this talk I will briefly survey how topological methods have been applied to problems in neuroscience, then briefly discuss current directions and a few major challenges I see for the field.
Monday 5 July 2021
- First lecture: Brief introduction to information theory and the concept(s) of synergy
Speaker: Rick Quax (UvA - IAS)
Abstract: Quantifying synergy among stochastic variables is an important open problem in information theory. Information synergy occurs when multiple sources together predict an outcome variable better than the sum of single-source predictions. It is an essential phenomenon in biology, for instance in neuronal networks and cellular regulatory processes, where different information flows integrate to produce a single response, but it also appears in social cooperation processes and in statistical inference tasks in machine learning. Here we propose a metric of synergistic entropy and synergistic information from first principles. Rick will start with a brief, need-to-know introduction to key information-theoretic notions (entropy, mutual information) and then move on to the concept of synergistic information. He will highlight a few intuitive ways of trying to quantify synergistic information that exist today, including PID, a geometric approach, and his own proposed method.
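The canonical toy example of synergy (our illustration): two fair coin flips X1, X2 and Y = X1 XOR X2. Each input alone carries zero information about Y, yet jointly they determine it completely, so the joint mutual information exceeds the sum of the single-source ones. A short Python check, in bits:

```python
import numpy as np
from itertools import product

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    p = np.array([v for v in pmf.values() if v > 0])
    return -np.sum(p * np.log2(p))

# Joint distribution of (X1, X2, Y): X1, X2 fair and independent, Y = X1 XOR X2.
joint = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}

def marginal(idx):
    """Marginal pmf of the coordinates listed in idx."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

H = lambda idx: entropy(marginal(idx))
print(H((0,)) + H((2,)) - H((0, 2)))       # I(X1;Y)    = 0.0 bits
print(H((1,)) + H((2,)) - H((1, 2)))       # I(X2;Y)    = 0.0 bits
print(H((0, 1)) + H((2,)) - H((0, 1, 2)))  # I(X1,X2;Y) = 1.0 bit
```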
Short bio: Rick’s ambition is to study Complex Adaptive Systems with a focus on emergent information processing in dynamical systems. He is trying to span the spectrum from theoretical foundations to application domains, ensuring that new theory insights have direct impact on application-oriented research and vice versa. He is currently Assistant Professor in the Computational Science Lab at the University of Amsterdam and member of IAS.
- Second lecture: Towards a deeper understanding of high-order interdependencies in complex systems
Speaker: Fernando Rosas (Imperial College London, UK)
Abstract: We live in an increasingly interconnected world and, unfortunately, our understanding of interdependency is still rather limited. While bivariate relationships are at the core of most of our data analysis methods, there is still no principled theory to account for the different types of interactions that can occur between three or more variables. This talk explores the vast and largely uncharted territory of multivariate complexity, and discusses information-theoretic approaches that have recently been introduced to fill this important knowledge gap.
The first part of the talk is devoted to synergistic phenomena, which correspond to statistical regularities that affect the whole but not the parts. We explain how synergy can be effectively captured by information-theoretic measures inspired by the nature of higher brain functions, and how these measures allow us to map complex interdependencies into hypergraphs. The second part of the talk focuses on a new theory of what constitutes causal emergence and how it can be measured from time series data. This theory enables a formal, quantitative account of downward causation, and introduces “causal decoupling” as a complementary modality of emergence. Importantly, this not only establishes conceptual tools to frame conjectures about emergence rigorously, but also provides practical procedures to test them on data. We illustrate the considered analysis tools on different case studies, including cellular automata, baroque music, flocking models, and neuroimaging datasets.
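One concrete measure in this family is the O-information of Rosas and colleagues, defined for n variables as Omega(X) = (n-2) H(X) + sum_j [H(X_j) - H(X_{-j})], where X_{-j} drops the j-th variable; negative values signal synergy-dominated interdependence, positive values redundancy-dominated interdependence. A minimal plug-in estimator for discrete data (our sketch, not the authors' code):

```python
import numpy as np
from collections import Counter

def plugin_entropy(samples):
    """Plug-in Shannon entropy in bits from rows of discrete samples."""
    counts = Counter(map(tuple, samples))
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

def o_information(X):
    """O-information: negative => synergy-dominated, positive => redundancy."""
    n = X.shape[1]
    omega = (n - 2) * plugin_entropy(X)
    for j in range(n):
        omega += plugin_entropy(X[:, [j]]) - plugin_entropy(np.delete(X, j, axis=1))
    return omega

rng = np.random.default_rng(0)
x1 = rng.integers(0, 2, 100_000)
x2 = rng.integers(0, 2, 100_000)
print(o_information(np.column_stack([x1, x2, x1 ^ x2])))  # ~ -1.0 (XOR: synergy)
print(o_information(np.column_stack([x1, x1, x1])))       # ~ +1.0 (copies: redundancy)
```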
Short Bio: Fernando Rosas received the B.A. degree in music composition and a minor degree in philosophy (2002), the B.Sc. degree in mathematics (2006), and the M.S. and Ph.D. degrees in engineering sciences from the Pontificia Universidad Católica de Chile (PUC, 2012). He worked as a postdoctoral researcher at KU Leuven (Belgium), National Taiwan University (Taiwan), and Imperial College London (UK). He received the “Academic Award” given by the Department of Mathematics of the PUC for the best academic performance of his cohort, and was the recipient of a CONICYT Doctoral Fellowship from the Chilean Ministry of Education (2008), an “F+” Scholarship from KU Leuven (2014), and a Marie Skłodowska-Curie Individual Fellowship from the European Union (2017). He is currently working as a Postdoctoral Researcher at the Data Science Institute and the Centre for Psychedelic Research at Imperial College London. His research interests lie at the interface between data science & AI, complexity science, cognitive science, and neuroscience.
- Third lecture: Information is Topology
Speaker: Pierre Baudot (Median Technologies, France)
Abstract: Information theory, probability and statistical dependencies, and algebraic topology provide different views of a unified theory still under development, in which uncertainty goes as deep as Galois's theory of ambiguity, topoi, and motives. I will review foundational work, led notably by Bennequin and Vigneaux, that uniquely characterizes entropy as the first cohomology group on complexes of random variables and probability laws. This framework recovers most of the usual information functions, such as the KL divergence, cross entropy, Tsallis entropies, and differential entropy, in settings of different generality. Multivariate interaction/mutual informations (I_k and J_k) appear as coboundaries, and their negative minima, also called synergy, correspond to homotopical link configurations which, in the image of Borromean links, illustrate what purely collective interactions or emergence can be. These functions refine and characterize statistical independence in the multivariate case, in the sense that (X_1,...,X_n) are independent iff all the I_k = 0 (for 1 < k < n+1, whereas for the total correlations G_k it is sufficient that G_n = 0), generalizing the correlation coefficient. Concerning data analysis, restricting to the sub-case of simplicial random variable structures, applying the formalism to genetic transcription or to classical benchmark datasets using the open-access infotopo library reveals that higher-order statistical interactions are not only omnipresent but also constitutive of biologically relevant assemblies. On the machine learning side, information cohomology provides a topological and combinatorial formalization of supervised and unsupervised learning in deep networks, where the depth of the layers corresponds to the simplicial dimension and derivation-propagation is forward (cohomological).
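For reference (our addition, using the standard alternating-sum form of the multivariate mutual informations I_k), the Borromean-type synergy mentioned above can be made explicit with three fair bits satisfying X_3 = X_1 XOR X_2: every pairwise dependence vanishes while the triple interaction is strictly negative.

```latex
% Multivariate mutual information as an alternating sum of joint entropies:
I_k(X_1;\dots;X_k) = \sum_{i=1}^{k} (-1)^{i-1} \sum_{I \subseteq \{1,\dots,k\},\, |I| = i} H(X_I)

% Three fair bits with X_3 = X_1 \oplus X_2 (Borromean-like configuration):
I_3 = \underbrace{(1+1+1)}_{\text{singletons}}
    - \underbrace{(2+2+2)}_{\text{pairs}}
    + \underbrace{2}_{\text{triple}}
    = -1~\text{bit},
\qquad
I_2(X_i;X_j) = 0 \quad \text{for every pair } i \neq j.
```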
Short bio: Pierre Baudot graduated in 1998 from the École Normale Supérieure (Ulm) magister of biology, and obtained his PhD in the electrophysiology of visual perception, studying learning and information coding in natural conditions. He started to develop information topology with Daniel Bennequin at the Complex Systems Institute and the Mathematical Institute of Jussieu from 2006 to 2013, and then at the Max Planck Institute for Mathematics in the Sciences in Leipzig. He then joined Inserm in Marseille to develop data applications, notably to transcriptomics. Since 2018 he has worked at Median Technologies, a medical imaging AI company, on detecting and predicting cancers from CT scans. He received the K2 trophy (mathematics and applications, 2017) and a best paper prize from the journal Entropy (2019) for his contributions to topological information data analysis.