Summer School - Information theory: Inequalities, distances and analysis - June 25-29, 2018 - Institut Henri Poincaré, Paris, France
OFFICIAL WEBSITE - REGISTRATION
Presentation
This Summer School will consist of two courses given by Professors Sergey Bobkov (Minneapolis) and Mokshay Madiman (Delaware) on Information Theory and Convex Analysis. The aim is to bring together researchers from different communities (Probability, Analysis, Computer Science) in one place.
Participation of postdocs and PhD students is strongly encouraged. The school has a limited number of grants for young researchers (see the "registration" link above).
Abstracts
Sergey Bobkov (Minneapolis): Strong probability distances and limit theorems
Abstract: The lectures explore strong distances on the space of probability distributions, including total variation, relative entropy, chi-squared and the more general Rényi/Tsallis informational divergences, as well as the relative Fisher information. Special attention is given to distances from the normal law. The first part of the course is devoted to the general theory, and the second part to the behavior of these informational distances along convolutions and the associated central limit theorems.
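For quick reference, the distances listed above can be defined as follows for probability densities p and q on R^n (our notation, not taken from the course):
\[ \mathrm{TV}(p,q) = \tfrac{1}{2}\int |p-q| , \qquad D(p\,\|\,q) = \int p \log\frac{p}{q} , \qquad \chi^2(p\,\|\,q) = \int \frac{(p-q)^2}{q} , \]
\[ D_\alpha(p\,\|\,q) = \frac{1}{\alpha-1}\log\int p^{\alpha} q^{1-\alpha} \ (\alpha \neq 1) , \qquad I(p\,\|\,q) = \int p \left| \nabla \log\frac{p}{q} \right|^2 , \]
where the Tsallis divergence is the analogous expression with the logarithm replaced by x \mapsto x - 1.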
Mokshay Madiman (Delaware): Entropy power and related inequalities in continuous and discrete settings
Abstract: The lectures explore the behavior of Rényi entropies of convolutions of probability measures in a variety of ambient spaces. The first part of the course focuses on Euclidean spaces, beginning with the classical Shannon-Stam entropy power inequality and the closely related Brunn-Minkowski inequality, and developing several generalizations, variants, and reversals of these inequalities. The second part focuses on discrete abelian groups, where one sees close connections to additive combinatorics.
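For orientation, the two inequalities named above can be stated as follows (standard formulations, not quoted from the course). The Shannon-Stam entropy power inequality says that for independent random vectors X and Y in R^n with finite differential entropies,
\[ N(X+Y) \ \ge\ N(X) + N(Y), \qquad N(X) := \frac{1}{2\pi e}\, e^{2h(X)/n} , \]
while the Brunn-Minkowski inequality says that for nonempty compact sets A, B in R^n,
\[ |A+B|^{1/n} \ \ge\ |A|^{1/n} + |B|^{1/n} , \]
where |·| denotes Lebesgue measure and A+B the Minkowski sum; the formal parallel between the two is one starting point of the course.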
Related Materials
(1) Survey on entropic limit theorems (M. Madiman):
- Lecture 1. Introduction - What is information theory? The first question that we want to address is: “What is information?” Although there are several ways in which we might think of answering this question, the main rationale behind our approach is to distinguish information from data. We think of information as something abstract that we want to convey, while we think of data as a representation of information, something that is storable/communicable. This is best understood by some examples.
- Lecture 2. Basics / law of small numbers. Due to scheduling considerations, we postpone the proof of the entropic central limit theorem. In this lecture, we discuss basic properties of entropy and illustrate them by proving a simple version of the law of small numbers (Poisson limit theorem); a small numerical illustration is sketched in the notes following the reading list below. The next lecture will be devoted to Sanov’s theorem. We will return to the entropic central limit theorem in Lecture 4.
- Lecture 3. Sanov’s theorem. The goal of this lecture is to prove one of the most basic results in large deviations theory (a standard statement is recalled in the notes following the reading list below). Our motivations are threefold: 1. It is an example of a probabilistic question where entropy appears naturally. 2. The proof we give uses ideas typical of information theory. 3. We will need it later to discuss the transportation-information inequalities (if we get there).
- Lecture 4. Entropic CLT (1). The subject of the next lectures will be the entropic central limit theorem (entropic CLT) and its proof; the statement and the Fisher-information tools behind it are recalled in the notes following the reading list below.
- Lecture 5. Entropic CLT (2). The goal of this lecture is to prove monotonicity of Fisher information in the central limit theorem. Next lecture we will connect Fisher information to entropy, completing the proof of the entropic CLT.
- Lecture 6. Entropic CLT (3). In this lecture, we complete the proof of monotonicity of the Fisher information in the CLT, and begin developing the connection with entropy. The entropic CLT will be completed in the next lecture.
- Lecture 7. Entropic CLT (4). This lecture completes the proof of the entropic central limit theorem.
- Lecture 8. Entropic cone and matroids. This lecture introduces the notion of the entropic cone and its connection with entropy inequalities (its definition is recalled in the notes following the reading list below).
- Lecture 9. Concentration, information, transportation (1). The goal of the next two lectures is to explore the connections between concentration of measure, entropy inequalities, and optimal transportation.
- Lecture 10. Concentration, information, transportation (2). Recall the main proposition proved in the previous lecture, which is due to Bobkov and Götze (1999); its usual formulation is recalled in the notes following the reading list below.
(2) A survey on forward and reverse entropy power inequalities, 2017.
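Note on Lecture 2 (law of small numbers). One entropic form of the Poisson limit theorem states that if S_n is a sum of n independent Bernoulli(λ/n) variables, then the relative entropy D(P_{S_n} || Po(λ)) tends to 0 as n grows. The short Python script below is a minimal numerical sketch of this convergence; it is not taken from the lecture notes, and the function name and the chosen values of n and λ are ours.

import math

def kl_binomial_vs_poisson(n, lam):
    """Relative entropy D(Bin(n, lam/n) || Poisson(lam)) in nats."""
    p = lam / n
    kl = 0.0
    for k in range(n + 1):
        # Log-pmfs of Bin(n, p) and Poisson(lam) at k, kept in log space to avoid overflow.
        log_b = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                 + k * math.log(p) + (n - k) * math.log1p(-p))
        log_q = -lam + k * math.log(lam) - math.lgamma(k + 1)
        kl += math.exp(log_b) * (log_b - log_q)
    return kl

# The divergence shrinks as n grows, illustrating the law of small numbers.
for n in (5, 10, 50, 200, 1000):
    print(n, kl_binomial_vs_poisson(n, lam=2.0))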
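Note on Lecture 3 (Sanov's theorem). In its simplest finite-alphabet form, and up to the usual regularity assumptions on the set A of probability measures: if X_1, ..., X_n are i.i.d. with law Q on a finite set and \hat P_n is their empirical measure, then
\[ \frac{1}{n} \log \mathbb{P}\big( \hat P_n \in A \big) \ \longrightarrow\ - \inf_{P \in A} D(P \,\|\, Q) . \]
This is a standard formulation, not a quotation from the notes.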
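Note on Lectures 4-6 (entropic CLT). In a standard formulation due to Barron (1986): if X_1, X_2, ... are i.i.d. with mean zero and unit variance, S_n = (X_1 + ... + X_n)/\sqrt{n}, and Z is standard Gaussian, then D(S_n \,\|\, Z) \to 0 provided this relative entropy is finite for some n. The monotonicity discussed in Lecture 5 is usually stated as I(S_{n+1}) \le I(S_n) for all n, where I denotes Fisher information and the summands have suitably regular densities (Artstein, Ball, Barthe and Naor, 2004), and the classical bridge from Fisher information to entropy is de Bruijn's identity,
\[ \frac{d}{dt}\, h\big( X + \sqrt{t}\, Z \big) \ =\ \tfrac{1}{2}\, I\big( X + \sqrt{t}\, Z \big), \qquad t > 0, \]
with Z standard Gaussian independent of X. These are standard statements rather than quotations from the notes.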
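Note on Lecture 8 (entropic cone). For jointly distributed discrete random variables X_1, ..., X_n, record the vector of joint entropies (H(X_S)), over the nonempty subsets S of {1, ..., n}, as a point of R^{2^n - 1}; the closure of the set of all such points is the entropic cone. It satisfies the Shannon (polymatroid) inequalities, in particular submodularity,
\[ H(X_{S \cup T}) + H(X_{S \cap T}) \ \le\ H(X_S) + H(X_T) , \]
and for n \ge 4 it is known to be strictly smaller than the cone cut out by these inequalities (Zhang and Yeung, 1998). The notation here is ours, not necessarily that of the lecture.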
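Note on Lecture 10 (Bobkov and Götze, 1999). The result usually cited in this context is the dual formulation of the T_1 transportation inequality: a probability measure \mu on a metric space satisfies
\[ W_1(\nu, \mu) \ \le\ \sqrt{ 2C\, D(\nu \,\|\, \mu) } \qquad \text{for all probability measures } \nu \]
if and only if every 1-Lipschitz function f satisfies
\[ \mathbb{E}_\mu\, e^{\lambda ( f - \mathbb{E}_\mu f )} \ \le\ e^{C \lambda^2 / 2} \qquad \text{for all real } \lambda , \]
which yields Gaussian concentration for \mu. We state this only as a pointer; the proposition proved in the lectures may be formulated differently.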
Organizers
Sponsors