Wednesday 26 October 2016
János Körner (Université de Rome la Sapienza)
Shannon’s legacy in combinatorics
Abstract: In 1956 Claude Shannon defined the zero-error capacity of a discrete memoryless channel. He realized that determining it is immensely difficult and that a simple formula for this capacity might not exist. Nevertheless, through Claude Berge's fortunate notion of perfect graphs, inspired by Shannon's concept, a new and beautiful chapter of graph theory was born. More generally, Shannon's notion leads to a theory of the asymptotic behaviour of basic invariants in product structures. We explore the resulting merger of combinatorics and information theory.
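The product-structure question the abstract alludes to can be made concrete with the pentagon C5, Shannon's original hard example: the zero-error capacity is governed by the independence numbers of the strong powers of the channel's confusability graph. A minimal brute-force sketch (illustrative only; the graph and bounds are the classical ones, the code is not from the talk):

```python
from itertools import combinations, product

def c5_adjacent(i, j):
    # adjacency in the 5-cycle (pentagon) C5
    return (i - j) % 5 in (1, 4)

def strong_adjacent(u, v):
    # adjacency in the strong product: distinct vertices that are
    # equal or adjacent in every coordinate
    return u != v and all(a == b or c5_adjacent(a, b) for a, b in zip(u, v))

vertices = list(product(range(5), repeat=2))
adj = {(u, v): strong_adjacent(u, v) for u in vertices for v in vertices}

def independent(s):
    return not any(adj[u, v] for u, v in combinations(s, 2))

# independence number of the strong square C5 x C5, by brute force (25 vertices)
alpha2 = max(k for k in range(1, 7)
             if any(independent(s) for s in combinations(vertices, k)))
```

Although α(C5) = 2, the strong square has α = 5 (five pairwise distinguishable two-letter words, e.g. (i, 2i mod 5)), so the capacity of the pentagon is at least √5 > 2 values per letter; Lovász showed in 1979 that √5 is in fact the exact value.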
János Körner was born in Budapest on November 30, 1946. He received his degree in mathematics from Eötvös Loránd University, Budapest, in 1970. He was a member of the Rényi Institute of Mathematics of the Hungarian Academy of Sciences from 1970 to 1991. He has been working in Italy since 1992, where he became a professor at Sapienza University of Rome in 1994. His research concerns information theory and extremal combinatorics. He is an Honorary Member of the Rényi Institute of Mathematics. In 2010 he was elected an External Member of the Hungarian Academy of Sciences. In 2014 he received the Claude Shannon Award of the IEEE Information Theory Society.
Elisabeth Gassiat (Université de Paris-Sud)
Entropy, Compression, and Statistics
Abstract: Claude Shannon is the inventor of information theory. He introduced the notion of entropy as a measure of the information contained in a message viewed as the output of a stochastic source, and proved its link with the extent to which that message can be compressed. The connection with statistical methods runs deep, and information theory has been a source of inspiration for theoretical statisticians. Compression of sources with a large number of possible values can be understood as a problem of adaptive statistics. Some recent results on adaptive coding will be presented.
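The entropy-compression link the abstract mentions is Shannon's source coding theorem: any uniquely decodable code has expected length at least H bits per symbol, and an optimal prefix code stays below H + 1. A small illustrative sketch (not code from the talk), using a Huffman code on a toy dyadic source where the bound is met exactly:

```python
import heapq
import math

probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}

# source entropy in bits per symbol
H = -sum(p * math.log2(p) for p in probs.values())

# Huffman coding: repeatedly merge the two least probable subtrees,
# prefixing '0' to one side's codewords and '1' to the other's
heap = [(p, i, {s: ''}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
count = len(heap)
while len(heap) > 1:
    p1, _, c1 = heapq.heappop(heap)
    p2, _, c2 = heapq.heappop(heap)
    merged = {s: '0' + w for s, w in c1.items()}
    merged.update({s: '1' + w for s, w in c2.items()})
    heapq.heappush(heap, (p1 + p2, count, merged))
    count += 1
codes = heap[0][2]

# expected codeword length; the source coding theorem gives H <= avg_len < H + 1
avg_len = sum(probs[s] * len(w) for s, w in codes.items())
```

Because the probabilities here are powers of two, H = 1.75 bits and the Huffman code achieves it exactly; for general sources the average length falls strictly between H and H + 1.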
E. Gassiat est diplômée de l'école Polytechnique, a soutenu une thèse de mathématiques en 1988 à l'Université Paris-‐Sud et obtenu son HDR en 1993. Elle est professeur à l'Université Paris-‐Sud (Orsay) depuis 1998.
Anne Canteaut (INRIA)
How to Design a Secure and Efficient Encryption Algorithm: Shannon's Legacy
Abstract: In his seminal 1949 paper laying the foundations of cryptography, Claude Shannon stated two design principles aimed at thwarting statistical attacks: diffusion and confusion. These two techniques remain at the heart of modern symmetric algorithms and serve to optimize their resistance to the best-known cryptanalyses, notably differential and linear cryptanalysis. These principles guided, for instance, the design of the current symmetric encryption standard, the AES, and have motivated much research linking cryptography and discrete mathematics.
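The two principles can be illustrated with a toy substitution-permutation round (a hypothetical teaching example in the spirit of SPN ciphers, not the AES): confusion comes from a nonlinear S-box applied to each nibble, diffusion from a bit permutation that spreads every nibble across the whole block.

```python
# 16-bit toy substitution-permutation round (illustrative, not the AES)
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]
INV_SBOX = [SBOX.index(x) for x in range(16)]

def substitute(block, sbox):
    # confusion: a nonlinear 4-bit S-box applied to each nibble
    out = 0
    for i in range(4):
        out |= sbox[(block >> (4 * i)) & 0xF] << (4 * i)
    return out

def permute(block):
    # diffusion: transpose the 4x4 bit array, so each nibble is spread
    # across all four nibbles (this permutation is its own inverse)
    out = 0
    for i in range(16):
        if (block >> i) & 1:
            out |= 1 << ((i % 4) * 4 + i // 4)
    return out

def round_enc(block, key):
    # key mixing, then confusion, then diffusion
    return permute(substitute(block ^ key, SBOX))

def round_dec(block, key):
    # undo the steps in reverse order
    return substitute(permute(block), INV_SBOX) ^ key
```

Iterating such a round several times with independent round keys is what makes single-bit input differences affect the whole block, which is precisely what differential and linear cryptanalysis try to exploit when the S-box and permutation are poorly chosen.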
Anne Canteaut is a senior researcher (directrice de recherche) at Inria Paris and scientific head of the SECRET project-team. Her research area is symmetric cryptography. Her work covers the design of new cryptographic systems, attacks on existing ones, and the study of the underlying mathematical objects.
Mérouane Debbah (CentraleSupélec et Huawei France R&D)
Random Matrices and Telecommunications
Abstract: The asymptotic behaviour of the eigenvalues of large random matrices has been extensively studied since the fifties. One of the first related results was the work of Eugene Wigner, who in 1955 observed that the eigenvalue distribution of a standard Gaussian Hermitian matrix converges, as the dimensions of the matrix grow to infinity, to a deterministic probability distribution called the semicircular law. Since then, the study of the eigenvalue distribution of random matrices has triggered numerous works in both the theoretical physics and probability theory communities. As far as communication systems are concerned, however, until the mid-1990s intensive simulations were thought to be the only technique for gaining insight into how communications behave with many parameters. All this changed in 2000, when large-system analysis based on random matrix theory emerged as an appropriate tool for gaining intuitive insight into communication systems. In particular, the self-averaging effect of random matrices was shown to capture the parameters of interest of communication schemes. Since then, these results have led to very active research in many fields such as MIMO systems and Ultra-Dense Networks. This talk gives a comprehensive overview of random matrices and their application to the latest design of 5G networks.
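Wigner's convergence result, and the self-averaging effect the abstract mentions, can be checked empirically without computing any eigenvalues: the even moments of the semicircular law are the Catalan numbers (m2 = 1, m4 = 2), and the normalized traces of powers of a single scaled Wigner matrix already concentrate near them. A pure-Python sketch (illustrative; dimension and seed chosen arbitrarily):

```python
import random

random.seed(0)
n = 100

# symmetric Wigner matrix with iid standard Gaussian entries, scaled by 1/sqrt(n)
a = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i, n):
        g = random.gauss(0.0, 1.0) / n ** 0.5
        a[i][j] = a[j][i] = g

def matmul(x, y):
    # plain O(n^3) matrix product
    return [[sum(x[i][k] * y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

a2 = matmul(a, a)
a4 = matmul(a2, a2)

# normalized traces approximate the semicircle moments (Catalan numbers):
# (1/n) tr(A^2) -> 1 and (1/n) tr(A^4) -> 2 as n grows
m2 = sum(a2[i][i] for i in range(n)) / n
m4 = sum(a4[i][i] for i in range(n)) / n
```

That a single random sample reproduces deterministic limits this closely is the self-averaging phenomenon exploited in large-system analysis of communication schemes.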
Mérouane Debbah entered the École Normale Supérieure de Cachan (France) in 1996, where he received his M.Sc. and Ph.D. degrees. Since 2007 he has been a Full Professor at CentraleSupélec (Gif-sur-Yvette, France). From 2007 to 2014 he was director of the Alcatel-Lucent Chair on Flexible Radio. Since 2014 he has been Vice-President of the Huawei France R&D center and director of the Mathematical and Algorithmic Sciences Lab. His research interests lie in fundamental mathematics, algorithms, statistics, and information and communication sciences. M. Debbah is a recipient of the ERC grant MORE (Advanced Mathematical Tools for Complex Network Engineering). He is an IEEE Fellow and a WWRF Fellow. In his career he has received more than 16 best-paper awards, most recently the 2015 IEEE Communications Society Leonard G. Abraham Prize, the 2015 IEEE Communications Society Fred W. Ellersick Prize, and the 2016 IEEE Communications Society Best Tutorial Paper Award.
Olivier Rioul (Télécom-ParisTech)
Shannon’s Formula Wlog(1+SNR): A Historical Perspective
Abstract: As is well known, the milestone event that founded the field of information theory is the publication of Shannon’s 1948 paper entitled "A Mathematical Theory of Communication". This article brings together so many fundamental advances and strokes of genius that Shannon has become the hero of thousands of researchers, praised almost as a deity. One can say without exaggeration that Shannon's theorems are the mathematical theorems that made possible the digital world as we know it today. We first describe some of his most outstanding contributions, culminating with Shannon's emblematic capacity formula C = W.log(1+P/N), where W is the channel bandwidth and P/N is the channel signal-to-noise ratio (SNR). Incidentally, Hartley’s name is often associated with the same formula, owing to "Hartley’s rule": counting the highest possible number of distinguishable values for a given amplitude A and precision D yields a similar expression, log(1 + A/D). In the information theory community, the following "historical" statements are generally well accepted:
(1) Hartley put forth his rule in 1928, twenty years before Shannon;
(2) Shannon’s formula, as a fundamental trade-off between transmission rate, bandwidth, and signal-to-noise ratio, came unexpectedly in 1948;
(3) Shannon’s formula is exact while Hartley’s rule is imprecise;
(4) Hartley’s expression is not an appropriate formula for the capacity of a communication channel.
We show that each of these four statements is somewhat wrong:
(1) Hartley’s rule does not seem to be Hartley’s.
(2) At least seven other authors independently derived formulas very similar to Shannon’s in the same year 1948 — the earliest published original contribution being a Note at the Académie des Sciences by the French engineer Jacques Laplume.
(3) A careful calculation shows that Hartley’s rule does coincide with Shannon’s formula.
(4) Hartley’s rule is in fact mathematically correct as the capacity of a communication channel, where the noise is not Gaussian but uniform, and the signal limitation is not on the power but on the amplitude.
(This talk was presented in part at the MaxEnt 2014 conference in Amboise as joint work with José Carlos Magossi (Unicamp, São Paulo State, Brazil).)
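Point (4) above can be checked numerically: with inputs limited to amplitude A and additive noise uniform on [-D, D], the 1 + A/D levels spaced 2D apart are always decoded without error, so the scheme achieves exactly log2(1 + A/D) bits per channel use. An illustrative sketch (parameters and trial count chosen arbitrarily):

```python
import math
import random

random.seed(1)
A, D = 8.0, 1.0  # amplitude limit and uniform-noise half-width (A/D an integer here)
levels = [-A + 2 * k * D for k in range(int(A / D) + 1)]  # 1 + A/D levels, 2D apart

# the rate of this error-free scheme is exactly Hartley's log2(1 + A/D)
rate = math.log2(len(levels))

errors = 0
for _ in range(10000):
    x = random.choice(levels)
    y = x + random.uniform(-D, D)  # uniform noise: no Gaussian tail to overlap levels
    decoded = min(levels, key=lambda v: abs(v - y))  # nearest-level decoding
    errors += decoded != x
```

Because the noise has bounded support, the output intervals around distinct levels never overlap, which is exactly why Hartley's rule is the true capacity for the amplitude-limited uniform-noise channel rather than for the power-limited Gaussian one.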
Olivier Rioul (PhD, HDR) is professor at Télécom ParisTech and École Polytechnique, France. His research interests are in applied mathematics and include various, sometimes unconventional, applications of information theory such as inequalities in statistics, hardware security, and experimental psychology.