Geometric Science of Information
The objective of this group is to bring together pure and applied mathematicians, physicists, and engineers who share an interest in geometric tools and their applications. It aims to organise conferences and seminars, to promote collaborative local, European, and international research projects, and to disseminate research results across the related domains.
Machine learning is an interdisciplinary field at the intersection of mathematical statistics and computer science. It studies statistical models and algorithms for deriving predictors, or meaningful patterns, from empirical data. Machine learning techniques are applied in search engines, speech recognition and natural language processing, image detection, robotics, etc. In our course we address the following questions:
What is the mathematical model of learning? How do we quantify the difficulty/hardness/complexity of a learning problem? How do we choose a learning model and learning algorithm? How do we measure the success of machine learning?
The syllabus of our course:
- Supervised learning, unsupervised learning
- Generalization ability of machine learning
- Support vector machine, Kernel machine
- Neural networks and deep learning
- Bayesian machine learning and Bayesian networks.
Recommended literature:
- S. Shalev-Shwartz and S. Ben-David, Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press, 2014.
- S. Theodoridis, Machine Learning: A Bayesian and Optimization Perspective, Elsevier, 2015.
- M. Mohri, A. Rostamizadeh, and A. Talwalkar, Foundations of Machine Learning, MIT Press, 2012.
- H. V. Lê, Mathematical foundations of machine learning, lecture notes.
During the course we shall discuss topics for a term-paper assignment, which can serve as the exam.
The first meeting will take place at 10:40 AM on Thursday, October 3, 2019, in the seminar room MU MFF UK (3rd floor). Anybody interested in the lecture course, please contact me by email at hvle [ at] math.cas.cz to arrange a more suitable lecture time.
Location: Institute of Mathematics of the Czech Academy of Sciences, Zitna 25, 11567 Praha 1, Czech Republic
Lecture course (NMAG 469, Fall term 2019-2020)
- Mathematical foundations of machine learning. The first meeting: October 03, Thursday, 10:40-12:10, in the seminar room MU MFF UK (3rd floor).
- Learning, machine learning and artificial intelligence
1.1. Learning, inductive learning and machine learning
1.2. A brief history of machine learning
1.3. Current tasks and types of machine learning
1.4. Basic questions in mathematical foundations of machine learning
- Statistical models and frameworks for supervised learning
2.1. Discriminative model of supervised learning
2.2. Generative model of supervised learning
2.3. Empirical Risk Minimization and overfitting
- Statistical models and frameworks for unsupervised learning and reinforcement learning
3.1. Statistical models and frameworks for density estimation
3.2. Statistical models and frameworks for clustering
3.3. Statistical models and frameworks for dimension reduction and manifold learning
3.4. Statistical model and framework for reinforcement learning
- Fisher metric and maximum likelihood estimator
4.1. The space of all probability measures and total variation norm
4.2. Fisher metric on a statistical model
4.3. The Fisher metric, MSE and Cramér-Rao inequality
4.4. Efficient estimators and MLE
4.5. Consistency of MLE
- Consistency of a learning algorithm
5.1. Consistent learning algorithm and its sample complexity
5.2. Uniformly consistent learning and VC-dimension
5.3. Fundamental theorem of binary classification
- Generalization ability of a learning machine and model selection
6.1. Covering number and sample complexity
6.2. Rademacher complexities and sample complexity
6.3. Model selection
- Support vector machines
7.1. Linear classifier and hard SVM
7.2. Soft SVM
7.3. Sample complexities of SVM
- Kernel based SVMs
8.1. Kernel trick
8.2. PSD kernels and reproducing kernel Hilbert spaces
8.3. Kernel based SVMs and their generalization ability
- Neural networks
9.1. Neural networks as computing devices
9.2. The expressive power of neural networks
9.3. Sample complexities of neural networks
- Training neural networks
10.1. Gradient and subgradient descent
10.2. Stochastic gradient descent (SGD)
10.3. Online gradient descent and online learnability
- Bayesian machine learning
11.1. Bayesian concept of learning
11.2. Estimating decisions using posterior distributions
11.3. Bayesian model selection
Appendix A. Some basic notions in probability theory
A.1. Dominating measures and the Radon-Nikodym theorem
A.2. Conditional expectation and regular conditional measure
A.3. Joint distribution and Bayes’ theorem
A.4. Transition measure, Markov kernel, and parameterized statistical models
Appendix B. Concentration-of-measure inequalities
B.1. Markov’s inequality
B.2. Hoeffding’s inequality
B.3. Bernstein’s inequality
B.4. McDiarmid’s inequality
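The concentration inequalities listed in Appendix B are what drive the sample-complexity bounds of the sections on consistency and generalization. As a quick numerical illustration (a sketch of ours, not part of the lecture notes), one can check Hoeffding's inequality, P(|p̂ − p| ≥ t) ≤ 2 exp(−2nt²), for Bernoulli samples:

```python
import math
import random

def hoeffding_bound(n, t):
    """Hoeffding's upper bound on P(|empirical mean - p| >= t) for n i.i.d. [0,1] samples."""
    return 2.0 * math.exp(-2.0 * n * t * t)

def deviation_frequency(p, n, t, trials, seed=0):
    """Empirical frequency of |mean of n Bernoulli(p) draws - p| >= t over repeated trials."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < p for _ in range(n)) / n
        hits += abs(mean - p) >= t
    return hits / trials

n, t = 200, 0.1
print(deviation_frequency(0.5, n, t, trials=2000))  # small, well below the bound
print(hoeffding_bound(n, t))                        # 2 * exp(-4), about 0.037
```

For n = 200 and t = 0.1 the bound is about 0.037, while the simulated deviation frequency is noticeably smaller: the inequality is distribution-free and hence not tight for any particular distribution.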
- Jean-Louis Koszul -
Springer, 2019
Offers a unique and unified overview of symplectic geometry; highlights the differential properties of symplectic manifolds; of great interest for the emerging field of "Geometric Science of Information".
This introductory book offers a unique and unified overview of symplectic geometry, highlighting the differential properties of symplectic manifolds. It consists of six chapters: Some Algebra Basics, Symplectic Manifolds, Cotangent Bundles, Symplectic G-spaces, Poisson Manifolds, and A Graded Case, concluding with a discussion of the differential properties of graded symplectic manifolds of dimensions (0,n). It is a useful reference resource for students and researchers interested in geometry, group theory, analysis and differential equations. This book is also inspiring in the emerging field of Geometric Science of Information, in particular the chapter on Symplectic G-spaces, where Jean-Louis Koszul develops Jean-Marie Souriau's tools related to the non-equivariant case of co-adjoint action on Souriau’s moment map through Souriau’s Cocycle, opening the door to Lie Group Machine Learning with Souriau-Fisher metric.
Special Issue: "Lie Group Machine Learning and Lie Group Structure Preserving Integrators" Entropy MDPI
Machine/deep learning is exploring extensions to more abstract spaces such as graphs, differential manifolds, and structured data. The most recent fruitful exchanges between geometric science of information and Lie group theory have opened new perspectives for extending machine learning to Lie groups. After the foundation of Lie group theory by Sophus Lie, Felix Klein, and Henri Poincaré, and building on Wilhelm Killing's study of Lie algebras, Élie Cartan achieved the classification of simple real Lie algebras and introduced the affine representation of Lie groups/algebras, applied systematically by Jean-Louis Koszul. In parallel, noncommutative harmonic analysis for non-Abelian groups has been addressed with the orbit method (coadjoint representation of a group) by many contributors (Jacques Dixmier, Alexander Kirillov, etc.). In physics, Valentine Bargmann, Jean-Marie Souriau, and Bertram Kostant provided the basic concepts of Symplectic Geometry to Geometric Mechanics, such as the KKS symplectic form on coadjoint orbits and the notion of the Momentum map associated with the action of a Lie group. Using these tools, Souriau also developed the theory of Lie Group Thermodynamics based on coadjoint representations. This set of tools could be revisited in the framework of Lie group machine learning to develop new schemes for processing structured data.
Structure-preserving integrators are numerical algorithms specifically designed to preserve the geometric properties of the flow of a differential equation, such as invariants, (multi-)symplecticity, volume preservation, and the configuration manifold. As a consequence, such algorithms have proven to be highly superior in correctly reproducing the global qualitative behavior of the system. Structure-preserving methods have recently undergone significant development and today constitute a privileged route to building numerical algorithms with high reliability and robustness in various areas of computational mathematics. In particular, their capability for long-term computation makes these methods particularly well adapted to the new opportunities and challenges offered by scientific computing. Among the different ways to construct such numerical integrators, the application of variational principles (such as Hamilton's variational principle and its generalizations) has proven very powerful, since it is constructive and has a wide range of applicability.
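A minimal illustration of this qualitative superiority (a sketch of ours, not part of the call): on the harmonic oscillator H(q, p) = (p² + q²)/2, the plain explicit Euler method steadily inflates the energy, while the symplectic Euler method keeps it bounded near its initial value.

```python
import math

def explicit_euler(q, p, h):
    """One step of plain explicit Euler for H = (p^2 + q^2)/2 (not structure preserving)."""
    return q + h * p, p - h * q

def symplectic_euler(q, p, h):
    """One step of symplectic Euler: update p first, then q using the new p."""
    p_new = p - h * q
    return q + h * p_new, p_new

def energy(q, p):
    """Hamiltonian of the harmonic oscillator."""
    return 0.5 * (p * p + q * q)

h, steps = 0.1, 1000
qe, pe = 1.0, 0.0  # explicit Euler trajectory
qs, ps = 1.0, 0.0  # symplectic Euler trajectory
for _ in range(steps):
    qe, pe = explicit_euler(qe, pe, h)
    qs, ps = symplectic_euler(qs, ps, h)

print(energy(qe, pe))  # blows up: each explicit Euler step multiplies the energy by 1 + h^2
print(energy(qs, ps))  # stays within O(h) of the initial value 0.5
```

The symplectic scheme does not conserve H exactly, but it exactly conserves a nearby "modified" Hamiltonian, which is why its energy error stays bounded over arbitrarily long times.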
An important specific situation, encountered in a wide range of applications from multibody dynamics to nonlinear beam dynamics and fluid mechanics, is that of ordinary and partial differential equations on Lie groups. In this case, one can additionally take advantage of the rich geometric structure of the Lie group and its Lie algebra in the construction of the integrators. Structure-preserving integrators that preserve the Lie group structure have been studied from many points of view and extended to a wide range of situations, including forced, controlled, constrained, nonsmooth, stochastic, and multiscale systems, in both the finite- and infinite-dimensional Lie group settings. They also naturally find applications in the extension of machine learning and deep learning algorithms to Lie group data.
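As a rough sketch (ours; the choice of method and all names are illustrative), the simplest Lie group integrator, the Lie-Euler method on SO(3), updates by the exact matrix exponential of a Lie algebra element and therefore stays on the group, whereas a naive additive Euler update drifts off it:

```python
import math

def mat_mul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def hat(w):
    """Identify a vector in R^3 with a skew-symmetric matrix in the Lie algebra so(3)."""
    x, y, z = w
    return [[0.0, -z, y], [z, 0.0, -x], [-y, x, 0.0]]

def expm_so3(w, h):
    """Exact exponential exp(h * hat(w)) via Rodrigues' formula: an element of SO(3)."""
    norm = math.sqrt(sum(c * c for c in w))
    I = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]
    if h * norm < 1e-12:
        return I
    theta = h * norm
    W, a, b = hat(w), math.sin(theta) / norm, (1.0 - math.cos(theta)) / (norm * norm)
    W2 = mat_mul(W, W)
    return [[I[i][j] + a * W[i][j] + b * W2[i][j] for j in range(3)] for i in range(3)]

def lie_euler(R, w, h, steps):
    """Lie-Euler update R <- R exp(h hat(w)): stays on SO(3) by construction."""
    E = expm_so3(w, h)
    for _ in range(steps):
        R = mat_mul(R, E)
    return R

def additive_euler(R, w, h, steps):
    """Naive Euler update R <- R + h R hat(w): leaves the group a little more each step."""
    W = hat(w)
    for _ in range(steps):
        RW = mat_mul(R, W)
        R = [[R[i][j] + h * RW[i][j] for j in range(3)] for i in range(3)]
    return R

def orthogonality_defect(R):
    """Largest entry of R^T R - I; zero iff R is orthogonal."""
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]
    M = mat_mul(Rt, R)
    return max(abs(M[i][j] - (1.0 if i == j else 0.0))
               for i in range(3) for j in range(3))

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
w = [0.3, -0.2, 0.9]  # constant angular velocity
R_lie = lie_euler(I3, w, 0.05, 2000)
R_add = additive_euler(I3, w, 0.05, 2000)
print(orthogonality_defect(R_lie))  # machine precision: the group structure is preserved
print(orthogonality_defect(R_add))  # large: the additive scheme drifts far off SO(3)
```

The design choice is the one the paragraph describes: all updates act through the Lie algebra and the exponential map, so membership in the configuration manifold is automatic rather than enforced by projection.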
This Special Issue will collect long versions of papers from contributions presented during the GSI'19 "Geometric Science of Information" conference (www.gsi2019.org), but it will not be limited to these authors and is open to the international communities involved in research on Lie group machine learning and Lie group structure-preserving integrators.
- Prof. Frédéric Barbaresco
- Prof. Elena Cellodoni
- Prof. François Gay-Balmaz
- Prof. Joël Bensoam
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
- Lie groups machine learning
- orbit method
- symplectic geometry
- geometric integrator
- symplectic integrator
- Hamilton’s variational principle
The Conference on Stochastic Geometry will be held at the Euler Mathematical Institute on September 16-20, 2019.
The Conference is organized and sponsored by:
- Euler Mathematical Institute
- Chebyshev Laboratory of St. Petersburg State University
The goal of the conference is to bring together the researchers who have experience in stochastic geometry and/or stochastic processes, to exchange ideas, and to stimulate new collaborations.
- I. Ibragimov
- Yu. Davydov
- D. Zaporozhets
Confirmed Invited Speakers (as of March 1, 2019):
- Alexander Bufetov
- Pierre Calka
- Nicolas Chenavier
- David Coupier
- Serguei Dachian
- Sergey Foss
- Friedrich Goetze
- Francesca Greselin
- Julian Grote
- Anna Gusakova
- Raphael Lachieze-Rey
- Guenter Last
- Alexander Litvak
- Julien Randon-Furling
- Zhan Shi
- Evgeny Spodarev
- Joseph Yukich
- Vladislav Vysotsky
- Elisabeth Werner
- Sergei Zuyev
Nadia Zaleskaya, Tatiana Vinogradova, Natalia Kirshner:
Les Rencontres de Probabilités 2019 in Rouen are a satellite event of the International Congress on Industrial and Applied Mathematics in Valencia. This is also the 2019 edition, extended to a full week, of the Rencontres de Probabilités, which have been organized every year in Rouen for nearly 20 years on the themes of statistical mechanics and particle systems.
The main themes represented this year are random geometry, analysis of algorithms, and particle systems. The program will include courses and talks on each of the three areas, bringing together international specialists from the different communities and strengthening the interactions between them. The participation of young researchers is particularly encouraged.
Registration is free but mandatory.
September 23-27, 2019
Université de Rouen Normandie, site du Madrillet,
UFR des Sciences et Techniques, Amphi D
- Valentin Féray, Universität Zürich
- Claudio Landim, CNRS/Université de Rouen Normandie & IMPA Rio
- Dieter Mitsche, Université Jean-Monnet-Saint-Étienne
- Matthias Reitzner, Universität Osnabrück
- Cristina Toninelli, CNRS/Université Paris Dauphine
- Imre Bárány, Hungarian Academy of Sciences
- Peter Bürgisser, Technische Universität Berlin
- Vincent Cohen-Addad, CNRS/Sorbonne Université
- Giambattista Giacomin, Université Paris Diderot
- Patricia Gonçalves, Universidade de Lisboa
- Jean-Baptiste Gouéré, Université de Tours
- Thierry Lévy, Sorbonne Université
- Ralph Neininger, Goethe-Universität Frankfurt
- Cyril Nicaud, Université Paris-Est Marne-la-Vallée
- Frank Redig, Technische Universiteit Delft
- Viet Chi Tran, Université de Lille
- Dimitrios Tsagkarogiannis, Università dell’Aquila
- Thierry Bodineau, CNRS/École Polytechnique
- Anna de Masi, Università dell'Aquila
- Jean-François Marckert, CNRS/Université de Bordeaux
- Brigitte Vallée, CNRS/Université de Caen Normandie
- Joseph E. Yukich, Lehigh University
- Pierre Calka, Université de Rouen Normandie
- Nathanaël Enriquez, Université Paris-Sud
- Xavier Goaoc, Université de Lorraine
- Mustapha Mourragui, Université de Rouen Normandie
- Ellen Saada, CNRS/Université Paris Descartes
- Edwige Auvray, CNRS/Université de Rouen Normandie
- Pierre Calka, Université de Rouen Normandie,
- Nicolas Forcadel, INSA Rouen Normandie
- Sandrine Halé, Université de Rouen Normandie
- Mustapha Mourragui, Université de Rouen Normandie
- Hamed Smail, Université de Rouen Normandie
Two NSF/NIH funded postdoc positions are available in the following fields:
- Computational or applied topology/geometry/graph/algebra
- Machine learning/deep learning
- AI-based drug design and discovery
Ideal candidates should have experience in code development, have demonstrated the potential for excellence in research, and hold a recent Ph.D. in mathematics, computer science, computational biophysics, computational chemistry, or bioinformatics. The selected candidates will be teamed up with top performers in recent D3R Grand Challenges, a worldwide competition series in computer-aided drug design. Salary depends on experience but will be at least $47.5k. The positions include standard faculty health benefits. Please send a CV to weig [at] msu.edu.
2 Ph.D. students in Machine Learning at Lund University (Sweden).
Position type: Ph.D. scholarship
Research area: Machine Learning
Start: September 2019 or later
Duration: 4 years
Where: Lund University - CS department
Application closing date: July 29, 2019
Supervisor: Professor Luigi Nardi (luigi.nardi [at] cs.lth.se)
Position description: https://lu.varbi.com/en/what:job/jobID:280143/
These projects, financed by WASP (Wallenberg AI, Autonomous Systems and Software Program), aim to introduce innovative algorithms and methodologies to overcome the limitations of multi-objective black-box optimization. They are part of a collaboration with Stanford University, and the students will be encouraged to apply to the WASP exchange program with Stanford to work closely with collaborators.
Topics of interest:
Derivative-free optimization (DFO)
Algorithm configuration and selection
Automated machine learning (AutoML)
Neural architecture search (NAS)
Learning to learn
Meta learning and transfer learning
Reinforcement learning (RL)
Optimization of neural networks
Evolutionary algorithms (EA)
Discrete optimization and NP-hard problem solving
Data-driven analysis of algorithms, hyperparameter importance, etc.
Some applications of interest:
Image classification, Natural Language Processing (NLP), Simultaneous localization and mapping (SLAM), Design space exploration (DSE), Optimizing compilers, Hardware design: CPU, GPU, FPGA, CGRA, ASIC.
Open post-doc position at Géoazur in collaboration with Inria, at Sophia
Antipolis, France, in the research area: Curvilinear network detection
on satellite images using AI, stochastic models and deep learning.
EXTENDED Submission deadline July 31, 2019
Open Position for a post-doc scientist at Géoazur
(https://geoazur.oca.eu/fr/acc-geoazur) in collaboration with Inria
(https://www.inria.fr/en/centre/sophia), at Sophia Antipolis (Nice
region), France, in the area of Computer Vision, Deep Learning and
Remote Sensing applied to curvilinear detection on both optical and SAR
satellite images (project abstract below).
Both Géoazur and Inria Sophia Antipolis are ideally located in the heart
of the French Riviera, inside the multi-cultural silicon valley of
Europe (i.e. Sophia Antipolis).
This position is funded by University Côte d'Azur (UCA).
Duration: 18 months
Starting date: between September 1st and December 1st 2019.
Salary: gross salary of 3000 EUR per month (i.e. approximately 2400 EUR net)
Please see the full announcement at https://euraxess.ec.europa.eu/jobs/411481
A strong academic background in Stochastic Modeling, Deep Learning,
Computer Vision, Remote Sensing, and Parallel Programming with GPUs
and/or multicore CPUs is required. A decent knowledge of Earth and
telluric features (especially faults) will be appreciated.
To apply, please email a full application to both Isabelle Manighetti
(manighetti[at]geoazur.unice.fr) and Josiane Zerubia
(josiane.Zerubia[at]inria.fr), indicating "UCA-AI-post-doc" in the e-mail subject.
The application should contain:
- a motivation letter describing academic strengths and experience related to this position
- CV including publication list
- at least two major publications in pdf
- minimum 2 reference letters
Curvilinear structure networks are widespread in both natural and anthropogenic systems, with examples ranging from angiography and the earth and environmental sciences to biology and human activities. Recovering the existence and architecture of these curvilinear networks is an essential and fundamental task in all the related domains. At present, there is an explosion of image data documenting these curvilinear structure networks. It is therefore of utmost importance to develop numerical approaches that can efficiently and automatically extract curvilinear networks from image data.
In recent years, a large body of work has been proposed to extract curvilinear networks. However, automated, high-quality curvilinear network extraction remains a challenging task, mainly because of the complexity of network shapes, low contrast in images, and the high annotation cost of training data. To address these difficulties, this project intends to develop a novel, minimally supervised curvilinear network extraction method combining deep neural networks with active learning: the deep neural networks are employed to automatically learn hierarchical, data-driven features of curvilinear networks, while active learning is exploited to achieve high-quality extraction using as few annotations as possible. Furthermore, composite and hierarchical heuristic rules will be designed to constrain the geometry of curvilinear structures and guide the growth of the curvilinear graph.
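Purely as an illustration of the active-learning idea described above (the 1-D toy model, all names, and the uncertainty-sampling criterion are our assumptions, not the project's actual method), a minimal loop that queries labels only near the current decision boundary might look like:

```python
def fit_threshold(labeled):
    """Toy stand-in for the deep network: a 1-D threshold at the midpoint of class means."""
    xs0 = [x for x, y in labeled if y == 0]
    xs1 = [x for x, y in labeled if y == 1]
    return (sum(xs0) / len(xs0) + sum(xs1) / len(xs1)) / 2.0

def active_learning(pool, oracle, budget):
    """Uncertainty sampling: repeatedly label the point closest to the current boundary."""
    labeled = [(min(pool), oracle(min(pool))), (max(pool), oracle(max(pool)))]
    unlabeled = set(pool) - {x for x, _ in labeled}
    for _ in range(budget):
        t = fit_threshold(labeled)
        x = min(unlabeled, key=lambda u: abs(u - t))  # most uncertain point
        labeled.append((x, oracle(x)))                # query the annotator
        unlabeled.remove(x)
    return fit_threshold(labeled)

# Toy data: 1-D points with a true class boundary at 0.37.
pool = [i / 100 for i in range(100)]
oracle = lambda x: int(x >= 0.37)
t = active_learning(pool, oracle, budget=20)
accuracy = sum(int(x >= t) == oracle(x) for x in pool) / len(pool)
print(t, accuracy)  # boundary recovered from ~22 labels instead of 100
```

The point of the sketch is the annotation economy: the model is retrained after each query, and labels are only requested where the current model is least certain, which is the mechanism the project proposes to use to keep the annotation cost low.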
The proposed approach will be tested and validated on the extraction of tectonic fractures and faults from a dense collection of satellite and aerial data and "ground truth" available at the Géoazur laboratory, in the framework of the Faults_R_Gems project co-funded by the University Côte d'Azur (UCA) and the French National Research Agency (ANR). We then intend to apply the new automatic extraction approach to other scenarios, such as road extraction in remote-sensing images of the Nice region and blood-vessel extraction in available medical image databases.