

  • Geo-Sci-Info



    Keynote Speaker:

    • Jean-Baptiste Hiriart-Urruty: Pierre de FERMAT (ca. 1605-1665): lawyer, philologist and illustrious mathematician ... but enigmatic

    posted in GSI2019 read more
  • Geo-Sci-Info


    Associate Prof. Dr.Sc. Hông Vân Lê

    DOWNLOAD PDF of the flyer of the course
    DOWNLOAD PDF of the LECTURE NOTES of the course

    Machine learning is an interdisciplinary field at the intersection of mathematical statistics and computer science. It studies statistical models and algorithms for deriving predictors, or meaningful patterns, from empirical data. Machine learning techniques are applied in search engines, speech recognition and natural language processing, image detection, robotics, etc. In our course we address the following questions:
    What is the mathematical model of learning? How to quantify the difficulty/hardness/complexity of a learning problem? How to choose a learning model and learning algorithm? How to measure the success of machine learning?
    The syllabus of our course:

    1. Supervised learning, unsupervised learning
    2. Generalization ability of machine learning
    3. Support vector machine, Kernel machine
    4. Neural networks and deep learning
    5. Bayesian machine learning and Bayesian networks.
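
The empirical-risk-minimization viewpoint behind syllabus items 1-2 can be sketched in a few lines (an illustrative toy, not course material; the threshold hypothesis class and all names here are made up for the example):

```python
# Minimal empirical risk minimization (ERM) sketch: pick the threshold
# classifier with the lowest training error on 1-D labeled data.
# Hypothesis class: h_t(x) = 1 if x >= t else 0, for a grid of thresholds t.

def erm_threshold(xs, ys, thresholds):
    """Return the threshold with minimal empirical risk (0-1 loss)."""
    def risk(t):
        preds = [1 if x >= t else 0 for x in xs]
        return sum(p != y for p, y in zip(preds, ys)) / len(ys)
    return min(thresholds, key=risk)

# Toy sample: points below roughly 0.5 labeled 0, above labeled 1.
xs = [0.1, 0.2, 0.3, 0.45, 0.55, 0.7, 0.8, 0.9]
ys = [0,   0,   0,   1,    1,    1,   1,   1]
best_t = erm_threshold(xs, ys, [i / 10 for i in range(11)])
print(best_t)  # 0.4
```

The course's questions then become concrete: how well does `best_t` generalize beyond this sample, and how many samples are needed for that guarantee?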

    Recommended Literature.

    1. S. Shalev-Shwartz and S. Ben-David, Understanding Machine Learning:
      From Theory to Algorithms, Cambridge University Press, 2014.
    2. S. Theodoridis, Machine Learning: A Bayesian and Optimization
      Perspective, Elsevier, 2015.
    3. M. Mohri, A. Rostamizadeh, A. Talwalkar, Foundations of Machine
      Learning, MIT Press, 2012.
    4. H. V. Lê, Mathematical foundations of machine learning, lecture notes

    During the course we shall discuss topics for term paper assignments,
    which may be counted as the exam.

    The first meeting will take place at 10:40 AM on Thursday, October 3,
    2019, in the seminar room of MU MFF UK (3rd floor). Anyone interested
    in the lecture course should contact me by email (hvle [at]) to
    arrange a more suitable lecture time.

    Location: Institute of Mathematics of the Czech Academy of Sciences, Zitna 25, 11567 Praha 1, Czech Republic


    Lecture course (NMAG 469, Fall term 2019-2020)

    • Mathematical foundations of machine learning. The first meeting: October 03, Thursday, 10:40-12:10, in the seminar room MU MFF UK (3rd floor).


    1. Learning, machine learning and artificial intelligence
      1.1. Learning, inductive learning and machine learning
      1.2. A brief history of machine learning
      1.3. Current tasks and types of machine learning
      1.4. Basic questions in mathematical foundations of machine learning
      1.5. Conclusion
    2. Statistical models and frameworks for supervised learning
      2.1. Discriminative model of supervised learning
      2.2. Generative model of supervised learning
      2.3. Empirical Risk Minimization and overfitting
      2.4. Conclusion
    3. Statistical models and frameworks for unsupervised learning and
      reinforcement learning
      3.1. Statistical models and frameworks for density estimation
      3.2. Statistical models and frameworks for clustering
      3.3. Statistical models and frameworks for dimension reduction and
      manifold learning
      3.4. Statistical model and framework for reinforcement learning
      3.5. Conclusion
    4. Fisher metric and maximum likelihood estimator
      4.1. The space of all probability measures and total variation norm
      4.2. Fisher metric on a statistical model
      4.3. The Fisher metric, MSE and Cramér-Rao inequality
      4.4. Efficient estimators and MLE
      4.5. Consistency of MLE
      4.6. Conclusion
    5. Consistency of a learning algorithm
      5.1. Consistent learning algorithm and its sample complexity
      5.2. Uniformly consistent learning and VC-dimension
      5.3. Fundamental theorem of binary classification
      5.4. Conclusions
    6. Generalization ability of a learning machine and model selection
      6.1. Covering number and sample complexity
      6.2. Rademacher complexities and sample complexity
      6.3. Model selection
      6.4. Conclusion
    7. Support vector machines
      7.1. Linear classifier and hard SVM
      7.2. Soft SVM
      7.3. Sample complexities of SVM
      7.4. Conclusion
    8. Kernel based SVMs
      8.1. Kernel trick
      8.2. PSD kernels and reproducing kernel Hilbert spaces
      8.3. Kernel based SVMs and their generalization ability
      8.4. Conclusion
    9. Neural networks
      9.1. Neural networks as computing devices
      9.2. The expressive power of neural networks
      9.3. Sample complexities of neural networks
      9.4. Conclusion
    10. Training neural networks
      10.1. Gradient and subgradient descent
      10.2. Stochastic gradient descent (SGD)
      10.3. Online gradient descent and online learnability
      10.4. Conclusion
    11. Bayesian machine learning
      11.1. Bayesian concept of learning
      11.2. Estimating decisions using posterior distributions
      11.3. Bayesian model selection
      11.4. Conclusion
      Appendix A. Some basic notions in probability theory
      A.1. Dominating measures and the Radon-Nikodym theorem
      A.2. Conditional expectation and regular conditional measure
      A.3. Joint distribution and Bayes’ theorem
      A.4. Transition measure, Markov kernel, and parameterized
      statistical model
      Appendix B. Concentration-of-measure inequalities
      B.1. Markov’s inequality
      B.2. Hoeffding’s inequality
      B.3. Bernstein’s inequality
      B.4. McDiarmid’s inequality
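
As a taste of topic 10 above, here is a minimal stochastic gradient descent on one-dimensional least squares (an illustrative sketch; the function names and parameters are not from the lecture notes):

```python
import random

# Minimal stochastic gradient descent (SGD) for 1-D least squares:
# fit y ~ w * x by sampling one example per step and following the
# gradient of the squared loss (y - w*x)^2 with respect to w.

def sgd_least_squares(data, lr=0.05, steps=2000, seed=0):
    rng = random.Random(seed)
    w = 0.0
    for _ in range(steps):
        x, y = rng.choice(data)
        grad = -2.0 * (y - w * x) * x   # d/dw of (y - w*x)^2
        w -= lr * grad
    return w

# Noise-free data with true slope 3: SGD should converge close to w = 3.
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]
w = sgd_least_squares(data)
print(round(w, 3))
```

The learning rate must keep each contraction factor 1 - 2*lr*x^2 inside (-1, 1), which is the scalar analogue of the step-size conditions discussed for gradient methods.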

    posted in Mathematical Foundations of Machine Learning - Online Course - Hong Van Le read more
  • Geo-Sci-Info

    • Introduction to Symplectic Geometry, Jean-Louis Koszul,
      (re-edition) 2019, Springer LINK Video
      Offers a unique and unified overview of symplectic geometry; highlights the differential properties of symplectic manifolds; of great interest for the emerging field of "Geometric Science of Information".
      This introductory book offers a unique and unified overview of symplectic geometry, highlighting the differential properties of symplectic manifolds. It consists of six chapters: Some Algebra Basics, Symplectic Manifolds, Cotangent Bundles, Symplectic G-spaces, Poisson Manifolds, and A Graded Case, concluding with a discussion of the differential properties of graded symplectic manifolds of dimensions (0,n). It is a useful reference for students and researchers interested in geometry, group theory, analysis and differential equations. The book is also inspiring for the emerging field of Geometric Science of Information, in particular the chapter on Symplectic G-spaces, where Jean-Louis Koszul develops Jean-Marie Souriau's tools for the non-equivariant case of the coadjoint action on Souriau's moment map through Souriau's cocycle, opening the door to Lie group machine learning with the Souriau-Fisher metric.

    posted in Preprints - Books - Archivs - Journal special edition (Entropy...) read more
  • Geo-Sci-Info


    Special Issue: "Lie Group Machine Learning and Lie Group Structure Preserving Integrators" Entropy MDPI


    Download Flyer

    Machine/deep learning is being extended to more abstract spaces such as graphs, differential manifolds, and structured data. The most recent fruitful exchanges between the geometric science of information and Lie group theory have opened new perspectives for extending machine learning to Lie groups. After the foundation of Lie group theory by Sophus Lie, Felix Klein, and Henri Poincaré, and building on Wilhelm Killing's study of Lie algebras, Élie Cartan achieved the classification of simple real Lie algebras and introduced the affine representation of Lie groups/algebras, applied systematically by Jean-Louis Koszul. In parallel, noncommutative harmonic analysis for non-Abelian groups has been addressed with the orbit method (coadjoint representation of a group) by many contributors (Jacques Dixmier, Alexander Kirillov, etc.). In physics, Valentine Bargmann, Jean-Marie Souriau, and Bertram Kostant brought the basic concepts of symplectic geometry into geometric mechanics, such as the KKS symplectic form on coadjoint orbits and the notion of the momentum map associated to the action of a Lie group. Using these tools, Souriau also developed the theory of Lie group thermodynamics based on coadjoint representations. This set of tools can be revisited in the framework of Lie group machine learning to develop new schemes for processing structured data.

    Structure-preserving integrators are numerical algorithms specifically designed to preserve the geometric properties of the flow of a differential equation, such as invariants, (multi-)symplecticity, volume preservation, and the configuration manifold. As a consequence, such algorithms have proven to be highly superior in correctly reproducing the global qualitative behavior of the system. Structure-preserving methods have recently undergone significant development and today constitute a privileged route to numerical algorithms with high reliability and robustness in various areas of computational mathematics. In particular, their capability for long-term computation makes these methods well adapted to the new opportunities and challenges offered by scientific computation. Among the different ways to construct such numerical integrators, the application of variational principles (such as Hamilton's variational principle and its generalizations) has proven very powerful, since it is constructive and has a wide range of applicability.
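
A minimal example of the phenomenon described above (an illustrative sketch, not from the special issue): symplectic Euler applied to the harmonic oscillator keeps the numerical energy inside a bounded band for arbitrarily long times, whereas a non-geometric explicit Euler step lets it grow without bound.

```python
# Symplectic (semi-implicit) Euler for the harmonic oscillator
#   q' = p, p' = -q,  with energy H(q, p) = (q^2 + p^2) / 2.
# Updating p first and then q with the updated p gives a symplectic map,
# so the numerical energy stays within a band of width O(h) forever.

def symplectic_euler(q, p, h, steps):
    for _ in range(steps):
        p = p - h * q        # kick: uses the current q
        q = q + h * p        # drift: uses the updated p
    return q, p

def energy(q, p):
    return 0.5 * (q * q + p * p)

q, p = 1.0, 0.0                                  # exact energy is 0.5
q, p = symplectic_euler(q, p, h=0.1, steps=10000)
print(abs(energy(q, p) - 0.5) < 0.05)            # energy error stays small
```

The map exactly conserves the modified invariant q^2 + p^2 - h*q*p, which is why the energy error is bounded by roughly h/2 rather than growing with the number of steps.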

    An important specific situation encountered in a wide range of applications, from multibody dynamics to nonlinear beam dynamics and fluid mechanics, is that of ordinary and partial differential equations on Lie groups. In this case, one can additionally take advantage of the rich geometric structure of the Lie group and its Lie algebra in the construction of the integrators. Structure-preserving integrators that preserve the Lie group structure have been studied from many points of view, with extensions to a wide range of situations, including forced, controlled, constrained, nonsmooth, stochastic, and multiscale systems, in both the finite- and infinite-dimensional Lie group setting. They also naturally find applications in the extension of machine learning and deep learning algorithms to Lie group data.
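
A minimal sketch of a Lie group integrator in this spirit (illustrative only; the exponential map is written out via Rodrigues' formula): the Lie-Euler method updates the state by multiplication with a group element, so the iterates stay exactly on SO(3) up to round-off, unlike a naive additive Euler step.

```python
import math

# Lie-Euler integrator on SO(3): Q_{k+1} = Q_k * exp(h * hat(w)).
# Each update multiplies by an exact rotation matrix, so the iterates
# remain on the group (Q^T Q = I) to machine precision.

def hat(w):
    """Map a 3-vector to the corresponding skew-symmetric matrix."""
    x, y, z = w
    return [[0.0, -z, y], [z, 0.0, -x], [-y, x, 0.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def expm_so3(w, h):
    """Rodrigues' formula for exp(h * hat(w))."""
    t = h * math.sqrt(sum(c * c for c in w))
    K = [[h * e for e in row] for row in hat(w)]
    K2 = matmul(K, K)
    a = math.sin(t) / t if t else 1.0
    b = (1.0 - math.cos(t)) / (t * t) if t else 0.5
    I = [[float(i == j) for j in range(3)] for i in range(3)]
    return [[I[i][j] + a * K[i][j] + b * K2[i][j] for j in range(3)]
            for i in range(3)]

def lie_euler(Q, w, h, steps):
    E = expm_so3(w, h)       # constant angular velocity: one step matrix
    for _ in range(steps):
        Q = matmul(Q, E)
    return Q

def orthogonality_defect(Q):
    """Max entry of |Q^T Q - I|: zero iff Q is exactly orthogonal."""
    QtQ = matmul([list(col) for col in zip(*Q)], Q)
    return max(abs(QtQ[i][j] - float(i == j))
               for i in range(3) for j in range(3))

Q = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
Q = lie_euler(Q, w=[0.3, -0.2, 0.9], h=0.05, steps=5000)
print(orthogonality_defect(Q) < 1e-9)   # stays on SO(3) after 5000 steps
```

The additive update Q + h*Q*hat(w), by contrast, leaves the manifold at every step and the defect grows with the integration time.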

    This Special Issue will collect long versions of papers presented during the GSI'19 "Geometric Science of Information" conference, but will not be limited to these authors and is open to the international communities involved in research on Lie group machine learning and Lie group structure-preserving integrators.

    • Prof. Frédéric Barbaresco
    • Prof. Elena Celledoni
    • Prof. François Gay-Balmaz
    • Prof. Joël Bensoam
      Guest Editors

    Manuscript Submission Information

    Manuscripts should be submitted online by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

    Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

    Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.


    • Lie groups machine learning
    • orbit method
    • symplectic geometry
    • geometric integrator
    • symplectic integrator
    • Hamilton’s variational principle

    posted in Preprints - Books - Archivs - Journal special edition (Entropy...) read more
  • Geo-Sci-Info


    The Conference on Stochastic Geometry will be held at the Euler Mathematical Institute on September 16-20, 2019.

    The Conference is organized and sponsored by:

    • Euler Mathematical Institute
    • Chebyshev Laboratory of St. Petersburg State University

    The goal of the conference is to bring together the researchers who have experience in stochastic geometry and/or stochastic processes, to exchange ideas, and to stimulate new collaborations.

    Organizing committee:

    • I. Ibragimov
    • Yu. Davydov
    • D. Zaporozhets

    Confirmed Invited Speakers (as of March 1, 2019):

    • Alexander Bufetov
    • Pierre Calka
    • Nicolas Chenavier
    • David Coupier
    • Serguei Dachian
    • Sergey Foss
    • Friedrich Goetze
    • Francesca Greselin
    • Julian Grote
    • Anna Gusakova
    • Raphael Lachieze-Rey
    • Guenter Last
    • Alexander Litvak
    • Julien Randon-Furling
    • Zhan Shi
    • Evgeny Spodarev
    • Joseph Yukich
    • Vladislav Vysotsky
    • Elisabeth Werner
    • Sergei Zuyev

    Local coordinators:
    Nadia Zaleskaya, Tatiana Vinogradova, Natalia Kirshner.

    posted in Stochastic Geometry read more
  • Geo-Sci-Info



    The Rencontres de Probabilités 2019 in Rouen are a satellite event of the International Congress on Industrial and Applied Mathematics in Valencia. They are also the 2019 edition, extended to a full week, of the Rencontres de Probabilités, which have been organized every year in Rouen for nearly 20 years around statistical mechanics and particle systems.

    The main themes this year are random geometry, analysis of algorithms, and particle systems. The program will include courses and talks in each of the three areas, bringing together international specialists from the different communities and strengthening the interactions between them. Young researchers are particularly encouraged to participate.

    Registration is free but mandatory.

    Dates:

    September 23-27, 2019

    Location:

    Université de Rouen Normandie, Madrillet campus,

    UFR des Sciences et Techniques, Amphi D

    Download the poster



    • Valentin Féray, Universität Zürich
    • Claudio Landim, CNRS/Université de Rouen Normandie & IMPA Rio
    • Dieter Mitsche, Université Jean-Monnet-Saint-Étienne
    • Matthias Reitzner, Universität Osnabrück
    • Cristina Toninelli, CNRS/Université Paris Dauphine

    Plenary talks

    • Imre Bárány, Hungarian Academy of Sciences
    • Peter Bürgisser, Technische Universität Berlin
    • Vincent Cohen-Addad, CNRS/Sorbonne Université
    • Giambattista Giacomin, Université Paris Diderot
    • Patricia Gonçalves, Universidade de Lisboa
    • Jean-Baptiste Gouéré, Université de Tours
    • Thierry Lévy, Sorbonne Université
    • Ralph Neininger, Goethe-Universität Frankfurt
    • Cyril Nicaud, Université Paris-Est Marne-la-Vallée
    • Frank Redig, Technische Universiteit Delft
    • Viet Chi Tran, Université de Lille
    • Dimitrios Tsagkarogiannis, Università dell’Aquila


    Scientific committee

    • Thierry Bodineau, CNRS/École Polytechnique
    • Anna de Masi, Università dell'Aquila
    • Jean-François Marckert, CNRS/Université de Bordeaux
    • Brigitte Vallée, CNRS/Université de Caen Normandie
    • Joseph E. Yukich, Lehigh University

    Organizing committee

    • Pierre Calka, Université de Rouen Normandie
    • Nathanaël Enriquez, Université Paris-Sud
    • Xavier Goaoc, Université de Lorraine
    • Mustapha Mourragui, Université de Rouen Normandie
    • Ellen Saada, CNRS/Université Paris Descartes

    Local team

    • Edwige Auvray, CNRS/Université de Rouen Normandie
    • Pierre Calka, Université de Rouen Normandie
    • Nicolas Forcadel, INSA Rouen Normandie
    • Sandrine Halé, Université de Rouen Normandie
    • Mustapha Mourragui, Université de Rouen Normandie
    • Hamed Smail, Université de Rouen Normandie


    posted in Rouen Probability Meeting read more
  • Geo-Sci-Info

    Two NSF/NIH funded postdoc positions are available in the following fields:

    • Computational or applied topology, geometry, graph theory, or algebra

    • Machine learning/deep learning

    • AI-based drug design and discovery

    • Computational biophysics

    Ideal candidates should have experience in code development, have demonstrated the potential for excellence in research, and hold a recent Ph.D. in mathematics, computer science, computational biophysics, computational chemistry, or bioinformatics. The selected candidates will be teamed up with top performers in recent D3R Grand Challenges, a worldwide competition series in computer-aided drug design. Salary depends on experience but will be at least $47.5k. The positions include standard faculty health benefits. Please send a CV to weig [at]

    posted in Jobs offers - Call for projects read more
  • Geo-Sci-Info

    2 Ph.D. students in Machine Learning at Lund University (Sweden).

    Position type: Ph.D. scholarship
    Research area: Machine Learning
    Start: September 2019 or later
    Duration: 4 years
    Where: Lund University - CS department
    Application closing date: July 29, 2019
    Supervisor: Professor Luigi Nardi (luigi.nardi [at]
    Position description:

    These projects, financed by WASP (Wallenberg AI, Autonomous Systems and Software Program), aim to introduce innovative algorithms and methodologies to overcome the limitations of multi-objective black-box optimization. They are part of a collaboration with Stanford University, and the students will be encouraged to apply to the WASP exchange program with Stanford to work closely with collaborators.

    Apply here.
    Topics of interest:

    Black-box optimization
    Derivative-free optimization (DFO)
    Bayesian optimization
    Algorithm configuration and selection
    Active learning
    Automated machine learning (AutoML)
    Neural architecture search (NAS)
    Hyperparameter optimization
    Learning to learn
    Meta learning and transfer learning
    Reinforcement learning (RL)
    Optimization of neural networks
    Evolutionary algorithms (EA)
    Discrete optimization and NP-hard problem solving
    Data-driven analysis of algorithms, hyperparameter importance, etc.
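
As a point of reference for the topics above (a hypothetical sketch, unrelated to the actual projects): the simplest derivative-free baseline against which black-box optimizers are usually compared is pure random search, which only ever evaluates the objective, never its derivatives.

```python
import random

# Pure random search: a derivative-free baseline for black-box
# optimization. Only function evaluations are used; no gradients.

def random_search(f, bounds, budget, seed=0):
    """Sample `budget` uniform points in `bounds`; keep the best."""
    rng = random.Random(seed)
    best_x, best_y = None, float("inf")
    for _ in range(budget):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        y = f(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

# Toy objective: shifted sphere with minimum 0 at (1, -2).
def sphere(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

x, y = random_search(sphere, bounds=[(-5, 5), (-5, 5)], budget=20000)
print(y < 0.05)
```

Bayesian optimization and the other methods listed above earn their keep by reaching comparable objective values with orders of magnitude fewer evaluations than this baseline.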

    Some applications of interest:
    Image classification, Natural Language Processing (NLP), Simultaneous localization and mapping (SLAM), Design space exploration (DSE), Optimizing compilers, Hardware design: CPU, GPU, FPGA, CGRA, ASIC.

    posted in Jobs offers - Call for projects read more
  • Geo-Sci-Info

    Open post-doc position at Géoazur in collaboration with Inria, at Sophia
    Antipolis, France, in the research area: Curvilinear network detection
    on satellite images using AI, stochastic models and deep learning.

    EXTENDED submission deadline: July 31, 2019

    Open position for a post-doc scientist at Géoazur, in collaboration
    with Inria, at Sophia Antipolis (Nice region), France, in the area of
    Computer Vision, Deep Learning and Remote Sensing applied to
    curvilinear detection on both optical and SAR satellite images
    (project abstract below).
    Both Géoazur and Inria Sophia Antipolis are ideally located in the
    heart of the French Riviera, inside the multicultural silicon valley
    of Europe (i.e., Sophia Antipolis).

    This position is funded by Université Côte d'Azur (UCA).

    Duration: 18 months
    Starting date: between September 1 and December 1, 2019.
    Salary: gross 3000 EUR per month (i.e., approximately 2400 EUR net)

    Please see the full announcement.

    Candidate profile

    Strong academic background in Stochastic Modeling, Deep Learning,
    Computer Vision, Remote Sensing, and Parallel Programming with GPUs
    and/or multicore CPUs. A good knowledge of Earth and telluric features
    (especially faults) will be appreciated.

    To apply, please email a full application to both Isabelle Manighetti
    (manighetti[at] and Josiane Zerubia
    (josiane.Zerubia[at], indicating “UCA-AI-post-doc” in the e-mail
    subject line.
    The application should contain:

    • a cover letter demonstrating motivation, academic strengths
      and experience related to this position
    • a CV including a publication list
    • at least two major publications in PDF
    • at least two reference letters

    Project abstract

    Curvilinear structure networks are widespread in both natural and
    anthropogenic systems, with examples ranging from angiography, earth
    and environmental sciences, to biology and human activities.
    Recovering the existence and architecture of these curvilinear
    networks is an essential and fundamental task in all the related
    domains. At present, there has been an explosion of image data
    documenting these curvilinear structure networks. It is therefore of
    utmost importance to develop numerical approaches that can
    efficiently and automatically extract curvilinear networks from image
    data.

    In recent years, a large body of work has been proposed to extract
    curvilinear networks. However, automated, high-quality curvilinear
    network extraction remains a challenging task. This is mainly due to
    the complexity of network shapes, low contrast in images, and the
    high annotation cost of training data. To address the problems
    arising from these difficulties, this project intends to develop a
    novel, minimally supervised curvilinear network extraction method
    combining deep neural networks with active learning: the deep neural
    networks are employed to automatically learn hierarchical,
    data-driven features of curvilinear networks, while active learning
    is exploited to achieve high-quality extraction using as few
    annotations as possible. Furthermore, composite and hierarchical
    heuristic rules will be designed to constrain the geometry of
    curvilinear structures and guide the growth of the curvilinear graph.
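
The learner-plus-annotator combination described above can be illustrated with a generic uncertainty-sampling loop (a toy sketch; the threshold "model" and all names here are placeholders standing in for the project's deep networks and annotators):

```python
# Generic active-learning loop by uncertainty sampling: at each round,
# ask the annotator (oracle) to label the unlabeled item the current
# model is least sure about, then refit. The "model" is a trivial 1-D
# threshold scorer standing in for a deep network.

def fit_threshold(labeled):
    """Place the decision threshold midway between the class means."""
    pos = [x for x, y in labeled if y == 1]
    neg = [x for x, y in labeled if y == 0]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2.0

def active_learning(pool, oracle, seed_labels, rounds, batch=1):
    labeled = list(seed_labels)
    unlabeled = [x for x in pool if x not in dict(labeled)]
    for _ in range(rounds):
        t = fit_threshold(labeled)
        # Uncertainty = closeness to the current decision boundary.
        unlabeled.sort(key=lambda x: abs(x - t))
        for x in unlabeled[:batch]:
            labeled.append((x, oracle(x)))
        unlabeled = unlabeled[batch:]
    return fit_threshold(labeled)

# Toy data: true boundary at 0.5; the oracle supplies ground-truth labels.
pool = [i / 20 for i in range(21)]
oracle = lambda x: 1 if x >= 0.5 else 0
t = active_learning(pool, oracle, seed_labels=[(0.0, 0), (1.0, 1)], rounds=6)
print(abs(t - 0.5) < 0.1)
```

With only 6 queried labels, the requested annotations concentrate near the boundary, which is exactly the budget-saving behavior the project relies on at a much larger scale.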

    The proposed approach will be tested and validated on the extraction
    of tectonic fractures and faults from a dense collection of satellite
    and aerial data and “ground truth” available at the Géoazur
    laboratory in the framework of the Faults_R_Gems project co-funded by
    Université Côte d’Azur (UCA) and the French National Research Agency
    (ANR). We then intend to apply the new automatic extraction
    approaches to other scenarios, such as road extraction in remote
    sensing images of the Nice region and blood vessel extraction in
    available medical image databases.

    posted in Jobs offers - Call for projects read more