Find all the documentation and tutorials for version 0.2.3 on the Read the Docs website:
N.B.: This is still an alpha release! Please send me your feedback: over the summer of 2020, I will polish the user interface, implement Hausdorff divergences, add support for meshes, images and volumes, and clean up the documentation.
The GeomLoss library provides efficient GPU implementations for:
Kernel norms (also known as Maximum Mean Discrepancies).
Hausdorff divergences, which are positive-definite generalizations of the ICP loss, analogous to log-likelihoods of Gaussian Mixture Models.
Unbiased Sinkhorn divergences, which are cheap yet positive-definite approximations of Optimal Transport (Wasserstein) costs.
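For reference, the "unbiased" Sinkhorn divergence between two measures α and β is the standard debiased version of the entropic OT cost OT_ε (see e.g. Feydy et al., 2019):

S_ε(α, β) = OT_ε(α, β) − ½ OT_ε(α, α) − ½ OT_ε(β, β),

which vanishes when α = β and restores the positive definiteness that the raw entropic cost OT_ε lacks.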
These loss functions, defined between positive measures, are available through the custom PyTorch layers SamplesLoss, ImagesLoss and VolumesLoss, which allow you to work with weighted point clouds (of any dimension), density maps and volumetric segmentation masks. Each geometric loss comes with three backends:
A simple tensorized implementation, for small problems (< 5,000 samples).
A reference online implementation, with a linear (instead of quadratic) memory footprint, which can be used for finely sampled measures.
A very fast multiscale code, which uses an octree-like structure for large-scale problems in dimension <= 3.
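To make the tensorized route concrete, here is a minimal NumPy sketch of a log-domain Sinkhorn loop and the debiased Sinkhorn divergence between two uniform point clouds. This is a textbook simplification for exposition, not the GeomLoss implementation: the function names (`sinkhorn_cost`, `sinkhorn_divergence`) and the simple dual-cost formula are assumptions of this sketch.

```python
import numpy as np

def logsumexp(A, axis):
    """Numerically stable log(sum(exp(A))) along an axis."""
    m = A.max(axis=axis, keepdims=True)
    return np.squeeze(m + np.log(np.exp(A - m).sum(axis=axis, keepdims=True)), axis=axis)

def sinkhorn_cost(x, y, blur=0.05, p=2, n_iters=200):
    """Entropic OT cost OT_eps between two uniform point clouds,
    computed with log-domain Sinkhorn iterations (tensorized: O(N*M) memory)."""
    eps = blur ** p  # temperature, homogeneous to the cost
    N, M = len(x), len(y)
    # Explicit cost matrix C[i, j] = |x_i - y_j|^p / p -- the quadratic
    # memory footprint that limits this backend to small problems.
    C = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=2) ** p / p
    log_a, log_b = np.full(N, -np.log(N)), np.full(M, -np.log(M))  # uniform weights
    f, g = np.zeros(N), np.zeros(M)  # dual potentials
    for _ in range(n_iters):  # alternate Sinkhorn updates
        f = -eps * logsumexp(log_b[None, :] + (g[None, :] - C) / eps, axis=1)
        g = -eps * logsumexp(log_a[:, None] + (f[:, None] - C) / eps, axis=0)
    return f.mean() + g.mean()  # dual cost <a, f> + <b, g> with uniform weights

def sinkhorn_divergence(x, y, **kwargs):
    """Debiased, positive-definite Sinkhorn divergence S_eps(x, y)."""
    return (sinkhorn_cost(x, y, **kwargs)
            - 0.5 * sinkhorn_cost(x, x, **kwargs)
            - 0.5 * sinkhorn_cost(y, y, **kwargs))
```

With the actual library, the analogous quantity would be computed with the SamplesLoss layer (e.g. `SamplesLoss(loss="sinkhorn", p=2, blur=0.05)`) on PyTorch tensors, with gradients and the faster backends handled for you.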
GeomLoss is a simple interface to cutting-edge Optimal Transport algorithms. It provides: