EXTERNAL: Using geometry to form identifiable latent variable models and Isometric Gaussian Process Latent Variable Model
Søren Hauberg & Martin Jørgensen
Thursday 3rd December, 2020 12:00-13:30 Zoom
11:30-12:00: Isometric Gaussian Process Latent Variable Model - Martin Jørgensen
I present a generative unsupervised model in which the latent variable respects both the distances and the topology of the modeled data. The model leverages the Riemannian geometry of the generated manifold to endow the latent space with a well-defined stochastic distance measure, whose distances follow Nakagami distributions. Through a censoring process, these stochastic distances are encouraged to match observed distances along a neighborhood graph. The model is inferred by variational inference. I demonstrate how the model can encode invariances in the learned manifolds.
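As a rough illustration (not the paper's code), the Nakagami distribution mentioned above naturally models the norm of a Gaussian vector, which is why it arises for curve lengths under a Gaussian decoder. The sketch below moment-matches Nakagami parameters to sampled Gaussian norms and checks them against `scipy.stats.nakagami`; all names and the toy setup are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import nakagami

# The norm of a d-dimensional standard Gaussian (a chi variable) is a
# special case of the Nakagami distribution with nu = d/2, Omega = d.
rng = np.random.default_rng(0)
d = 3
samples = np.linalg.norm(rng.normal(size=(10000, d)), axis=1)

# Moment-matched Nakagami parameters: if X ~ Nakagami(nu, Omega), then
# E[X^2] = Omega and Var[X^2] = Omega^2 / nu.
omega = np.mean(samples**2)
nu = omega**2 / np.var(samples**2)

print(nu, omega)  # expect nu near d/2 = 1.5 and omega near d = 3

# Sanity check: the fitted Nakagami mean should match the sample mean.
# scipy parametrizes the scale as sqrt(Omega).
print(nakagami.mean(nu, scale=np.sqrt(omega)), samples.mean())
```

The same moment-matching trick is what makes Nakagami-distributed distances convenient to work with in a variational objective: both parameters are simple functions of the first two moments of squared lengths.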
12:00-13:00 Using geometry to form identifiable latent variable models - Prof Søren Hauberg
Generative models learn a compressed representation of data that is often used for downstream tasks such as interpretation, visualization and prediction via transfer learning. Unfortunately, the learned representations are generally not statistically identifiable, leading to a high risk of arbitrariness in the downstream tasks. We propose to use differential geometry to construct representations that are invariant to reparametrizations, thereby solving the bulk of the identifiability problem. We demonstrate that the approach is deeply tied to the uncertainty of the representation and that practical applications require high-quality uncertainty quantification. With the identifiability problem solved, we show how to construct better priors for generative models, and that the identifiable representations reveal signals in the data that were otherwise hidden.
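A small numerical sketch of the key geometric fact behind the talk: lengths measured along the decoded manifold do not change when the latent curve is reparametrized, which is what makes them usable as identifiable quantities. The decoder `f` below is a toy stand-in (a circle arc in a Euclidean ambient space), not the models discussed in the talk.

```python
import numpy as np

# Hypothetical decoder f: R -> R^2 mapping a 1-D latent to a circle arc.
def f(t):
    return np.stack([np.cos(t), np.sin(t)], axis=-1)

# Discrete curve length of the decoded curve (chord-length approximation).
def curve_length(ts):
    pts = f(ts)
    return np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))

# Two parametrizations of the same latent curve over [0, pi/2]:
uniform = np.linspace(0.0, np.pi / 2, 2001)
warped = (np.pi / 2) * np.linspace(0.0, 1.0, 2001) ** 2  # nonuniform speed

# Both give the arc length of a quarter circle, pi/2, despite the
# reparametrization -- the invariance the identifiable representation uses.
print(curve_length(uniform), curve_length(warped))
```

Any smooth monotone warp of the latent coordinate leaves the length unchanged, so quantities built from such lengths are invariant to the arbitrary reparametrizations that plague raw latent coordinates.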