Poster in NeurIPS 2024 Workshop: Machine Learning and the Physical Sciences
Path-minimizing Latent ODEs as Inference Models
Matt Sampson · Peter Melchior
Abstract:
Latent ODE models provide flexible descriptions of dynamic systems, but they can struggle with extrapolation and predicting complicated non-linear dynamics. The latent ODE approach implicitly relies on encoders to identify unknown system parameters and initial conditions, whereas the evaluation times are known and directly provided to the ODE solver. This dichotomy can be exploited by encouraging \emph{time-independent} latent representations. By replacing the common variational penalty in latent space with an $\ell_2$ penalty on the path length of each system, the models learn data representations that can easily be distinguished from those of systems with different configurations. We demonstrate superior results for simulation-based inference of the Lotka-Volterra parameters and initial conditions by using the latents as data summaries for a conditional normalizing flow.
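The core change described in the abstract is the regularizer: instead of a variational (KL) penalty on the latent distribution, the loss penalizes the $\ell_2$ path length of each system's latent trajectory. The following is a minimal, hypothetical sketch of that idea in plain NumPy; the forward-Euler solver, the discrete path-length definition, and the weight path_weight are illustrative assumptions, not the authors' implementation.

import numpy as np

def euler_odeint(f, z0, ts):
    """Integrate dz/dt = f(z, t) with forward Euler; returns array of shape (len(ts), dim)."""
    zs = [z0]
    for t0, t1 in zip(ts[:-1], ts[1:]):
        zs.append(zs[-1] + (t1 - t0) * f(zs[-1], t0))
    return np.stack(zs)

def path_length_penalty(latents):
    """Sum of squared l2 distances between consecutive latent states,
    a discrete stand-in for integrating ||dz/dt||^2 along the latent path."""
    diffs = np.diff(latents, axis=0)
    return np.sum(diffs ** 2)

# Toy latent dynamics: a linear vector field in a 2-d latent space (illustrative only).
A = np.array([[0.0, 1.0], [-1.0, -0.1]])
f = lambda z, t: A @ z

ts = np.linspace(0.0, 5.0, 100)          # known evaluation times passed to the solver
z0 = np.array([1.0, 0.0])                # in practice, produced by the encoder
latents = euler_odeint(f, z0, ts)

path_weight = 1e-2                       # hypothetical regularization strength
penalty = path_weight * path_length_penalty(latents)
print(f"path-length penalty term: {penalty:.4f}")
# During training, this term would be added to the reconstruction loss in place of
# the usual KL divergence of a variational latent ODE, encouraging latent
# representations that stay nearly constant over time for a given system.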