Poster in Workshop: Optimal Transport and Machine Learning
Likelihood Training of Schrödinger Bridges using Forward-Backward SDEs Theory
Tianrong Chen · Guan-Horng Liu · Evangelos Theodorou
Schrödinger Bridges (SBs) are an optimal transport problem that has received increasing attention in deep generative modeling for its mathematical flexibility compared to Score-based Generative Models (SGMs). However, it remains unclear whether the optimization principle of SBs relates to the modern training of deep generative models, which often relies on constructing parametrized log-likelihood objectives. This raises questions about the suitability of SB models as a principled alternative for generative applications. In this work, we present a novel computational framework for likelihood training of SB models grounded in Forward-Backward Stochastic Differential Equations (FBSDEs) Theory – a nonlinear stochastic optimal control principle that transforms the optimality condition of SBs into a set of SDEs. Crucially, these SDEs can be used to construct likelihood objectives for SBs that, surprisingly, generalize the ones for SGMs as special cases. This leads to a new optimization principle that inherits the same SB optimality yet retains full use of modern generative training techniques, and we show that the resulting training algorithm achieves encouraging results on generating realistic images on MNIST and CelebA.
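To make the claimed SGM special case concrete, the sketch below shows the standard denoising score matching objective that SGM training reduces to at a fixed noise level. This is an illustrative NumPy sketch of the SGM baseline, not the paper's SB training code; the function name `dsm_loss` and all parameters are hypothetical.

```python
import numpy as np

def dsm_loss(score_fn, x0, sigma, rng):
    """Denoising score matching loss at a fixed noise level sigma.

    Data x0 is perturbed with Gaussian noise; the score of the
    conditional q(x_tilde | x0) = N(x0, sigma^2 I) is -(x_tilde - x0)/sigma^2,
    i.e. -noise/sigma after scaling, so we regress score_fn onto that target.
    """
    noise = rng.standard_normal(x0.shape)
    x_tilde = x0 + sigma * noise          # perturbed samples
    target = -noise / sigma               # conditional score target
    diff = score_fn(x_tilde) - target
    return np.mean(np.sum(diff**2, axis=-1))

# Example: for x0 ~ N(0, I), the perturbed marginal is N(0, (1 + sigma^2) I),
# whose true score is x -> -x / (1 + sigma^2); it should beat a zero score.
rng = np.random.default_rng(0)
x0 = rng.standard_normal((10000, 2))
loss_zero = dsm_loss(lambda x: np.zeros_like(x), x0, 0.5, np.random.default_rng(1))
loss_true = dsm_loss(lambda x: -x / 1.25, x0, 0.5, np.random.default_rng(1))
```

In the paper's framework, this objective corresponds to fixing one of the two FBSDE drift policies, whereas full SB training learns both.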