Spotlight
in
Workshop: The Symbiosis of Deep Learning and Differential Equations -- III
ELeGANt: An Euler-Lagrange Analysis of Wasserstein Generative Adversarial Networks
Siddarth Asokan · Chandra Seelamantula
Keywords: [ Fourier series ] [ generative adversarial networks ] [ optimal GAN discriminator ] [ Poisson PDE ] [ Euler-Lagrange condition ]
We consider Wasserstein generative adversarial networks (WGANs) with a gradient-norm penalty and analyze the underlying functional optimization problem within a variational setting. The optimal discriminator in this setting is the solution to a Poisson differential equation and can be obtained in closed form, without having to train a neural network. We illustrate this by employing a Fourier-series approximation to solve the Poisson differential equation. Experimental results on synthesized low-dimensional Gaussian data demonstrate superior convergence behavior of the proposed approach compared with baseline WGAN variants that employ weight clipping, gradient penalties, or Lipschitz penalties on the discriminator. Further, within this setting, the optimal Lagrange multiplier can also be computed in closed form and serves as a proxy for measuring generator convergence. This work is an extended abstract, summarizing Asokan and Seelamantula (2023).
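The core computation described above, solving a Poisson equation by Fourier series rather than training a discriminator network, can be sketched in one dimension. The following is a minimal illustration, not the paper's exact formulation: it solves f''(x) = rho(x) on [0, 1] with zero boundary conditions, where the toy source rho (a difference of two Gaussians) stands in for the difference between the generator and data densities.

```python
import numpy as np

def gaussian(x, mu, sigma):
    """Gaussian density, used to build a toy source term."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

n_grid, n_terms = 1024, 64
x = np.linspace(0.0, 1.0, n_grid)
# Hypothetical source: difference of two densities, standing in for
# the (generator minus data) density term in the Poisson PDE.
rho = gaussian(x, 0.35, 0.05) - gaussian(x, 0.65, 0.05)

# Project rho onto the sine basis: rho(x) ~ sum_k b_k sin(k*pi*x).
k = np.arange(1, n_terms + 1)
basis = np.sin(np.pi * np.outer(k, x))          # shape (n_terms, n_grid)
b = 2.0 * np.trapz(basis * rho, x, axis=1)      # mode-wise projection

# Each sine mode solves the ODE in closed form:
# (sin(k*pi*x))'' = -(k*pi)^2 sin(k*pi*x), so divide by -(k*pi)^2.
f = (b / -(np.pi * k) ** 2) @ basis

# Sanity check: a finite-difference second derivative recovers rho.
h = x[1] - x[0]
f_xx = (f[2:] - 2 * f[1:-1] + f[:-2]) / h ** 2
err = np.max(np.abs(f_xx - rho[1:-1]))
print(f"max |f'' - rho| = {err:.3e}")
```

No gradient descent is involved: the "discriminator" f is obtained by a basis projection and a per-mode division, which is what makes the closed-form solution cheap compared with training a penalized critic network.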