Poster in NeurIPS 2024 Workshop: Machine Learning and the Physical Sciences
Systematic Uncertainties and Data Complexity in Normalizing Flows
Sandip Roy · Yonatan Kahn · Jessie Shelton · Victoria Tiki
Normalizing flows are a powerful technique for inferring probability distributions from finite samples, a highly relevant task across the physical sciences. Using two toy examples from astrophysics, we investigate the interplay between two different sources of uncertainty in normalizing flow analyses: varying the draws from the training distribution (data variance) versus varying the network initialization (initialization variance). We find that for sufficiently large training sets, initialization variance dominates for "simple" distributions while data variance dominates for more "complex" distributions, as measured by the Kullback-Leibler divergence. This suggests that normalizing flows trained on real-world datasets may (fortunately) be robust against initialization choices.
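The experimental protocol described above (fix the training set while varying the weight-initialization seed, then fix the seed while varying the training-set draw, and compare the spread of a KL-divergence estimate across runs) can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes PyTorch, a small RealNVP-style coupling flow, and a bimodal 2D Gaussian mixture as a stand-in for the paper's astrophysics toy examples; the names `Flow`, `train_flow`, and `kl_to_flow`, as well as all seeds, layer counts, and sample sizes, are hypothetical choices.

```python
import statistics
import torch
import torch.nn as nn

class Coupling(nn.Module):
    """RealNVP-style affine coupling: z1 = x1, z2 = (x2 - t(x1)) * exp(-s(x1))."""
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim // 2, hidden), nn.ReLU(),
                                 nn.Linear(hidden, dim))

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=-1)
        s, t = self.net(x1).chunk(2, dim=-1)
        s = torch.tanh(s)                      # bound scales for stable training
        z2 = (x2 - t) * torch.exp(-s)
        return torch.cat([x1, z2], dim=-1), -s.sum(dim=-1)  # z, log|det dz/dx|

class Flow(nn.Module):
    """Tiny normalizing flow: stacked couplings with coordinate rolls in between."""
    def __init__(self, dim=2, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(Coupling(dim) for _ in range(n_layers))
        self.base = torch.distributions.MultivariateNormal(
            torch.zeros(dim), torch.eye(dim))

    def log_prob(self, x):
        z, logdet = x, 0.0
        for layer in self.layers:
            z, ld = layer(z)
            logdet = logdet + ld
            z = torch.roll(z, shifts=1, dims=-1)  # so every coordinate gets transformed
        return self.base.log_prob(z) + logdet

def train_flow(data, seed, steps=2000, lr=1e-3):
    torch.manual_seed(seed)                    # seed controls weight initialization
    flow = Flow(dim=data.shape[-1])
    opt = torch.optim.Adam(flow.parameters(), lr=lr)
    for _ in range(steps):
        loss = -flow.log_prob(data).mean()     # maximum-likelihood objective
        opt.zero_grad(); loss.backward(); opt.step()
    return flow

@torch.no_grad()
def kl_to_flow(target, flow, n=50_000):
    """Monte Carlo estimate of KL(p_target || p_flow) from fresh target samples."""
    x = target.sample((n,))
    return (target.log_prob(x) - flow.log_prob(x)).mean().item()

# Toy target: a bimodal 2D Gaussian mixture (illustrative only).
mix = torch.distributions.Categorical(torch.tensor([0.5, 0.5]))
comp = torch.distributions.MultivariateNormal(
    torch.tensor([[-2.0, 0.0], [2.0, 0.0]]), torch.eye(2).expand(2, 2, 2))
target = torch.distributions.MixtureSameFamily(mix, comp)

n_train, n_runs = 5_000, 5

# Initialization variance: one fixed data draw, several weight-init seeds.
torch.manual_seed(0)
data = target.sample((n_train,))
kl_init = [kl_to_flow(target, train_flow(data, seed=s)) for s in range(n_runs)]

# Data variance: several independent data draws, one fixed weight-init seed.
kl_data = []
for s in range(n_runs):
    torch.manual_seed(100 + s)                 # controls the training-set draw
    kl_data.append(kl_to_flow(target, train_flow(target.sample((n_train,)), seed=0)))

print(f"std of KL over inits:      {statistics.stdev(kl_init):.4f}")
print(f"std of KL over data draws: {statistics.stdev(kl_data):.4f}")
```

If the abstract's finding carries over to this toy setup, the first printed spread would dominate for a "simple" (e.g., unimodal) target and the second for a more "complex" one, once the training set is sufficiently large.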