Poster in Workshop: D3S3: Data-driven and Differentiable Simulations, Surrogates, and Solvers
SepONet: Efficient Large-Scale Physics-Informed Operator Learning
Xinling Yu · Sean Hooten · Ziyue Liu · Yequan Zhao · Marco Fiorentino · Thomas Van Vaerenbergh · Zheng Zhang
Keywords: [ scientific machine learning ] [ operator learning ] [ physics-informed neural network ] [ separation of variables ] [ partial differential equations ] [ deep operator network (DeepONet) ]
We introduce Separable Operator Networks (SepONet), a novel framework that significantly improves the efficiency of physics-informed operator learning. SepONet uses independent trunk networks to learn basis functions separately for different coordinate axes, enabling faster and more memory-efficient training via forward-mode automatic differentiation. We provide a universal approximation theorem for SepONet, proving that it applies to arbitrary operator learning problems, and then validate its performance through comprehensive benchmarking against physics-informed DeepONet. Our results demonstrate SepONet's superior performance across various nonlinear and inseparable PDEs, with its advantages growing with problem complexity, dimension, and scale. Open source code is available at https://github.com/HewlettPackard/separable-operator-networks.
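The separable construction can be illustrated with a short JAX sketch. This is a minimal, hypothetical example, not the released implementation: names such as `mlp`, `seponet_forward`, and `rank`, and the two-axis setup, are assumptions made for clarity. The intent is only to show independent per-axis trunk networks whose 1D bases are combined by an outer product and contracted with branch-network coefficients.

```python
# Minimal, illustrative SepONet-style forward pass in JAX (a sketch, not the
# authors' code). Two coordinate axes (t, x) are assumed for concreteness.
import jax
import jax.numpy as jnp


def mlp(params, x):
    """Simple fully connected network; params is a list of (W, b) pairs."""
    for W, b in params[:-1]:
        x = jnp.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b


def init_mlp(key, sizes):
    keys = jax.random.split(key, len(sizes) - 1)
    return [
        (jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
        for k, m, n in zip(keys, sizes[:-1], sizes[1:])
    ]


def seponet_forward(branch_params, trunk_params, u_sensors, axis_grids):
    """Evaluate the learned operator on a tensor-product grid.

    u_sensors : (m,) samples of the input function at fixed sensor points
    axis_grids: list of d 1D arrays of collocation points, one per axis
    Returns predictions of shape (n_1, ..., n_d); shown here for d = 2.
    """
    # Branch net: coefficients for r basis functions.
    coeffs = mlp(branch_params, u_sensors)                       # (r,)
    # One independent trunk net per coordinate axis, each evaluated only on
    # its own 1D grid (d * n forward passes instead of n**d).
    bases = [
        jax.vmap(lambda pt, p=p: mlp(p, pt[None]))(g)            # (n_i, r)
        for p, g in zip(trunk_params, axis_grids)
    ]
    # Combine the per-axis bases by an outer product over axes and contract
    # with the branch coefficients.
    b_t, b_x = bases
    return jnp.einsum("r,ir,jr->ij", coeffs, b_t, b_x)


key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
rank, m = 32, 100
branch_params = init_mlp(k1, [m, 64, rank])
trunk_params = [init_mlp(k2, [1, 64, rank]), init_mlp(k3, [1, 64, rank])]

u_sensors = jnp.sin(jnp.linspace(0.0, jnp.pi, m))   # example input function
t_grid = jnp.linspace(0.0, 1.0, 64)
x_grid = jnp.linspace(0.0, 1.0, 64)
pred = seponet_forward(branch_params, trunk_params, u_sensors, [t_grid, x_grid])
print(pred.shape)  # (64, 64)
```

Because each trunk network depends on a single coordinate, derivatives with respect to each axis can be taken per trunk (e.g., with forward-mode tools such as `jax.jvp`) rather than through the full tensor-product grid, which is the source of the speed and memory advantages the abstract describes; the exact training setup in the released code may differ from this sketch.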