Poster in NeurIPS 2024 Workshop: Machine Learning and the Physical Sciences
Taylor Mode Neural Operators: Enhancing Computational Efficiency in Physics-Informed Neural Operators
Anas Jnini · Flavio Vella
In this paper, we propose a novel application of Taylor-mode Automatic Differentiation (AD) to efficiently compute high-order derivatives in physics-informed neural operators (PINOs). Traditional approaches to automatic differentiation, particularly reverse-mode AD, incur high memory costs and computational inefficiencies, especially for high-order Partial Differential Equations (PDEs) and large-scale neural networks. Our method leverages Taylor-mode AD to forward-propagate Taylor-series coefficients, enabling the efficient computation of high-order derivatives. We demonstrate our approach on two prominent neural operator architectures: DeepONets and Fourier Neural Operators (FNOs). Results indicate an order-of-magnitude speed-up over state-of-the-art methods for DeepONets and an eightfold acceleration for FNOs.
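To illustrate the core idea of forward-propagating Taylor-series coefficients, here is a minimal sketch using `jax.experimental.jet` (a Taylor-mode AD primitive in JAX). This is not the authors' implementation; the function `u` below is a hypothetical stand-in for a network output, and the sketch only shows how derivatives up to a given order can be obtained in a single forward pass rather than by nesting reverse-mode AD.

```python
import jax.numpy as jnp
from jax.experimental import jet

# Hypothetical stand-in for a PINO's output function u(x).
def u(x):
    return jnp.sin(x)

def taylor_derivatives(f, x, order):
    """Forward-propagate Taylor coefficients through f to obtain
    derivatives of f at x up to `order` in one pass (Taylor-mode AD)."""
    # Seed the input path x(t) = x + t: first coefficient 1, rest 0.
    series_in = ([1.0] + [0.0] * (order - 1),)
    primal_out, series_out = jet.jet(f, (x,), series_in)
    # With this seeding, series_out[k-1] is the k-th derivative of f at x.
    return primal_out, series_out

x0 = 0.7
u0, (du, d2u) = taylor_derivatives(u, x0, order=2)
```

For a second-order PDE residual such as u_xx, this yields the required derivative without building the nested computation graphs that make repeated reverse-mode differentiation memory-hungry.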