Poster
in
Workshop: AI4Mat-2024: NeurIPS 2024 Workshop on AI for Accelerated Materials Design

FB-HyDON: Parameter-Efficient Physics-Informed Operator Learning of Complex PDEs via Hypernetwork and Finite Basis Domain Decomposition

Milad Ramezankhani · Rishi Parekh · Anirudh Deodhar · Dagnachew Birru

Keywords: [ Operator learning ] [ domain decomposition ] [ hypernetwork ] [ physics informed ]


Abstract:

Deep operator networks (DeepONet) and neural operators have gained significant attention for their ability to learn mappings between infinite-dimensional function spaces and perform zero-shot super-resolution. However, these models often require large datasets for effective training. While physics-informed operators offer a data-agnostic learning approach, they introduce additional training complexities and convergence issues, especially in highly nonlinear systems. To overcome these challenges, we introduce Finite-Basis Physics-Informed HyperDeepONet (FB-HyDON), an advanced operator architecture featuring intrinsic domain decomposition. By leveraging hypernetworks and finite-basis functions, FB-HyDON effectively mitigates the training limitations associated with existing physics-informed operator learning methods. We validated our approach on the high-frequency harmonic oscillator, Burgers' equation at different viscosity levels, and the Allen-Cahn equation, demonstrating substantial improvements over other operator learning models.
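The two core ingredients named in the abstract, a hypernetwork that generates the weights of subdomain networks and finite-basis (partition-of-unity) domain decomposition, can be illustrated with a minimal sketch. The paper's actual architecture, training objective, and window functions are not specified here, so everything below (the cosine windows, the linear hypernetwork, the tiny 1-8-1 subdomain MLP, and all variable names) is an illustrative assumption, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def window(x, lo, hi):
    """Smooth cosine bump supported on (lo, hi); overlapping windows
    are normalised below into a partition of unity (assumed form)."""
    t = (x - lo) / (hi - lo)
    return np.where((t > 0) & (t < 1), 0.5 * (1 - np.cos(2 * np.pi * t)), 0.0)

H = 8  # hidden width of each (hypothetical) subdomain MLP

def hypernet(u, P):
    """Toy linear hypernetwork: projects a function encoding u to the
    full parameter vector of a 1-H-1 subdomain MLP (untrained sketch)."""
    theta = P @ u
    W1 = theta[:H].reshape(H, 1)
    b1 = theta[H:2 * H]
    W2 = theta[2 * H:3 * H].reshape(1, H)
    b2 = theta[3 * H]
    return W1, b1, W2, b2

def subnet(x, params):
    """Evaluate one generated subdomain MLP on query points x."""
    W1, b1, W2, b2 = params
    h = np.tanh(W1 @ x[None, :] + b1[:, None])   # (H, N)
    return (W2 @ h + b2).ravel()                 # (N,)

# Two overlapping subdomains covering [0, 1]
subdomains = [(-0.1, 0.6), (0.4, 1.1)]
u = rng.normal(size=16)            # toy encoding of the input function
x = np.linspace(0.0, 1.0, 101)     # query (trunk) points

# One weight-generating projection per subdomain (random, i.e. untrained)
n_params = 3 * H + 1
Ps = [0.1 * rng.normal(size=(n_params, u.size)) for _ in subdomains]

ws = np.stack([window(x, lo, hi) for lo, hi in subdomains])
ws /= ws.sum(axis=0, keepdims=True)  # normalise into a partition of unity
outs = np.stack([subnet(x, hypernet(u, P)) for P in Ps])
y = (ws * outs).sum(axis=0)          # blended global operator output

print(y.shape)  # (101,)
```

In a physics-informed setting, the blended output `y` would be differentiated and penalised against the PDE residual on each subdomain; the sketch only shows how the windowed subdomain networks compose into a single smooth prediction.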
