Poster in Workshop: Foundation Models for Science: Progress, Opportunities, and Challenges
Scale-consistent learning with neural operators
Zongyi Li · Samuel Lanthaler · Catherine Deng · Yixuan Wang · Kamyar Azizzadenesheli · Animashree Anandkumar
Keywords: [ Operator learning ] [ Physical simulation ] [ Navier-Stokes equation ] [ Scale symmetry ]
Data-driven models have emerged as a promising approach for solving partial differential equations (PDEs) in science and engineering. Previous machine learning (ML) models typically cover only a narrow distribution of PDE problems; for example, a model trained for the Navier-Stokes equations usually works only for a fixed Reynolds number and domain size. To overcome these limitations, we propose a data augmentation scheme based on scale-consistency properties of PDEs and design a scale-informed neural operator that can model a wide range of scales. Our formulation (i) leverages the fact that many PDEs are scale-consistent under rescaling of the spatial domain, and (ii) builds on the discretization-convergence property of neural operators, which allows them to be applied at arbitrary resolutions. Our experiments on the 2D Darcy flow, Helmholtz equation, and Navier-Stokes equations show that the proposed scale-consistency loss helps the scale-informed neural operator generalize to Reynolds numbers ranging from 250 to 10,000. This approach has the potential to significantly improve the efficiency and generalizability of data-driven PDE solvers in various scientific and engineering applications.
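To make the idea of a scale-consistency loss concrete, the following is a minimal PyTorch sketch of how such a penalty could be implemented, based only on the description in the abstract. It assumes a hypothetical neural operator `model(a, scale)` that takes an input field `a` and a scalar `scale` parameter (e.g., encoding domain size or Reynolds number); the crop factor, bilinear resampling, and the rule `scale / crop_factor` for adjusting the scale parameter are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F

def scale_consistency_loss(model, a, scale, crop_factor=2):
    """Sketch of a scale-consistency penalty (hypothetical interface).

    model(a, scale): neural operator mapping an input field `a` of shape
    (batch, channels, H, W) and a scalar `scale` to an output field of the
    same spatial shape. The exact rescaling rule for `scale` is PDE-specific
    and only illustrated here.
    """
    B, C, H, W = a.shape

    # Prediction on the full domain.
    u_full = model(a, scale)

    # Zoom into a sub-domain (lower-left block for crop_factor=2) and resample
    # it back to the original grid; discretization convergence of neural
    # operators is what justifies evaluating the model on this regridded input.
    h, w = H // crop_factor, W // crop_factor
    a_sub = F.interpolate(a[..., :h, :w], size=(H, W),
                          mode="bilinear", align_corners=False)

    # Adjust the scale parameter for the smaller physical domain
    # (illustrative choice; the true relation depends on the PDE).
    scale_sub = scale / crop_factor

    # Prediction on the rescaled sub-domain.
    u_sub = model(a_sub, scale_sub)

    # The full-domain prediction, restricted to the same sub-domain and
    # resampled to the same grid, should agree with the sub-domain prediction.
    u_full_restricted = F.interpolate(u_full[..., :h, :w], size=(H, W),
                                      mode="bilinear", align_corners=False)

    return F.mse_loss(u_sub, u_full_restricted)
```

In this reading, the penalty would simply be added to the usual data-fitting loss during training, so the operator is encouraged to make predictions that remain consistent when the same physics is viewed at a different domain size; it can also be evaluated on unlabeled inputs, which is what makes it usable as a data augmentation scheme.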