

Poster in Workshop: NeurIPS 2024 Workshop: Machine Learning and the Physical Sciences

Scalable nonlinear manifold reduced order model for dynamical systems

Ivan Zanardi · Alejandro N. Diaz · Seung Whan Chung · Marco Panesi · Youngsoo Choi


Abstract:

The domain decomposition (DD) nonlinear-manifold reduced-order model (NM-ROM) is a computationally efficient method for integrating underlying physics principles into a neural network-based, data-driven approach. Compared to linear subspace methods, NM-ROMs offer superior expressivity and enhanced reconstruction capabilities, while DD enables cost-effective, parallel training of autoencoders by partitioning the domain into algebraic subdomains. In this work, we investigate the scalability of this approach by implementing a "bottom-up" strategy: training NM-ROMs on smaller domains and subsequently deploying them on larger, composable ones. Applying this method to the two-dimensional time-dependent Burgers' equation shows that extrapolating from smaller to larger domains is both stable and effective, achieving a relative error of about 1% and a speedup of nearly 700 times.
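
To make the DD NM-ROM idea described in the abstract concrete, the following is a minimal, illustrative sketch (not the authors' implementation): the global discretized state is split into algebraic subdomains, a small autoencoder is trained independently on each subdomain's snapshot data, and the trained subdomain models are then composed to reconstruct a larger assembled domain. All class names, network sizes, and training settings below are assumptions for illustration; interface coupling and the latent-space time integration used in the paper are omitted.

```python
# Minimal sketch of domain-decomposed nonlinear-manifold reduction (assumed setup,
# not the authors' code). Each subdomain gets its own autoencoder; training is
# embarrassingly parallel across subdomains.
import torch
import torch.nn as nn


class SubdomainAutoencoder(nn.Module):
    """Nonlinear latent manifold for one algebraic subdomain of the state vector."""

    def __init__(self, n_dof: int, latent_dim: int, width: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_dof, width), nn.SiLU(),
            nn.Linear(width, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, width), nn.SiLU(),
            nn.Linear(width, n_dof),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))


def split_into_subdomains(snapshots: torch.Tensor, n_sub: int):
    """Algebraic (index-based) decomposition of the state columns into subdomains."""
    return torch.chunk(snapshots, n_sub, dim=1)


def train_subdomain_rom(local_snaps: torch.Tensor, latent_dim: int = 8, epochs: int = 200):
    """Train one subdomain autoencoder on its local snapshot data."""
    model = SubdomainAutoencoder(local_snaps.shape[1], latent_dim)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(local_snaps), local_snaps)
        loss.backward()
        opt.step()
    return model


if __name__ == "__main__":
    # Synthetic stand-in for Burgers'-equation snapshots: (n_snapshots, n_dof).
    snapshots = torch.randn(500, 256)
    local_data = split_into_subdomains(snapshots, n_sub=4)

    # "Bottom-up": each autoencoder sees only its small subdomain during training ...
    roms = [train_subdomain_rom(x) for x in local_data]

    # ... and the trained decoders are later composed over a larger assembled domain
    # (interface/coupling constraints between subdomains are not shown here).
    recon = torch.cat([rom(x) for rom, x in zip(roms, local_data)], dim=1)
    rel_err = torch.norm(recon - snapshots) / torch.norm(snapshots)
    print(f"relative reconstruction error: {rel_err.item():.3e}")
```

In this sketch the composition step is a simple concatenation of subdomain reconstructions; in the DD NM-ROM formulation the subdomain latent variables are additionally coupled through interface conditions, which is what allows models trained on small domains to be reused on larger, composable ones.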
