

Poster

Multiple Physics Pretraining for Spatiotemporal Surrogate Models

Michael McCabe · Bruno Régaldo-Saint Blancard · Liam Parker · Ruben Ohana · Miles Cranmer · Alberto Bietti · Michael Eickenberg · Siavash Golkar · Geraud Krawezik · Francois Lanusse · Mariel Pettee · Tiberiu Tesileanu · Kyunghyun Cho · Shirley Ho

Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

We introduce multiple physics pretraining (MPP), an autoregressive, task-agnostic pretraining approach for physical surrogate modeling of spatiotemporal systems with transformers. In MPP, rather than training one model on a specific physical system, we train a backbone model to predict the dynamics of multiple heterogeneous physical systems simultaneously, so that it learns features that are broadly useful across systems and facilitate transfer. To learn effectively in this setting, we introduce a shared embedding and normalization strategy that projects the fields of multiple systems into a single shared embedding space. We validate the efficacy of our approach on both pretraining and downstream tasks over a broad fluid-mechanics-oriented benchmark. We show that a single MPP-pretrained transformer matches or outperforms task-specific baselines on all pretraining sub-tasks without any finetuning. For downstream tasks, we demonstrate that finetuning MPP-trained models yields more accurate predictions across multiple time steps on new physics or higher-dimensional systems than training from scratch or finetuning pretrained video foundation models. We open-source our code (https://anonymous.4open.science/r/anonymousmultiphysics_pretraining-99FB/) and model weights trained at multiple scales for reproducibility.
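To make the shared embedding and normalization idea concrete, the following is a minimal Python/PyTorch sketch, not the authors' implementation: all names (SharedFieldEmbedding, field_ids, the normalization choice, and the tensor layout) are assumptions introduced here for illustration. It shows one way a single module could map snapshots from systems with different numbers and types of physical fields into tokens of a common width, which is the prerequisite for training one transformer backbone across heterogeneous systems.

# Illustrative sketch only; hypothetical names, not the MPP codebase.
import torch
import torch.nn as nn

class SharedFieldEmbedding(nn.Module):
    """Embed snapshots from systems with different numbers/types of fields.

    Each field type (e.g. density, pressure, velocity-x) gets its own learned
    projection into a common d_model-dimensional space; per-sample normalization
    keeps fields with very different magnitudes comparable.
    """

    def __init__(self, num_field_types: int, d_model: int, eps: float = 1e-6):
        super().__init__()
        self.eps = eps
        # One learned projection vector per known field type, shared across systems.
        self.field_proj = nn.Parameter(torch.randn(num_field_types, d_model) * 0.02)

    def forward(self, fields: torch.Tensor, field_ids: torch.Tensor) -> torch.Tensor:
        # fields:    (batch, n_fields, H, W) raw physical channels of one system
        # field_ids: (n_fields,) indices selecting which field types are present
        # Normalize each field channel to zero mean / unit variance per sample.
        mean = fields.mean(dim=(-2, -1), keepdim=True)
        std = fields.std(dim=(-2, -1), keepdim=True)
        fields = (fields - mean) / (std + self.eps)
        # Project into the shared embedding space and sum over the present fields,
        # so systems with different field counts yield tokens of the same width.
        proj = self.field_proj[field_ids]                # (n_fields, d_model)
        tokens = torch.einsum("bfhw,fd->bhwd", fields, proj)
        return tokens                                     # (batch, H, W, d_model)

# Example: embed a 2-field system and a 3-field system with one shared module.
embed = SharedFieldEmbedding(num_field_types=8, d_model=64)
sys_a = torch.randn(4, 2, 32, 32)                        # e.g. density, pressure
sys_b = torch.randn(4, 3, 32, 32)                        # e.g. density, vx, vy
tokens_a = embed(sys_a, torch.tensor([0, 1]))
tokens_b = embed(sys_b, torch.tensor([0, 2, 3]))
print(tokens_a.shape, tokens_b.shape)                     # both (4, 32, 32, 64)

Under this sketch, the downstream transformer only ever sees (H, W, d_model) token grids, so the same backbone can be trained autoregressively on all systems at once regardless of how many physical fields each one carries.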
