Poster at the NeurIPS 2024 Workshop on Machine Learning and the Physical Sciences
Dissipativity-Informed Learning for Chaotic Dynamical Systems with Attractor Characterization
Sunbochen Tang · Themis Sapsis · Navid Azizan
Accurate prediction for chaotic systems is challenging due to their intrinsic sensitivity to initial-condition perturbations. Instead, recent advances have focused on forecasting models that produce trajectories preserving invariant statistics over a long horizon. However, data-driven methods are prone to generating unbounded trajectories, which invalidates the evaluation of such statistics. Despite the sensitive nature of chaos, many chaotic systems are dissipative, meaning that they eventually enter a bounded invariant set. In this paper, we propose a novel neural network architecture that preserves dissipativity by leveraging control-theoretic stability notions and constructing a projection layer that ensures trajectory boundedness. Additionally, the trained network learns a Lyapunov function that certifies dissipativity, along with an outer estimate of the attractor. We demonstrate our model's ability to produce bounded long-horizon forecasts and to characterize the attractor on a truncated Kuramoto-Sivashinsky system.
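A minimal sketch of the idea behind such a projection layer, not the paper's actual architecture: here the Lyapunov function is a fixed quadratic V(x) = ||x||² (the paper learns V jointly with the dynamics), and states that leave the sublevel set {V(x) ≤ c} are radially scaled back onto it, so any one-step model rolled out with the projection produces bounded trajectories. The function names, the toy unstable map, and the level c are illustrative assumptions.

```python
import numpy as np

def project_to_sublevel(x, c=100.0):
    """Project x onto the sublevel set {V(x) <= c} of the (assumed)
    quadratic Lyapunov function V(x) = ||x||^2 by radial scaling;
    states already inside the set are left unchanged."""
    v = float(np.dot(x, x))
    if v <= c:
        return x
    return x * np.sqrt(c / v)

def rollout(step, x0, n_steps, c=100.0):
    """Roll out a one-step model `step`, applying the projection after
    every step so the generated trajectory cannot diverge."""
    traj = [np.asarray(x0, dtype=float)]
    for _ in range(n_steps):
        traj.append(project_to_sublevel(step(traj[-1]), c))
    return np.stack(traj)

# Toy affine map that would blow up without the projection layer.
unstable_step = lambda x: 1.5 * x + 0.1
traj = rollout(unstable_step, np.ones(4), n_steps=200)
print(np.max(np.sum(traj**2, axis=1)))  # stays at or below c = 100
```

The projection is the simplest mechanism that makes the sublevel set forward-invariant for the rolled-out model; the paper's contribution lies in learning the Lyapunov function and the invariant-set estimate from data rather than fixing them a priori.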