

Poster in Workshop: Time Series in the Age of Large Models

Reimagining Time Series Foundation Models: Metadata and State-Space Model Perspectives

Pengrui Quan · Ozan Mulayim · Liying Han · Dezhi Hong · Mario Berges · Mani Srivastava


Abstract:

The success of foundation models in natural language processing has sparked growing interest in developing analogous models for time series (TS) analysis. These time series foundation models (TSFMs), pre-trained on vast amounts of TS data, promise to achieve zero-shot and few-shot inference on unseen datasets. However, the intrinsic heterogeneity of TS data presents unique challenges: accurate inference often requires a deep understanding of the underlying data-generating process and the sensing apparatus, which cannot be readily inferred from the raw data alone. Furthermore, recent advances in state-space models raise the question of whether they may offer advantages over transformer-based architectures for TS analysis. This paper investigates these questions in two key areas: (a) the role of language-based metadata and timestamps as side-channels in improving TSFM performance, and (b) the comparative effectiveness of state-space models (SSMs) versus transformer models for TS forecasting. Our experiments show the superiority of SSMs in TS analysis and demonstrate the advantages of incorporating the notion of real-world timestamps into TSFMs. Our model outperforms three existing TSFMs while using 6000 times fewer trainable parameters and 10 times less training data.
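To make the timestamp side-channel idea concrete, below is a minimal sketch (not the authors' implementation, which the abstract does not detail) of how real-world timestamps can be exposed to a forecasting backbone: each timestamp is expanded into cyclic calendar features and concatenated with the raw readings before they reach the model, whether SSM- or transformer-based. The function names `timestamp_features` and `build_model_input` are hypothetical and introduced here only for illustration.

```python
import numpy as np
import pandas as pd


def timestamp_features(index: pd.DatetimeIndex) -> np.ndarray:
    """Encode hour-of-day and day-of-week cyclically so a model can pick up
    daily and weekly structure without seeing absolute dates."""
    hour = index.hour.to_numpy()
    dow = index.dayofweek.to_numpy()
    return np.stack([
        np.sin(2 * np.pi * hour / 24), np.cos(2 * np.pi * hour / 24),
        np.sin(2 * np.pi * dow / 7), np.cos(2 * np.pi * dow / 7),
    ], axis=-1)  # shape: (T, 4)


def build_model_input(values: np.ndarray, index: pd.DatetimeIndex) -> np.ndarray:
    """Concatenate raw readings with the timestamp side-channel features."""
    feats = timestamp_features(index)
    return np.concatenate([values[:, None], feats], axis=-1)  # shape: (T, 1 + 4)


# Example: an hourly series spanning one week.
idx = pd.date_range("2024-01-01", periods=24 * 7, freq="h")
x = np.sin(2 * np.pi * np.arange(len(idx)) / 24) + 0.1 * np.random.randn(len(idx))
print(build_model_input(x, idx).shape)  # (168, 5)
```

In this sketch the side-channel is purely numeric; language-based metadata (e.g., sensor descriptions) would analogously be embedded and fed to the backbone alongside the values, under the same assumption that such context cannot be recovered from the raw series alone.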
