

Poster
in
Workshop: Time Series in the Age of Large Models

Generalized Prompt Tuning: How to Use a Frozen Pre-Trained Univariate Time Series Foundation Model for Multivariate Time Series Prediction

Mingzhu Liu · Angela Chen · George H Chen


Abstract:

Time series foundation models are pre-trained on large datasets and achieve state-of-the-art performance across diverse tasks. We observe that most such foundation models assume channel independence, ignoring cross-channel correlations. In this study, we propose a prompt-tuning-inspired technique that enables channel mixing for existing univariate pre-trained time series models, and we compare the performance of different fine-tuning methods for time series foundation models.
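To make the idea concrete, here is a minimal numpy sketch of prompt tuning around a frozen univariate model. The "foundation model" is stood in for by a fixed linear map, and each channel gets its own learnable prompt prepended to its input; these choices (the linear stand-in, per-channel prompts, the lengths `P`, `L`, `H`) are illustrative assumptions, not the paper's actual architecture, which additionally mixes information across channels.

```python
import numpy as np

rng = np.random.default_rng(0)

P, L, H = 4, 16, 8  # assumed prompt length, context length, forecast horizon

# Hypothetical frozen univariate "foundation model": a fixed linear map
# from a length-(P + L) single-channel input to an H-step forecast.
W = rng.standard_normal((P + L, H))  # frozen weights, never updated

def frozen_univariate_model(x):
    """Forecast H steps from one channel's sequence of length P + L."""
    return x @ W

C = 3                                  # number of channels
prompts = rng.standard_normal((C, P))  # the only trainable parameters

def multivariate_forecast(series):
    """series: (C, L) multivariate input -> (C, H) forecast."""
    out = []
    for c in range(C):
        # Prepend the channel's learnable prompt, then reuse the frozen model.
        x = np.concatenate([prompts[c], series[c]])
        out.append(frozen_univariate_model(x))
    return np.stack(out)

y = multivariate_forecast(rng.standard_normal((C, L)))
print(y.shape)  # (3, 8)
```

In a training loop, gradients would flow only into `prompts`, keeping the pre-trained backbone untouched, which is what makes prompt tuning far cheaper than full fine-tuning.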
