Invited Talk in Workshop: Time Series in the Age of Large Models
Invited Talk by Qingsong Wen - LLM and Foundation Models for Time Series Analysis
Time series analysis is ubiquitous, serving as a cornerstone for extracting valuable insights across a myriad of real-world applications. Recent advances in Large Language Models (LLMs) and Foundation Models (FMs) have fundamentally reshaped the paradigm of model design for time series analysis, significantly boosting performance on various downstream tasks. In this talk, I will first provide an up-to-date overview of this exciting area, highlighting how LLMs and FMs are being leveraged to address the unique challenges of time series data. Then, I will present our recent research on Time-LLM (ICLR'24), a pioneering approach that reprograms LLMs for time series forecasting, and Time-MoE (arXiv'24), the first work to scale time series foundation models up to 2.4 billion parameters. Additionally, I will discuss our related work, including Time-MMD (NeurIPS'24) and Time-FFM (NeurIPS'24), highlighting their roles in advancing time series modeling. By consolidating the latest advances in LLMs and FMs for time series analysis, this talk aims to illuminate the transformative potential of these models and identify promising avenues for future research.