Expo Talk Panel
West Exhibition Hall C, B3

While Foundation Models (FMs) have revolutionized AI for language and vision, they often fall short when handling sensor and numerical time-series data, which are crucial in many industries. At IBM Research, our team is dedicated to advancing time-series foundation models: our influential papers at top AI conferences have accumulated over 1,700 citations, and our open-source contributions have established the Granite Time Series family on Hugging Face. In 2024, we introduced Granite-TimeSeries-Tiny Time Mixer (TTM), the first lightweight foundation model tailored for time-series forecasting. With just 1M parameters, TTM redefines efficiency, speed, and accuracy in zero-shot and few-shot forecasting, delivering up to 40% better performance than state-of-the-art models that demand hundreds of millions to billions of parameters, while drastically reducing computational cost. Since its launch, TTM has amassed over one million downloads on the Hugging Face platform, generating widespread excitement within the time-series community. Its lightweight architecture also lets it run efficiently on CPU-only machines, driving broader adoption in resource-constrained environments. In this session, we will explore our latest advancements in the Granite Time Series Models and their applications in forecasting, imputation, anomaly detection, and many other downstream tasks across various industries.
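To give a flavor of what zero-shot forecasting with TTM looks like in practice, here is a minimal sketch. It assumes the open-source tsfm_public package from IBM's granite-tsfm repository, the "ibm-granite/granite-timeseries-ttm-r2" checkpoint on Hugging Face (with a 512-step context and 96-step horizon), and class/field names such as TinyTimeMixerForPrediction and prediction_outputs taken from the public model card; consult the model card for the exact interface.

    # Minimal zero-shot forecasting sketch; package, checkpoint id, and
    # class/field names are assumptions based on the public model card.
    import torch
    from tsfm_public import TinyTimeMixerForPrediction  # pip install granite-tsfm

    # Load the pretrained ~1M-parameter TTM checkpoint from Hugging Face.
    model = TinyTimeMixerForPrediction.from_pretrained(
        "ibm-granite/granite-timeseries-ttm-r2"
    )
    model.eval()

    # Dummy history: a batch of 1 series, 512 time steps, 1 channel.
    past_values = torch.randn(1, 512, 1)

    # Zero-shot forecast: no fine-tuning needed; runs comfortably on CPU.
    with torch.no_grad():
        output = model(past_values=past_values)

    forecast = output.prediction_outputs  # shape (1, 96, 1): 96 future steps
    print(forecast.shape)

Because the whole model is about 1M parameters, this forward pass is fast enough to serve forecasts from a laptop-class CPU, which is what enables the resource-constrained deployments mentioned above.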
