

Poster
in
Workshop: Time Series in the Age of Large Models

LLMForecaster: Improving Seasonal Event Forecasts with Unstructured Textual Data

Hanyu Zhang · Chuck Arvin · Dmitry Efimov · Michael Mahoney · Dominique Perrault-Joncas · Shankar Ramasubramanian · Andrew Wilson · Malcolm Wolff


Abstract:

Modern time-series forecasting models often fail to make full use of rich unstructured information about the time series themselves. This lack of proper conditioning can lead to "obvious" model failures; for example, a model may be unaware of the details of a particular product and hence fail to anticipate seasonal surges in customer demand in the lead-up to major exogenous events like holidays, even for clearly relevant products. To address this shortcoming, this paper introduces a novel forecast post-processor — which we call LLMForecaster — that fine-tunes large language models (LLMs) to incorporate unstructured semantic and contextual information, together with historical data, to improve the forecasts from an existing demand forecasting pipeline. In an industry-scale retail application, we demonstrate that our technique yields statistically significant forecast improvements across several sets of products subject to holiday-driven demand surges.
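The abstract describes a post-processor that adjusts an existing pipeline's forecasts using signals an LLM extracts from unstructured product text. The paper's actual architecture is not detailed here, so the following is only a minimal sketch of the post-processing idea: a stub stands in for a fine-tuned LLM, mapping a product description and an upcoming event to a multiplicative uplift applied to the base forecast. The function names, the keyword heuristic, and the multiplicative form are all illustrative assumptions, not the authors' method.

```python
def apply_uplift(base_forecast, uplift):
    """Scale a base demand forecast by a predicted holiday uplift factor.

    In LLMForecaster the uplift would come from a fine-tuned LLM; the
    multiplicative form here is an illustrative assumption.
    """
    return [demand * uplift for demand in base_forecast]


def stub_llm_uplift(product_description, event):
    """Stand-in for a fine-tuned LLM (hypothetical keyword heuristic).

    A real model would read the unstructured product text and historical
    data; here we just mimic the kind of signal it might extract.
    """
    if event == "Halloween" and "candy" in product_description.lower():
        return 1.5  # anticipate a holiday-driven demand surge
    return 1.0      # no adjustment for unrelated products


# Example: a base pipeline forecast for the days before Halloween.
base = [100.0, 120.0, 130.0]
uplift = stub_llm_uplift("Assorted Halloween candy, 2 lb bag", "Halloween")
adjusted = apply_uplift(base, uplift)
```

Here `adjusted` is `[150.0, 180.0, 195.0]`, while a product unrelated to the event (e.g. an office chair) would pass through unchanged; the point is only that the base forecast is corrected by event-aware information the original pipeline did not condition on.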
