TS-RAG: Retrieval-Augmented Generation based Time Series Foundation Models are Stronger Zero-Shot Forecaster, by Kanghui Ning and 9 other authors
Abstract: Large Language Models (LLMs) and Foundation Models (FMs) have recently become prevalent for time series forecasting tasks. While fine-tuning LLMs enables domain adaptation, they often struggle to generalize across diverse and unseen datasets. Moreover, existing Time Series Foundation Models (TSFMs) still face challenges in handling non-stationary dynamics and distribution shifts, largely due to the lack of effective adaptation mechanisms. To this end, we present TS-RAG, a retrieval-augmented generation framework for time series forecasting that enhances the generalization and interpretability of TSFMs. Specifically, TS-RAG leverages pre-trained time series encoders to retrieve semantically relevant segments from a dedicated knowledge base, enriching the contextual representation of the input query. Furthermore, we propose an Adaptive Retrieval Mixer (ARM) module that dynamically fuses the retrieved patterns with the TSFM's internal representation, improving forecasting accuracy without requiring task-specific fine-tuning. Thorough empirical studies on seven public benchmark datasets demonstrate that TS-RAG achieves state-of-the-art zero-shot forecasting performance, outperforming existing TSFMs by up to 6.84% across diverse domains while also providing desirable interpretability. Our code and data are available at: this https URL
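The retrieve-then-fuse idea sketched in the abstract can be illustrated with a minimal toy pipeline. Everything below is an illustrative assumption rather than the paper's actual implementation: the encoder is a fixed random projection standing in for a pre-trained time series encoder, the knowledge base is random data, and the softmax-weighted blend is only a crude stand-in for the ARM module.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical knowledge base: 100 stored time-series segments of length 32,
# with pre-computed 16-dim embeddings (stand-ins for encoder outputs).
kb_segments = rng.normal(size=(100, 32))
kb_embeddings = rng.normal(size=(100, 16))

def encode(x):
    # Stand-in for a pre-trained time series encoder: a fixed linear projection.
    proj = np.linspace(-1.0, 1.0, 32 * 16).reshape(32, 16)
    return x @ proj

def retrieve(query, k=3):
    """Return the k knowledge-base segments most similar (cosine) to the query."""
    q = encode(query)
    sims = kb_embeddings @ q / (
        np.linalg.norm(kb_embeddings, axis=1) * np.linalg.norm(q) + 1e-8
    )
    top = np.argsort(sims)[-k:][::-1]  # indices of the k best matches, best first
    return kb_segments[top], sims[top]

def mix(query, retrieved, sims):
    """Crude ARM-like fusion: softmax-weighted blend of query and retrieved segments."""
    w = np.exp(sims) / np.exp(sims).sum()
    return 0.5 * query + 0.5 * (w[:, None] * retrieved).sum(axis=0)

query = rng.normal(size=32)
segments, sims = retrieve(query, k=3)
fused = mix(query, segments, sims)
print(fused.shape)  # (32,)
```

In TS-RAG proper, the fused representation would then condition a TSFM's forecast head; here the sketch stops at the enriched context to keep the example self-contained.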
Submission history
From: Kanghui Ning [view email]
[v1] Thu, 6 Mar 2025 16:48:48 UTC (1,069 KB)
[v2] Tue, 1 Apr 2025 21:23:59 UTC (1,069 KB)
[v3] Tue, 27 May 2025 20:50:23 UTC (2,655 KB)
[v4] Tue, 25 Nov 2025 19:02:14 UTC (1,071 KB)