ArXiv TLDR

Temporal Patch Shuffle (TPS): Leveraging Patch-Level Shuffling to Boost Generalization and Robustness in Time Series Forecasting

2604.09067

Jafar Bakhshaliyev, Johannes Burchert, Niels Landwehr, Lars Schmidt-Thieme

cs.LG

TLDR

Temporal Patch Shuffle (TPS) is a simple, model-agnostic data augmentation method for time series forecasting that improves generalization and robustness by selectively shuffling low-variance temporal patches.

Key contributions

  • Proposes Temporal Patch Shuffle (TPS), a model-agnostic data augmentation for time series forecasting.
  • TPS extracts overlapping temporal patches, shuffles a subset selected by variance-based ordering, and reconstructs the sequence by averaging overlapping regions.
  • Preserves local temporal structure, which is crucial for forecasting, while increasing sample diversity.
  • Achieves consistent performance improvements across 9 long-term and 4 short-term forecasting datasets.
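
The paper does not include code here, but the three steps above (overlapping patch extraction, variance-based selective shuffling, overlap-averaged reconstruction) can be sketched in NumPy. This is a minimal illustration under assumed conventions, not the authors' implementation; the function name and the `patch_len`, `stride`, and `shuffle_frac` parameters are illustrative choices.

```python
import numpy as np

def temporal_patch_shuffle(x, patch_len=16, stride=8, shuffle_frac=0.25, rng=None):
    """Illustrative sketch of TPS on a 1-D series x (not the official code).

    1. Extract overlapping patches of length `patch_len` with the given stride.
    2. Rank patches by variance and permute only the lowest-variance fraction,
       a conservative heuristic in line with the paper's description.
    3. Reconstruct the series by averaging overlapping regions.
    """
    rng = np.random.default_rng(rng)
    n = len(x)
    starts = np.arange(0, n - patch_len + 1, stride)
    patches = np.stack([x[s:s + patch_len] for s in starts])

    # Shuffle only the lowest-variance patches among themselves.
    k = max(2, int(len(patches) * shuffle_frac))
    low_var = np.argsort(patches.var(axis=1))[:k]
    patches[low_var] = patches[rng.permutation(low_var)]

    # Overlap-averaged reconstruction.
    out = np.zeros(n)
    counts = np.zeros(n)
    for s, p in zip(starts, patches):
        out[s:s + patch_len] += p
        counts[s:s + patch_len] += 1

    # Keep any tail samples not covered by a patch unchanged.
    uncovered = counts == 0
    out[uncovered] = x[uncovered]
    counts[uncovered] = 1
    return out / counts
```

Because only low-variance (relatively flat) patches are rearranged and overlaps are averaged out, the augmented series stays close to the original's local temporal structure while still differing from it sample-wise.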

Why it matters

This paper addresses the scarcity of data augmentation methods for time series forecasting by introducing TPS. It offers a simple, model-agnostic way to boost generalization and robustness. Its consistent performance gains across diverse models and datasets make it a valuable tool for practitioners.

Original Abstract

Data augmentation is a crucial technique for improving model generalization and robustness, particularly in deep learning models where training data is limited. Although many augmentation methods have been developed for time series classification, most are not directly applicable to time series forecasting due to the need to preserve temporal coherence. In this work, we propose Temporal Patch Shuffle (TPS), a simple and model-agnostic data augmentation method for forecasting that extracts overlapping temporal patches, selectively shuffles a subset of patches using variance-based ordering as a conservative heuristic, and reconstructs the sequence by averaging overlapping regions. This design increases sample diversity while preserving forecast-consistent local temporal structure. We extensively evaluate TPS across nine long-term forecasting datasets using five recent model families (TSMixer, DLinear, PatchTST, TiDE, and LightTS), and across four short-term forecasting datasets using PatchTST, observing consistent performance improvements. Comprehensive ablation studies further demonstrate the effectiveness, robustness, and design rationale of the proposed method.
