ArXiv TLDR

Auto-Configured Networks for Multi-Scale Multi-Output Time-Series Forecasting

arXiv:2604.07610

Yumeng Zha, Shengxiang Yang, Xianpeng Wang

cs.LG, cs.NE

TLDR

This paper introduces an auto-configuration framework for multi-scale, multi-output time-series forecasting, balancing prediction error and model complexity.

Key contributions

  • Proposes an auto-configuration framework for multi-scale, multi-output time-series forecasting.
  • Develops a Multi-Scale Bi-Branch CNN (MS-BCNN) to capture both local fluctuations and long-term trends.
  • Unifies alignment operators, architecture, and hyperparameters into a hierarchical-conditional configuration space.
  • Uses PHMOEA to approximate the error-complexity Pareto frontier within a limited computational budget.
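The last bullet's goal — keeping only configurations that are not beaten on both objectives at once — can be illustrated with a minimal Pareto-filter sketch. This is a generic non-dominated filter over (error, complexity) pairs, not the paper's PHMOEA search itself; the example values are invented for illustration.

```python
def pareto_front(candidates):
    """Return the non-dominated subset of (error, complexity) pairs.

    A candidate is dominated if some other candidate is no worse on
    both objectives and differs from it (both objectives minimized).
    """
    front = []
    for c in candidates:
        dominated = any(
            o[0] <= c[0] and o[1] <= c[1] and o != c
            for o in candidates
        )
        if not dominated:
            front.append(c)
    return front

# Hypothetical (validation error, parameter count) pairs for five models.
configs = [(0.10, 5e6), (0.12, 1e6), (0.09, 9e6), (0.15, 1e6), (0.11, 5e6)]
print(sorted(pareto_front(configs)))
# → [(0.09, 9000000.0), (0.1, 5000000.0), (0.12, 1000000.0)]
```

A deployment team would then pick a point from this front depending on how much accuracy it is willing to trade for a smaller model.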

Why it matters

Industrial forecasting requires models that balance accuracy and complexity, a challenge that current fixed-design methods struggle with. This framework automates the co-design of preprocessing, architecture, and hyperparameters, outputting a deployable Pareto set of models. In the reported experiments it outperforms competitive baselines under the same computational budget while offering flexible deployment choices for real-world applications.

Original Abstract

Industrial forecasting often involves multi-source asynchronous signals and multi-output targets, while deployment requires explicit trade-offs between prediction error and model complexity. Current practices typically fix alignment strategies or network designs, making it difficult to systematically co-design preprocessing, architecture, and hyperparameters in budget-limited training-based evaluations. To address this issue, we propose an auto-configuration framework that outputs a deployable Pareto set of forecasting models balancing error and complexity. At the model level, a Multi-Scale Bi-Branch Convolutional Neural Network (MS-BCNN) is developed, where short- and long-kernel branches capture local fluctuations and long-term trends, respectively, for multi-output regression. At the search level, we unify alignment operators, architectural choices, and training hyperparameters into a hierarchical-conditional mixed configuration space, and apply a Player-based Hybrid Multi-Objective Evolutionary Algorithm (PHMOEA) to approximate the error-complexity Pareto frontier within a limited computational budget. Experiments on hierarchical synthetic benchmarks and a real-world sintering dataset demonstrate that our framework outperforms competitive baselines under the same budget and offers flexible deployment choices.
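The bi-branch idea in the abstract — a short-kernel branch for local fluctuations and a long-kernel branch for long-term trends — can be sketched with plain 1-D convolutions. This is a minimal illustration of the multi-scale principle only, not the paper's MS-BCNN; the kernel sizes, averaging weights, and function names are assumptions.

```python
def conv1d(x, kernel):
    """Valid-mode 1-D convolution of a list x with a list kernel."""
    k = len(kernel)
    return [sum(x[i + j] * kernel[j] for j in range(k))
            for i in range(len(x) - k + 1)]

def bi_branch_features(x, short_k=3, long_k=9):
    """Run a series through two branches with different receptive fields.

    Both branches here use simple averaging kernels; a learned model
    would use trained filters and feed the concatenated features to a
    multi-output regression head.
    """
    # Short-kernel branch: small receptive field -> local fluctuations.
    short = conv1d(x, [1.0 / short_k] * short_k)
    # Long-kernel branch: wide receptive field -> long-term trend.
    long = conv1d(x, [1.0 / long_k] * long_k)
    return short, long

series = [float(t % 4) + 0.1 * t for t in range(20)]
short, long = bi_branch_features(series)
print(len(short), len(long))  # → 18 12
```

The long branch smooths away the period-4 oscillation and tracks the upward drift, while the short branch preserves it, which is the division of labor the abstract describes.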

📬 Weekly AI Paper Digest

Get the top 10 AI/ML arXiv papers from the week — summarized, scored, and delivered to your inbox every Monday.