PulseAugur
research · 12 sources

New AI methods enhance time series forecasting accuracy and interpretability

Researchers have introduced several new methods for time-series forecasting aimed at improving accuracy and generalization. MeLISA, a latent-free autoregressive model, improves rollout efficiency and long-horizon statistical accuracy. Temporal Functional Circuits leverage Kolmogorov-Arnold Networks (KANs) to provide faithful, temporally grounded explanations for forecasts. Dynamic Pattern Recalibration (DPR) offers a backbone-agnostic mechanism for token-level recalibration that adapts to changing local dynamics. AROpt proposes a training method that enforces error-growth heuristics for more reliable long-term predictions, and TimeRFT uses reinforcement learning to fine-tune Time Series Foundation Models for better generalization.
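The autoregressive rollout these papers repeatedly target can be sketched generically: each prediction is fed back into a sliding context window, so per-step errors can compound over long horizons. A minimal sketch (the `model` here is a toy stand-in, not any paper's architecture):

```python
import numpy as np

def ar_rollout(model, context, horizon):
    """Generic autoregressive rollout: feed each prediction back as input.

    `model` maps a 1-D context window to the next-step value; the window
    length stays fixed as new predictions are appended.
    """
    window = list(context)
    preds = []
    for _ in range(horizon):
        y = model(np.asarray(window))
        preds.append(y)
        window = window[1:] + [y]   # slide the window forward
    return np.asarray(preds)

# Toy "model": predict the mean of the window (stand-in for a neural net).
forecast = ar_rollout(lambda w: float(w.mean()),
                      context=[1.0, 2.0, 3.0], horizon=4)
```

Because each step consumes its own earlier outputs, improving rollout efficiency and bounding long-horizon error growth (as MeLISA and AROpt aim to do) matters more than one-step accuracy alone.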

Summary written by gemini-2.5-flash-lite from 12 sources.

IMPACT These advancements in time-series forecasting could improve predictive accuracy and interpretability across various domains, from finance to energy operations.

RANK_REASON Multiple research papers published on arXiv introduce novel methods and architectures for time-series forecasting.


COVERAGE [12]

  1. arXiv cs.LG TIER_1 · Lars Schmidt-Thieme ·

    NPMixer: Hierarchical Neighboring Patch Mixing for Time Series Forecasting

    Multivariate time series forecasting remains a challenge due to the complexity of local temporal dynamics and global dependencies across multiple variables. In this paper, we propose Neighboring Patching Mixer (NPMixer), a hierarchical architec…

  2. Hugging Face Daily Papers TIER_1 ·

    NPMixer: Hierarchical Neighboring Patch Mixing for Time Series Forecasting

    Multivariate time series forecasting remains a challenge due to the complexity of local temporal dynamics and global dependencies across multiple variables. In this paper, we propose Neighboring Patching Mixer (NPMixer), a hierarchical architec…

  3. arXiv cs.LG TIER_1 · Tianyue Yang, Xiao Xue ·

    Towards Scalable One-Step Generative Modeling for Autoregressive Dynamical System Forecasting

    arXiv:2605.05540v1 Announce Type: new Abstract: Fast surrogate modeling for high-dimensional physical dynamics requires more than low short-term error: useful models must roll out efficiently while preserving the statistical structure of long trajectories. Neural operators provid…

  4. arXiv cs.LG TIER_1 · Naveen Mysore ·

    Temporal Functional Circuits: From Spline Plots to Faithful Explanations in KAN Forecasting

    arXiv:2605.05685v1 Announce Type: new Abstract: Unlike MLPs, Kolmogorov-Arnold Networks (KANs) expose explicit learnable edge functions on every connection, enabling mechanistic explanation in time-series forecasting. This paper introduces Temporal Functional Circuits, a framewor…
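    The KAN edge functions this paper builds on can be sketched as one learnable univariate function per connection, readable directly from its coefficients. A toy sketch (a Gaussian basis stands in for the usual splines; all names are illustrative, not the authors' code):

    ```python
    import numpy as np

    class KANEdge:
        """One KAN edge: a learnable 1-D function phi(x) = sum_k c_k * B_k(x),
        with a Gaussian radial basis standing in for splines."""
        def __init__(self, n_basis=8, lo=-2.0, hi=2.0, seed=0):
            rng = np.random.default_rng(seed)
            self.centers = np.linspace(lo, hi, n_basis)
            self.width = (hi - lo) / n_basis
            self.coef = rng.normal(scale=0.1, size=n_basis)  # trainable

        def __call__(self, x):
            x = np.atleast_1d(np.asarray(x, dtype=float))
            basis = np.exp(-((x[:, None] - self.centers) ** 2)
                           / (2 * self.width ** 2))
            return basis @ self.coef

    edge = KANEdge()
    out = edge(np.array([0.0, 0.5, 1.0]))
    ```

    Unlike an opaque MLP weight, the whole function is inspectable: evaluating `edge` on a grid and plotting it shows exactly what the connection computes, which is the hook that Temporal Functional Circuits turns into temporally grounded explanations.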

  5. arXiv cs.LG TIER_1 · Siru Zhong, Zhao Meng, Haohuan Fu, Haoyang Li, Qingsong Wen, Yuxuan Liang ·

    Perceive, Route and Modulate: Dynamic Pattern Recalibration for Time Series Forecasting

    arXiv:2605.06310v1 Announce Type: new Abstract: Local temporal patterns in real-world time series continuously shift, rendering globally shared transformations suboptimal. Current deep forecasting models, despite their scale and complexity, rely on fixed weight matrices applied u…
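    The "perceive, route and modulate" idea can be sketched as soft routing of each patch token to a mixture of small expert transforms, replacing one fixed global weight matrix. This is an illustrative sketch of token-level recalibration in general, not the paper's exact mechanism; all names are hypothetical:

    ```python
    import numpy as np

    def recalibrate_tokens(tokens, experts, gate_w):
        """Token-level recalibration sketch: each token is routed to a
        soft mixture of expert transforms based on its own content.

        tokens: (n_tokens, d); experts: list of (d, d) matrices;
        gate_w: (d, n_experts) gating weights.
        """
        logits = tokens @ gate_w                       # per-token routing scores
        w = np.exp(logits - logits.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)              # softmax over experts
        out = np.zeros_like(tokens)
        for k, expert in enumerate(experts):
            out += w[:, [k]] * (tokens @ expert)       # modulate per token
        return out

    d, n = 4, 6
    rng = np.random.default_rng(0)
    toks = rng.normal(size=(n, d))
    y = recalibrate_tokens(toks,
                           [np.eye(d), rng.normal(size=(d, d))],
                           rng.normal(size=(d, 2)))
    ```

    Because the routing depends only on the token itself, a wrapper like this can sit on top of any backbone, which is what "backbone-agnostic" suggests.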

  6. arXiv cs.LG TIER_1 · Zheng Li, Jerry Cheng, Huanying Gu ·

    AROpt: An Optimization Method for Autoregressive Time Series Forecasting

    arXiv:2602.02288v2 Announce Type: replace Abstract: Current time-series forecasting models are primarily based on transformer-style neural networks. These models achieve long-term forecasting mainly by scaling up the model size rather than through genuinely autoregressive (AR) ro…

  7. arXiv cs.AI TIER_1 · Christine P. Lee, Min Kyung Lee, Bilge Mutlu ·

    Making the Invisible Visible: Understanding the Mismatch Between Organizational Goals and Worker Experiences in AI Adoption

    arXiv:2605.03078v1 Announce Type: new Abstract: While AI is often introduced into organizations to drive innovation and efficiency, many adoption efforts fail as workers resist and struggle to integrate these systems. These failures point to a deeper issue: workers, the very peop…

  8. arXiv cs.LG TIER_1 · Valery Manokhin ·

    Training-Free Probabilistic Time-Series Forecasting with Conformal Seasonal Pools

    arXiv:2605.03789v1 Announce Type: cross Abstract: We propose Conformal Seasonal Pools (CSP), a training-free probabilistic time-series forecaster that mixes same-season empirical draws with signed residual draws around a seasonal naive forecast. In an audited rolling-origin bench…

  9. arXiv cs.LG TIER_1 · Valery Manokhin ·

    Training-Free Probabilistic Time-Series Forecasting with Conformal Seasonal Pools

    We propose Conformal Seasonal Pools (CSP), a training-free probabilistic time-series forecaster that mixes same-season empirical draws with signed residual draws around a seasonal naive forecast. In an audited rolling-origin benchmark on the six time-series datasets where DeepNPT…
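    The CSP recipe described here is concrete enough to sketch: pool same-season historical values together with the seasonal-naive forecast shifted by signed same-season residuals, then read off empirical quantiles. A simplified sketch from the abstract alone (assumes `horizon <= period`; not the paper's exact procedure):

    ```python
    import numpy as np

    def csp_forecast(y, period, horizon, qs=(0.1, 0.5, 0.9)):
        """Training-free probabilistic forecast: mix same-season empirical
        draws with signed residual draws around a seasonal-naive forecast,
        then take empirical quantiles per step."""
        y = np.asarray(y, dtype=float)
        n = len(y)
        resid = y[period:] - y[:-period]              # seasonal-naive residuals
        bands = []
        for h in range(1, horizon + 1):
            t = n + h - 1                             # index of target step
            season = t % period
            naive = y[t - period]                     # seasonal-naive point
            pool = np.concatenate([y[season::period],         # same-season draws
                                   naive + resid[season::period]])
            bands.append(np.quantile(pool, qs))
        return np.asarray(bands)                      # (horizon, len(qs))

    y = np.tile([10.0, 20.0, 30.0, 40.0], 6) \
        + np.random.default_rng(0).normal(0, 1, 24)
    bands = csp_forecast(y, period=4, horizon=4)
    ```

    No parameters are fit, which is what makes the method training-free: the predictive distribution comes entirely from pooled historical draws.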

  10. arXiv cs.LG TIER_1 · Alexis-Raja Brachet, Pierre-Yves Richard, Céline Hudelot ·

    Time-series forecasting through the lens of dynamics

    arXiv:2507.15774v2 Announce Type: replace Abstract: While deep learning is facing a homogenization across modalities led by Transformers, they are still challenged by shallow linear models in the time-series forecasting task. Our hypothesis is that models should learn a direct l…

  11. arXiv stat.ML TIER_1 · Naveen Mysore ·

    Temporal Functional Circuits: From Spline Plots to Faithful Explanations in KAN Forecasting

    Unlike MLPs, Kolmogorov-Arnold Networks (KANs) expose explicit learnable edge functions on every connection, enabling mechanistic explanation in time-series forecasting. This paper introduces Temporal Functional Circuits, a framework that transforms KAN edge functions from latent…

  12. arXiv cs.CV TIER_1 · Siyang Li, Yize Chen, Zijie Zhu, Yuxin Pan, Yan Guo, Ming Huang, Hui Xiong ·

    TimeRFT: Stimulating Generalizable Time Series Forecasting for TSFMs via Reinforcement Finetuning

    arXiv:2605.00015v1 Announce Type: cross Abstract: Time Series Foundation Models (TSFMs) advance generalization and data efficiency in time series forecasting by unified large-scale pretraining. But TSFMs remain lacking when adapting to specific downstream forecasting tasks for tw…