Mohammad Azam Khan

Postdoc






WaveBound: Dynamic Error Bounds for Stable Time Series Forecasting


Conference paper


Youngin Cho, Daejin Kim, Dongmin Kim, Mohammad Azam Khan, Jaegul Choo
Neural Information Processing Systems (NeurIPS), 2022

Links: Semantic Scholar · arXiv · DBLP · DOI

Cite

APA
Cho, Y., Kim, D., Kim, D., Khan, M. A., & Choo, J. (2022). WaveBound: Dynamic Error Bounds for Stable Time Series Forecasting. Neural Information Processing Systems (NeurIPS).


Chicago/Turabian
Cho, Youngin, Daejin Kim, Dongmin Kim, Mohammad Azam Khan, and Jaegul Choo. “WaveBound: Dynamic Error Bounds for Stable Time Series Forecasting.” Neural Information Processing Systems (NeurIPS) (2022).


MLA
Cho, Youngin, et al. “WaveBound: Dynamic Error Bounds for Stable Time Series Forecasting.” Neural Information Processing Systems (NeurIPS), 2022.


BibTeX

@inproceedings{cho2022wavebound,
  title     = {WaveBound: Dynamic Error Bounds for Stable Time Series Forecasting},
  booktitle = {Neural Information Processing Systems (NeurIPS)},
  year      = {2022},
  author    = {Cho, Youngin and Kim, Daejin and Kim, Dongmin and Khan, Mohammad Azam and Choo, Jaegul}
}

Abstract

Time series forecasting has become a critical task due to its high practicality in real-world applications such as traffic, energy consumption, economics and finance, and disease analysis. Recent deep-learning-based approaches have shown remarkable success in time series forecasting. Nonetheless, due to the dynamics of time series data, deep networks still suffer from unstable training and overfitting. Inconsistent patterns appearing in real-world data lead the model to be biased toward a particular pattern, limiting generalization. In this work, we introduce dynamic error bounds on the training loss to address the overfitting issue in time series forecasting. Specifically, we propose a regularization method, called WaveBound, which estimates an adequate error bound of the training loss for each time step and feature at each iteration. By allowing the model to focus less on unpredictable data, WaveBound stabilizes the training process, significantly improving generalization. In extensive experiments, we show that WaveBound consistently improves upon existing models, including the state-of-the-art model, by large margins.
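The abstract describes the idea only at a high level; the sketch below illustrates one plausible reading of it in PyTorch, assuming a flooding-style pointwise bound derived from a slowly updated (EMA) target network, as described in the paper. The function names (wavebound_loss, ema_update) and the hyperparameters eps and tau are illustrative, not the authors' reference implementation.

import torch

def wavebound_loss(pred_src, pred_tgt, target, eps=1e-2):
    # Pointwise squared error of the trained (source) network and of the
    # EMA (target) network; shapes are (batch, time, feature).
    err_src = (pred_src - target) ** 2
    err_tgt = (pred_tgt - target) ** 2
    # Dynamic lower bound per time step and feature: the target network's
    # error minus a small margin eps. No gradient flows through the bound.
    bound = (err_tgt - eps).detach()
    # Flooding-style objective: pushes each pointwise error down only while
    # it exceeds its bound, and back up when it dips below, so the model
    # stops chasing unpredictable points instead of overfitting to them.
    return ((err_src - bound).abs() + bound).mean()

@torch.no_grad()
def ema_update(target_net, source_net, tau=0.999):
    # Slowly track the source network's weights with an exponential
    # moving average; the target network is never trained directly.
    for p_t, p_s in zip(target_net.parameters(), source_net.parameters()):
        p_t.mul_(tau).add_(p_s, alpha=1.0 - tau)

In a training loop, one would compute predictions from both networks on each batch, back-propagate wavebound_loss through the source network only, and then call ema_update, so the bound adapts per time step and feature at every iteration.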

