High Dimensional Forecasting via Interpretable Vector Autoregression

William B. Nicholson*, Ines Wilms, Jacob Bien, David S. Matteson

*Corresponding author for this work

Research output: Working paper / Preprint

Abstract

Vector autoregression (VAR) is a fundamental tool for modeling multivariate time series. However, as the number of component series increases, the VAR model becomes overparameterized. Several authors have addressed this issue by incorporating regularized approaches, such as the lasso, into VAR estimation. Traditional approaches address overparameterization by selecting a low lag order, based on the assumption of short-range dependence and that a universal lag order applies to all components. Such an approach constrains the relationships between the components and impedes forecast performance. The lasso-based approaches perform much better in high-dimensional settings but do not incorporate the notion of lag order selection. We propose a new class of hierarchical lag structures (HLag) that embed the notion of lag selection into a convex regularizer. The key modeling tool is a group lasso with nested groups, which guarantees that the sparsity pattern of lag coefficients honors the VAR's ordered structure. The proposed HLag framework offers three basic structures, which allow for varying levels of flexibility, with many possible generalizations. A simulation study demonstrates improved performance in forecasting and lag order selection over previous approaches, and macroeconomic, financial, and energy applications further highlight forecasting improvements as well as HLag's convenient, interpretable output.
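
For concreteness, the following is a minimal sketch in LaTeX of the kind of model and nested-group penalty the abstract describes. The notation here is chosen for illustration rather than taken from the record; the paper itself defines three HLag variants, and this sketch shows only one componentwise-style form.

% A k-dimensional VAR with maximal lag order p:
\begin{equation}
  y_t = \nu + \sum_{\ell=1}^{p} B_\ell \, y_{t-\ell} + u_t,
  \qquad y_t,\, \nu,\, u_t \in \mathbb{R}^k, \quad B_\ell \in \mathbb{R}^{k \times k}.
\end{equation}

% A componentwise nested-group penalty: for each equation i, the group at lag
% \ell collects row i of B_\ell, B_{\ell+1}, ..., B_p. The groups are nested,
% so zeroing the group at lag \ell forces all of component i's coefficients at
% lags \ell through p to zero, giving each component its own effective lag order.
\begin{equation}
  \Omega(B) \;=\; \sum_{i=1}^{k} \sum_{\ell=1}^{p}
    \Big\| \big( (B_\ell)_{i\cdot},\, (B_{\ell+1})_{i\cdot},\, \ldots,\, (B_p)_{i\cdot} \big) \Big\|_2 ,
\end{equation}

% Estimation then minimizes the least squares loss plus \lambda \, \Omega(B)
% for a tuning parameter \lambda > 0.

Because lag \ell's group contains the groups for lags \ell+1 through p, the resulting zero pattern is hierarchical by construction: coefficients vanish from the maximal lag downward, which is the interpretable lag-selection behavior highlighted in the abstract.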

Original language: English
Pages: 1-52
Number of pages: 52
Volume: 21
Publication status: Published - 2020

Publication series

Series: Journal of Machine Learning Research
ISSN: 1532-4435

Keywords

  • components
  • forecasting
  • group lasso
  • lasso
  • model selection
  • multivariate time series
  • number
  • rank
  • regression
  • regularization
  • regulatory networks
  • shrinkage
  • sparsity
  • variable selection
  • vector autoregression
