Dimension Reduction for High-Dimensional Vector Autoregressive Models

G. Cubadda*, A. Hecq

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer review

Abstract

This article aims to decompose a large dimensional vector autoregressive (VAR) model into two components, the first one generated by a small-scale VAR and the second one being white noise. Hence, a reduced number of common components generates the entire dynamics of the large system through a VAR structure. This modelling, which we label the dimension-reducible VAR, extends the common feature approach to high-dimensional systems, and it differs from the dynamic factor model, in which the idiosyncratic component can also embed a dynamic pattern. We show the conditions under which this decomposition exists. We provide statistical tools to detect its presence in the data and to estimate the parameters of the underlying small-scale VAR model. Based on our methodology, we propose a novel approach to identify the shock that is responsible for most of the common variability at the business cycle frequencies. We evaluate the practical value of the proposed methods by simulations as well as by an empirical application to a large set of US economic variables.
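To make the data-generating structure described above concrete, the following is a minimal simulation sketch of a dimension-reducible VAR: a large system whose dynamics are driven by a small number of common components following a VAR(1), plus a white-noise term. The dimensions (n, q, T), the coefficient matrix A, and the loading matrix Lam are hypothetical illustrative choices, not values or procedures taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

n, q, T = 20, 2, 500  # system size, number of common components, sample length

# Small-scale VAR(1) for the q common components (hypothetical coefficients)
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])

# Loadings mapping the common components into the large system (hypothetical)
Lam = rng.normal(size=(n, q))

f = np.zeros((T, q))   # common components
y = np.zeros((T, n))   # observed large system
for t in range(1, T):
    f[t] = A @ f[t - 1] + rng.normal(size=q)   # small-scale VAR dynamics
    y[t] = Lam @ f[t] + rng.normal(size=n)     # common part plus white noise
```

Under this parameterization, all serial dependence in the n-dimensional series y comes from the q-dimensional VAR for f, while the remaining component of y is serially uncorrelated, which is the feature distinguishing this setup from a dynamic factor model with autocorrelated idiosyncratic terms.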
Original language: English
Pages (from-to): 1123-1152
Number of pages: 30
Journal: Oxford Bulletin of Economics and Statistics
Volume: 84
Issue number: 5
Early online date: 12 Jun 2022
DOIs
Publication status: Published - Oct 2022

Keywords

  • COMMON CYCLICAL FEATURES
  • COINTEGRATION RANK
  • INFERENCE
  • MACROECONOMICS
  • SELECTION
  • NUMBER
