Abstract

The desparsified lasso is a high-dimensional estimation method that provides uniformly valid inference. We extend this method to a time series setting under Near-Epoch Dependence (NED) assumptions, allowing for non-Gaussian, serially correlated, and heteroskedastic processes, where the number of regressors may grow faster than the time dimension. We first derive an oracle inequality for the (regular) lasso, relaxing the commonly made exact sparsity assumption to a weaker alternative that permits many small but non-zero parameters. The weak sparsity, coupled with the NED assumption, means this inequality can also be applied to the (inherently misspecified) nodewise regressions performed in the desparsified lasso. This allows us to establish the uniform asymptotic normality of the desparsified lasso under general conditions. Additionally, we show consistency of a long-run variance estimator, thus providing a complete set of tools for performing inference in high-dimensional linear time series models. Finally, we perform a simulation exercise to demonstrate the small-sample properties of the desparsified lasso in common time series settings.
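The debiasing step described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration of the generic desparsified (debiased) lasso for a single coefficient: an initial lasso fit, a nodewise lasso regression of regressor j on the remaining regressors, and a one-step bias correction using the nodewise residuals. It assumes i.i.d. data and illustrative penalty values (`lam`, `lam_node`); the paper's contribution, validity under NED time series dependence and the accompanying long-run variance estimator, is not captured by this toy code.

```python
import numpy as np
from sklearn.linear_model import Lasso


def desparsified_lasso(X, y, j, lam=0.05, lam_node=0.05):
    """Desparsified lasso estimate of coefficient j (minimal i.i.d. sketch).

    Penalty levels `lam` and `lam_node` are illustrative placeholders,
    not the data-driven choices analysed in the paper.
    """
    # Step 1: initial lasso fit of y on X (biased, sparse estimate)
    beta = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_

    # Step 2: nodewise lasso regression of X_j on the other regressors
    X_minus_j = np.delete(X, j, axis=1)
    gamma = Lasso(alpha=lam_node, fit_intercept=False).fit(
        X_minus_j, X[:, j]
    ).coef_
    z = X[:, j] - X_minus_j @ gamma  # nodewise residuals

    # Step 3: one-step bias correction using the lasso residuals
    resid = y - X @ beta
    return beta[j] + z @ resid / (z @ X[:, j])
```

In practice the corrected estimate is asymptotically normal, so confidence intervals follow once a suitable variance estimator (a long-run variance under serial dependence, as in the paper) is plugged in.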
Original language: English
Publisher: arXiv.org at Cornell University Library
Publication status: Published - 21 Jul 2020

Publication series

Series: arXiv e-prints
Number: 2007.10952

Keywords

  • honest inference
  • lasso
  • time series
  • High-dimensional data

Cite this

Adamek, R., Smeekes, S., & Wilms, I. (2020). Lasso Inference for High-Dimensional Time Series. arXiv.org at Cornell University Library. arXiv e-prints, No. 2007.10952. https://arxiv.org/abs/2007.10952