A Justification of Conditional Confidence Intervals

Eric Beutner, Alexander Heinemann, Stephan Smeekes

Research output: Working paper / Preprint


Abstract

To quantify uncertainty around point estimates of conditional objects such as conditional means or variances, parameter uncertainty has to be taken into account. Attempts to incorporate parameter uncertainty are typically based on the unrealistic assumption of observing two independent processes, where one is used for parameter estimation and the other for conditioning upon. Such an unrealistic foundation raises the question of whether these intervals are theoretically justified in a realistic setting. This paper presents an asymptotic justification for intervals of this type that does not require such an unrealistic assumption but relies on a sample-split approach instead. By showing that our sample-split intervals coincide asymptotically with the standard intervals, we provide a novel and realistic justification for confidence intervals of conditional objects. The analysis is carried out for a general class of Markov chains that nests various time series models.
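To make the sample-split idea sketched in the abstract concrete, the snippet below gives a minimal, purely illustrative example for an AR(1) model: parameters are estimated on the first half of the sample, the interval is then built conditional on the last observation. The AR(1) specification, the OLS estimator, and the delta-method interval are assumptions chosen for illustration only and are not the paper's general Markov-chain construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) process X_t = phi * X_{t-1} + eps_t (illustrative choice;
# the paper covers a general class of Markov chains).
phi_true, n = 0.5, 1000
eps = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + eps[t]

# Sample split: estimate phi on the first block only ...
m = n // 2
y, z = x[1:m], x[:m - 1]
phi_hat = (z @ y) / (z @ z)                   # OLS estimate of phi
sigma2_hat = np.mean((y - phi_hat * z) ** 2)  # residual variance
se_phi = np.sqrt(sigma2_hat / (z @ z))        # standard error of phi_hat

# ... and condition on the last observation of the second block.
x_T = x[-1]

# Delta-method 95% interval for the conditional mean
# E[X_{T+1} | X_T = x_T] = phi * x_T.
point = phi_hat * x_T
half_width = 1.96 * abs(x_T) * se_phi
print(f"conditional mean: {point:.3f}  "
      f"95% CI: [{point - half_width:.3f}, {point + half_width:.3f}]")
```

In this sketch the estimation block and the conditioning value come from disjoint parts of a single observed path, which is the realistic setting the paper analyzes, as opposed to assuming two independent copies of the process.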
Original language: English
Publisher: Maastricht University, Graduate School of Business and Economics
DOIs
Publication status: Published - 10 Oct 2017

Publication series

Series: GSBE Research Memoranda
Number: 023

JEL classifications

  • C53 - Forecasting and Prediction Methods; Simulation Methods
  • C22 - Single Equation Models; Single Variables: Time-Series Models; Dynamic Quantile Regressions; Dynamic Treatment Effect Models
  • C32 - Multiple or Simultaneous Equation Models: Time-Series Models; Dynamic Quantile Regressions; Dynamic Treatment Effect Models
  • G17 - Financial Forecasting and Simulation

Keywords

  • Conditional confidence intervals
  • Parameter uncertainty
  • Markov chain
  • Sample-splitting
  • Prediction
  • Merging
