Econometrica, November 2006, Volume 74, Issue 6
Tests of Conditional Predictive Ability
https://doi.org/10.1111/j.1468-0262.2006.00718.x
pp. 1545–1578
Raffaella Giacomini, Halbert White
We propose a framework for out‐of‐sample predictive ability testing and forecast selection designed for use in the realistic situation in which the forecasting model is possibly misspecified, due to unmodeled dynamics, unmodeled heterogeneity, incorrect functional form, or any combination of these. Relative to the existing literature (Diebold and Mariano (1995) and West (1996)), we introduce two main innovations: (i) We derive our tests in an environment where the finite sample properties of the estimators on which the forecasts may depend are preserved asymptotically. (ii) We accommodate conditional evaluation objectives (can we predict which forecast will be more accurate at a future date?), which nest the unconditional objectives (which forecast was more accurate on average?) that have been the sole focus of the previous literature. As a result of (i), our tests have several advantages: they capture the effect of estimation uncertainty on relative forecast performance, they can handle forecasts based on both nested and nonnested models, they allow the forecasts to be produced by general estimation methods, and they are easy to compute. Although both unconditional and conditional approaches are informative, conditioning can help fine‐tune the forecast selection to current economic conditions. To this end, we propose a two‐step decision rule that uses current information to select the best forecast for the future date of interest. We illustrate the usefulness of our approach by comparing forecasts from leading parameter‐reduction methods for macroeconomic forecasting using a large number of predictors.
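As a rough illustration of the ideas in the abstract (not the authors' code), the sketch below implements, under stated assumptions, a conditional predictive ability test for one‐step‐ahead forecasts and a simple two‐step selection rule. The choice of test functions (a constant and the lagged loss differential), the 5% threshold, and the function names `conditional_pa_test` and `choose_forecast` are illustrative assumptions; the losses are assumed to come from forecasts estimated over rolling windows of fixed size, as the framework requires.

```python
# Illustrative sketch only: a conditional predictive ability test in the spirit of
# Giacomini and White (2006), with assumed test functions and naming conventions.
import numpy as np
from scipy import stats

def conditional_pa_test(loss1, loss2):
    """Test H0: E[L1_{t+1} - L2_{t+1} | h_t] = 0 for one-step-ahead forecasts.

    loss1, loss2: out-of-sample loss series from two forecasting methods, assumed to
    be produced with fixed-size rolling-window estimation so that estimation
    uncertainty is preserved asymptotically.
    """
    d = loss1 - loss2                                   # loss differential
    # Assumed test functions h_t: a constant and the lagged loss differential.
    h = np.column_stack([np.ones(len(d) - 1), d[:-1]])  # shape (n, q)
    z = h * d[1:, None]                                 # Z_t = h_t * d_{t+1}
    n, q = z.shape
    zbar = z.mean(axis=0)
    omega = z.T @ z / n                                 # variance estimator (1-step case)
    stat = n * zbar @ np.linalg.solve(omega, zbar)      # Wald-type statistic
    pvalue = 1 - stats.chi2.cdf(stat, df=q)             # chi-squared with q d.o.f.
    return stat, pvalue

def choose_forecast(loss1, loss2, h_now, alpha=0.05):
    """Two-step rule (illustrative): if the test rejects equal conditional accuracy,
    predict the sign of the future loss differential from current information h_now
    and pick the forecast with the smaller predicted loss; otherwise call it a tie."""
    stat, pvalue = conditional_pa_test(loss1, loss2)
    if pvalue >= alpha:
        return "either"
    d = loss1 - loss2
    h = np.column_stack([np.ones(len(d) - 1), d[:-1]])
    beta, *_ = np.linalg.lstsq(h, d[1:], rcond=None)    # regress d_{t+1} on h_t
    return "model 2" if h_now @ beta > 0 else "model 1"
```

For example, with `h_now = np.array([1.0, d[-1]])`, the rule uses today's loss differential to decide which forecast to act on for the next period, which is the sense in which conditioning "fine‐tunes" forecast selection to current conditions.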