
EC 827

EC 827. Module 4: Forecasting Multiple Variables from their Own Histories. Vector autoregressions (VARs) are systems of equations for several variables in which each variable depends not only on its own history, but also on the history of all the other variables.


Presentation Transcript


  1. EC 827 Module 4 Forecasting Multiple Variables from their own Histories

  2. Multiple Variable AR Systems • Vector Autoregression (VAR): equations for several variables in which each variable depends not only on its own history, but also on the history of all the other variables • the multiple-variable extension of an AR model • In principle one could specify multiple-variable MA or ARMA models • in practice such models are difficult to specify • typically low-order VARs are adequate to approximate MA or ARMA processes

  3. VAR Process: A Two Variable Example
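The equations for this two-variable example were in the original slide image and did not survive the transcript. A plausible first-order reconstruction, written to match the coefficient names used on slides 4 and 10 (b1, c1, e1t; the intercepts a1, a2 and own-lag coefficients d1, d2 are illustrative names, not from the slides), is:

$$X_t = a_1 + d_1 X_{t-1} + b_1 Z_{t-1} + e_{1t}$$
$$Z_t = a_2 + c_1 X_{t-1} + d_2 Z_{t-1} + e_{2t}$$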

  4. Predictive Causality (Diebold, p. 303) • Information in the history of one variable can be used to improve the forecasts of a second variable, relative to forecasting from the second variable's own history alone. • In the two-variable system above, Z has predictive causality for X if b1 is not equal to zero; X has predictive causality for Z if c1 is not equal to zero.

  5. VAR Processes: Higher Order Systems • Only one lag of X and Z appears in each of the equations above • systems with one lag are referred to as first-order systems • higher-order systems involve more than one lag in at least one of the variables • It is not necessary to limit the forecasting problem to only two variables • data limitations preclude systems with a very large number of variables (say > 10)
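For reference, a sketch of a higher-order (p-lag) version of the same two-variable system, again with illustrative coefficient names:

$$X_t = a_1 + \sum_{i=1}^{p} d_{1i} X_{t-i} + \sum_{i=1}^{p} b_{i} Z_{t-i} + e_{1t}$$
$$Z_t = a_2 + \sum_{i=1}^{p} c_{i} X_{t-i} + \sum_{i=1}^{p} d_{2i} Z_{t-i} + e_{2t}$$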

  6. How Long for the Lags? • Long enough to remove any significant autocorrelation in the residuals of each equation (otherwise there is unused information that could improve the forecasts) • Not so long that the model is “over-parameterized” and forecasting efficiency is lost • AIC and SIC again (the RATS VAR procedure will compute and print their logs)

  7. How Long for the Lags? II • One strategy: • start with a longer lag • check that autocorrelations of residuals are small (i.e. you’re not wasting information). • shorten the lag and re-estimate • do AIC and/or SIC increase or decrease? (smaller is better!) • check on the stability of the estimated coefficients as the lag length is shortened
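A minimal sketch of this lag-length search in Python, using statsmodels' VAR class on simulated placeholder data (the column names x and z and the candidate lag lengths are illustrative, not from the slides):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
# Simulated stand-in for two stationary series; replace with real data.
data = pd.DataFrame(rng.standard_normal((300, 2)), columns=["x", "z"])

model = VAR(data)
# Information criteria across candidate lag lengths (smaller is better).
print(model.select_order(maxlags=8).summary())

# Start with a longer lag, then shorten and watch AIC / BIC (SIC) and the
# residual autocorrelations (Portmanteau whiteness test) as the lag changes.
for p in (8, 6, 4, 2, 1):
    res = model.fit(p)
    print(p, "AIC:", round(res.aic, 3), "SIC/BIC:", round(res.bic, 3),
          "residual whiteness p-value:",
          round(res.test_whiteness(nlags=12).pvalue, 3))
```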

  8. How Long for the Lags? III • Generally you will not need a lot of lags for seasonally adjusted data • 3-4 lags for quarterly observations • 5-7 lags for monthly observations • For non-seasonally-adjusted data, be careful about autocorrelations at the seasonal frequencies • you may need a short continuous lag plus an additional lag at the seasonal frequency.

  9. Leading Indicators: An Example

  10. Leading Indicator • When c1 = 0.0, the history of the X variable does not influence future values of the Z variable (no predictive causality of X for Z) • As long as b1 is not equal to zero, the history of Z has predictive value for future outcomes of X • under these conditions we say that Z is a leading indicator of X • whether Z is a good or bad leading indicator depends on the size of b1 and the variance of e1t

  11. Testing for Leading Indicators • The question of interest is whether all of the coefficients on lagged values of one variable are zero in the regression in which some other variable is the dependent variable • An F test can be used to examine the hypothesis that multiple regression coefficients are jointly equal to zero.
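A minimal sketch of this joint F-type test using statsmodels (the series x and z are simulated so that lagged z genuinely helps predict x; all names are placeholders):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
n = 300
z = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):                      # x depends on lagged z by construction
    x[t] = 0.3 * x[t - 1] + 0.5 * z[t - 1] + rng.standard_normal()

res = VAR(pd.DataFrame({"x": x, "z": z})).fit(2)
# H0: all coefficients on lags of z are zero in the x equation,
# i.e. z has no predictive causality for x.
print(res.test_causality(caused="x", causing="z", kind="f").summary())
```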

  12. Leading Indicators • Until recently the U.S. Department of Commerce published a leading indicator series • Recently “privatized” (or “outsourced”) to the Conference Board (a New York research organization) • not a single variable, but a weighted sum of 11 variables (a composite leading indicator) • an alleged systematic correlation between this composite variable and subsequent real output (real GDP)

  13. Leading Indicators • The Commerce (now Conference Board) Composite Leading Indicator is viewed as a forecasting device for future expansions or recessions • the correlations are not 1.0; it is not a perfect forecasting device by any means • it frequently generates false signals of recessions • accuracy is somewhat improved (though still far from perfect) by looking at average behavior over several months.

  14. Log of Industrial Production [Figure: autocorrelations of the log of Industrial Production, lags 1-19]

  15. Log Differences of Industrial Production [Figure: autocorrelations of the log change in Industrial Production (IP), lags 1-19]

  16. Log Difference IP AR(1) Model

   Dependent Variable DQIP - Estimation by Least Squares
   Monthly Data From 47:02 To 94:12
   R Bar **2                     0.15
   Standard Error of Estimate    0.00996
   Durbin-Watson Statistic       2.07

   Variable       Coeff    Std Error   T-Stat
   *******************************************
   1. Constant    0.0018   0.0004       4.12
   2. DQIP{1}     0.40     0.0385      10.27

  17. Log Difference IP AR(1) Residuals [Figure: autocorrelations of the residuals from the AR(1) model for log-differenced IP, lags 1-19]

  18. IP-Composite Leading Indicator Model

   Dependent Variable DQIP - Estimation by Least Squares
   Monthly Data From 47:02 To 94:12
   R Bar **2                     0.27
   Standard Error of Estimate    0.0092941152
   Durbin-Watson Statistic       1.967783

   Variable       Coeff    Std Error   T-Stat
   ********************************************
   1. Constant    0.0017   0.0004      4.19
   2. DIND{1}     0.19     0.09        2.08
   3. DIND{2}     0.47     0.09        5.18
   4. DIND{3}     0.07     0.09        0.79
   5. DIND{4}     0.25     0.09        2.92
   6. DQIP{1}     0.20     0.04        4.66

  19. Forecasting from VAR Models • One-period-ahead forecasts: • multiply the coefficients of the model by the most recently observed values of the time series and add the terms up; the result is the forecast of next period's value, X(t+1|t) • Multiple-period-ahead forecasts: • the data values needed on the right-hand side are not yet available • for X(t+2|t), substitute predicted values: X(t+1|t) in place of the first lag, X(t) in place of the second lag, etc. • for X(t+3|t), use X(t+2|t), X(t+1|t), and X(t) in place of the first, second, and third lags, etc.
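A minimal sketch of this iterated substitution using statsmodels, which performs the same feed-the-forecast-back-in computation internally (simulated placeholder data):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(2)
data = pd.DataFrame(rng.standard_normal((300, 2)), columns=["x", "z"])

res = VAR(data).fit(2)
last_obs = data.values[-res.k_ar:]   # the k_ar most recently observed values
# steps=1 uses only observed values; for steps > 1 the earlier forecasts are
# substituted for the not-yet-observed lags, as described on the slide.
print(res.forecast(last_obs, steps=3))
```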

  20. Cointegration • Suppose that you have several variables that are generated by unit root processes • random walks with or without drift • Suppose that such variables are “tied together” - there are linear combinations of the variables that are stationary • such variables are said to be cointegrated
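A minimal sketch of checking for cointegration with the Engle-Granger test from statsmodels (coint); the two simulated random walks share a common trend, so a linear combination of them is stationary:

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(3)
common = np.cumsum(rng.standard_normal(500))    # shared random-walk component
x = common + rng.standard_normal(500)           # each series has a unit root...
z = 0.5 * common + rng.standard_normal(500)     # ...but x - 2z is stationary

t_stat, p_value, _ = coint(x, z)
print(f"Engle-Granger t-stat = {t_stat:.2f}, p-value = {p_value:.3f}")
```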

  21. Cointegration and Vector Error Correction Models (VECM) • Variables that are cointegrated can be represented by a special kind of VAR - A Vector Error Correction Model • Two Variable VECM:
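The VECM equations were in the original slide image; a plausible reconstruction matching the names used on the next slide (f1, f2 as error correction coefficients, g and h in the cointegrating combination; the intercepts and short-run lag terms are illustrative) is:

$$\Delta X_t = a_1 + f_1\,(g X_{t-1} + h Z_{t-1}) + \text{(lags of } \Delta X, \Delta Z\text{)} + e_{1t}$$
$$\Delta Z_t = a_2 + f_2\,(g X_{t-1} + h Z_{t-1}) + \text{(lags of } \Delta X, \Delta Z\text{)} + e_{2t}$$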

  22. Cointegration and VECMs • g X(t-1) + h Z(t-1) is called the cointegrating relation (the linear combination of X and Z that is stationary), and (g, h) is the cointegrating vector • f1 and f2 are called the error correction coefficients • if f1 and f2 are both equal to zero, then the VECM is just an ordinary VAR in the first differences of X and Z

  23. Cointegration and VECMs • Advantage of the VECM specification: • a VAR in differences ignores the information that the levels of the variables cannot wander aimlessly but are tied together in the long run • the VECM may improve intermediate- to long-run forecasts relative to a VAR in differences alone.
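A minimal sketch of estimating and forecasting a two-variable VECM, assuming statsmodels' VECM class; the simulated cointegrated series and the choice of one lag in differences are placeholders:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(4)
common = np.cumsum(rng.standard_normal(500))
data = pd.DataFrame({"x": common + rng.standard_normal(500),
                     "z": 0.5 * common + rng.standard_normal(500)})

# One cointegrating relation, one lag in differences, constant in the relation.
res = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="ci").fit()
print("Error correction coefficients (f1, f2):\n", res.alpha)
print("Cointegrating vector (g, h):\n", res.beta)
print("3-step-ahead forecasts:\n", res.predict(steps=3))
```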

  24. Judging Forecasts I [Figure: prediction-realization diagram plotting predicted change against actual change (both from -10 to 10); points on the line of perfect forecast are ideal, and points in the off-diagonal quadrants are turning-point errors]

  25. Judging Forecasts III • Mean Squared Error (MSE): • obviously, smaller is better, again...
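The MSE formula itself was on the slide image; the standard definition, with A_t the actual change and P_t the predicted change over T forecast periods, is:

$$\mathrm{MSE} = \frac{1}{T}\sum_{t=1}^{T}\left(P_t - A_t\right)^2$$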

  26. Judging Forecasts IV • Theil Inequality Proportions (they add to 1.0) • UM = Bias Proportion • large values are bad; they indicate a systematic difference between the average actual and average predicted changes • US = Variance Proportion • large values indicate unequal variances of the actual and predicted changes • UC = Covariance Proportion • a value of zero corresponds to perfect correlation between actual and predicted changes
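A minimal sketch of computing the three Theil proportions from actual changes a and predicted changes p (NumPy arrays; the example numbers are arbitrary):

```python
import numpy as np

def theil_proportions(a, p):
    """Return Theil's bias (UM), variance (US) and covariance (UC) proportions."""
    mse = np.mean((p - a) ** 2)
    sa, sp = a.std(), p.std()              # population standard deviations
    r = np.corrcoef(a, p)[0, 1]
    um = (p.mean() - a.mean()) ** 2 / mse  # bias proportion
    us = (sp - sa) ** 2 / mse              # variance proportion
    uc = 2 * (1 - r) * sa * sp / mse       # covariance proportion
    return um, us, uc                      # the three proportions sum to 1.0

a = np.array([1.0, -0.5, 2.0, 0.3, -1.2])   # actual changes
p = np.array([0.8, -0.2, 1.5, 0.5, -1.0])   # predicted changes
print(theil_proportions(a, p))
```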
