
STAT 497 LECTURE NOTES 4



  1. STAT 497 LECTURE NOTES 4: MODEL IDENTIFICATION AND NON-STATIONARY TIME SERIES MODELS

  2. MODEL IDENTIFICATION • We have learned a large class of linear parametric models for stationary time series processes. • Now the question is how to find the most suitable model for a given observed series, that is, how to choose the appropriate orders p and q.

  3. MODEL IDENTIFICATION • The ACF and PACF show specific properties for specific models. Hence, we can use them as criteria to identify a suitable model. • Using the patterns of the sample ACF and sample PACF, we can identify a tentative model.
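As an illustration (not from the original slides; the parameter values are arbitrary), the theoretical patterns can be inspected in R with stats::ARMAacf:

# AR(1): ACF tails off geometrically, PACF cuts off after lag 1
ARMAacf(ar = 0.5, lag.max = 10)
ARMAacf(ar = 0.5, lag.max = 10, pacf = TRUE)
# MA(1): ACF cuts off after lag 1, PACF tails off
ARMAacf(ma = 0.5, lag.max = 10)
ARMAacf(ma = 0.5, lag.max = 10, pacf = TRUE)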

  4. MODEL SELECTION THROUGH CRITERIA • Besides the sACF and sPACF plots, we have other tools for model identification. • With messy real data, sACF and sPACF plots become complicated and harder to interpret. • Remember to choose a model with as few parameters as possible. • It will be seen that many different models can fit the same data, so we should choose the most appropriate (most parsimonious) one, and information criteria will help us decide.

  5. MODEL SELECTION THROUGH CRITERIA • The three well-known information criteria are • Akaike's Information Criterion (AIC) (Akaike, 1974) • Schwarz's Bayesian Criterion (SBC) (Schwarz, 1978), also known as the Bayesian Information Criterion (BIC) • the Hannan–Quinn Criterion (HQIC) (Hannan & Quinn, 1979)

  6. AIC • Assume that a statistical model with M parameters is fitted to data. • For an ARMA model fitted to n observations, the Gaussian log-likelihood function is ln L = −(n/2) ln(2πσa²) − (1/(2σa²)) Σ at².

  7. AIC • Then, maximizing over σa², the maximized log-likelihood yields AIC(M) = −2 ln(maximized likelihood) + 2M = n ln σ̂a² + 2M (up to a constant). • Choose the model (or the value of M) with minimum AIC.

  8. SBC • The Bayesian information criterion (BIC) or Schwarz Criterion (also SBC, SBIC) is a criterion for model selection among a class of parametric models with different numbers of parameters. • When estimating model parameters using maximum likelihood estimation, it is possible to increase the likelihood by adding additional parameters, which may result in overfitting. The BIC resolves this problem by introducing a penalty term for the number of parameters in the model.

  9. SBC • SBC(M) = n ln σ̂a² + M ln n, so the penalty for additional parameters is stronger than that of the AIC. • It has superior large-sample properties. • It is consistent, unbiased and sufficient.

  10. HQIC • The Hannan–Quinn information criterion, HQIC(M) = n ln σ̂a² + 2M ln(ln n), is an alternative to AIC and SBC. • It can be shown [see Hannan (1980)] that, even in the case of common roots in the AR and MA polynomials, the Hannan–Quinn and Schwarz criteria still select the correct orders p and q consistently.
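A minimal R sketch of criterion-based selection (illustrative, not from the original slides): fit a few candidate ARMA orders to a simulated series and tabulate AIC, BIC, and a hand-computed Hannan–Quinn value.

set.seed(1)
y <- arima.sim(list(ar = 0.5), n = 100)   # simulated AR(1)
hq <- function(fit) {                     # HQ = -2 logL + 2 k log(log n)
  ll <- logLik(fit)
  -2 * as.numeric(ll) + 2 * attr(ll, "df") * log(log(attr(ll, "nobs")))
}
for (p in 0:2) for (q in 0:2) {           # some fits may warn about convergence
  fit <- arima(y, order = c(p, 0, q), method = "ML")
  cat(sprintf("ARMA(%d,%d): AIC=%8.2f  BIC=%8.2f  HQ=%8.2f\n",
              p, q, AIC(fit), BIC(fit), hq(fit)))
}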

  11. THE INVERSE AUTOCORRELATION FUNCTION • The sample inverse autocorrelation function (SIACF) plays much the same role in ARIMA modeling as the sample partial autocorrelation function (SPACF), but it generally indicates subset and seasonal autoregressive models better than the SPACF.

  12. THE INVERSE AUTOCORRELATION FUNCTION • Additionally, the SIACF can be useful for detecting over-differencing. If the data come from a nonstationary or nearly nonstationary model, the SIACF has the characteristics of a noninvertible moving average. Likewise, if the data come from a model with a noninvertible moving average, then the SIACF has nonstationary characteristics and therefore decays slowly. In particular, if the data have been over-differenced, the SIACF looks like a SACF from a nonstationary process.

  13. THE INVERSE AUTOCORRELATION FUNCTION • Let Yt be generated by the ARMA(p, q) process φ(B)Yt = θ(B)at. • If θ(B) is invertible, then θ(B)Zt = φ(B)at is also a valid ARMA(q, p) model. This model is sometimes referred to as the dual model. The autocorrelation function (ACF) of this dual model is called the inverse autocorrelation function (IACF) of the original model.

  14. THE INVERSE AUTOCORRELATION FUNCTION • Notice that if the original model is a pure autoregressive model, then the IACF is an ACF that corresponds to a pure moving-average model. Thus, it cuts off sharply when the lag is greater than p; this behavior is similar to the behavior of the partial autocorrelation function (PACF). • Under certain conditions, the sampling distribution of the SIACF can be approximated by the sampling distribution of the SACF of the dual model (Bhansali, 1980). In the plots generated by ARIMA, the confidence limit marks (.) are located at ±2/√n. These limits bound an approximate 95% confidence interval for the hypothesis that the data are from a white noise process.
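Base R has no IACF function; as a sketch (not from the original slides), the theoretical IACF of a fitted pure AR model can be computed as the ACF of its dual MA model. Note that under R's sign convention for the MA polynomial (1 + θ1B + …), the dual of (1 − φ1B − …) uses ma = −φ:

set.seed(123)
y   <- arima.sim(list(ar = 0.5), n = 100)
fit <- arima(y, order = c(1, 0, 0))
phi <- coef(fit)["ar1"]
# IACF of AR(1) = ACF of the dual MA(1); for phi = 0.5 the lag-1 value is
# -phi/(1 + phi^2) = -0.4, in line with the sample SIACF shown below
iacf <- ARMAacf(ma = -phi, lag.max = 10)[-1]   # drop lag 0
round(iacf, 3)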

  15. EXAMPLE USING SIMULATED SERIES 1 • Simulated 100 data from an AR(1) with φ = 0.5. • SAS output:

Autocorrelations

Lag   Covariance   Correlation   Std Error
 0     1.498817      1.00000     0
 1     0.846806      0.56498     0.100000
 2     0.333838      0.22273     0.128000
 3     0.123482      0.08239     0.131819
 4     0.039922      0.02664     0.132333
 5    -0.110372     -0.07364     0.132387
 6    -0.162723     -0.10857     0.132796
 7    -0.301279     -0.20101     0.133680
 8    -0.405986     -0.27087     0.136670
 9    -0.318727     -0.21265     0.141937
10    -0.178869     -0.11934     0.145088
11    -0.162342     -0.10831     0.146066
12    -0.180087     -0.12015     0.146867
13    -0.132600     -0.08847     0.147847
14     0.026849      0.01791     0.148375
15     0.175556      0.11713     0.148397

  16. Inverse Autocorrelations

Lag   Correlation
 1    -0.50606
 2     0.09196
 3     0.06683
 4    -0.14221
 5     0.16250
 6    -0.07833
 7    -0.02154
 8     0.10714
 9    -0.03611
10     0.03881
11    -0.04858
12     0.00989
13     0.09922
14    -0.09950
15     0.11284

Partial Autocorrelations

Lag   Correlation
 1     0.56498
 2    -0.14170
 3     0.02814
 4    -0.01070
 5    -0.11912
 6    -0.00838
 7    -0.17970
 8    -0.11159
 9     0.02214
10    -0.01280
11    -0.07174
12    -0.06860
13    -0.02706
14     0.07718
15     0.04869

  17. EXAMPLE USING SIMULATED SERIES 2 • Simulated 100 data from an AR(1) with φ = 0.5 and took a first-order difference. • SAS output:

Autocorrelations

Lag   Covariance   Correlation   Std Error
 0     1.301676      1.00000     0
 1    -0.133104     -0.10226     0.100504
 2    -0.296746     -0.22797     0.101549
 3    -0.131524     -0.10104     0.106593
 4     0.080946      0.06219     0.107557
 5    -0.116677     -0.08964     0.107919
 6     0.080503      0.06185     0.108669
 7    -0.016109     -0.01238     0.109024
 8    -0.176930     -0.13592     0.109038
 9    -0.055488     -0.04263     0.110736
10     0.136477      0.10485     0.110902
11     0.022838      0.01754     0.111898
12    -0.067697     -0.05201     0.111926
13    -0.117708     -0.09043     0.112170
14     0.013985      0.01074     0.112904
15     0.0086790     0.00667     0.112914

  18. Inverse Autocorrelations

Lag   Correlation
 1     0.58314
 2     0.60399
 3     0.56860
 4     0.46544
 5     0.51176
 6     0.43134
 7     0.40776
 8     0.42360
 9     0.36581
10     0.33397
11     0.28672
12     0.27159
13     0.26072
14     0.16769
15     0.17107

Partial Autocorrelations

Lag   Correlation
 1    -0.10226
 2    -0.24095
 3    -0.16587
 4    -0.03460
 5    -0.16453
 6     0.01299
 7    -0.06425
 8    -0.18066
 9    -0.11338
10    -0.03592
11    -0.05754
12    -0.08183
13    -0.17169
14    -0.11056
15    -0.13018

  19. THE EXTENDED SAMPLE AUTOCORRELATION FUNCTION_ESACF • The extended sample autocorrelation function (ESACF) method can tentatively identify the orders of a stationary or nonstationary ARMA process based on iterated least squares estimates of the autoregressive parameters. Tsay and Tiao (1984) proposed the technique.

  20. ESACF • Consider the ARMA(p, q) model φ(B)Yt = θ(B)at, or Wt = φ(B)Yt = Yt − φ1Yt−1 − … − φpYt−p; then Wt follows an MA(q) model.

  21. ESACF • Given a stationary or nonstationary time series Yt in mean-corrected form, with a true autoregressive order of p + d and a true moving-average order of q, we can use the ESACF method to estimate the unknown orders p + d and q by analyzing the sample autocorrelation functions of filtered series of the form Wt(m, j) = φ̂(m, j)(B)Yt, where φ̂(m, j)(B) = 1 − φ̂1B − … − φ̂mB^m contains the iterated autoregressive estimates for AR test order m and MA test order j.

  22. ESACF • It is known that OLS estimators of the AR parameters of an ARMA process are not consistent, so an iterative estimation procedure is proposed to overcome this. • The j-th lag of the sample autocorrelation function of the filtered series Wt(m, j) is the extended sample autocorrelation function, denoted r̂j(m).

  23. ESACF • The ESACF table arranges the r̂j(m) values with the AR test order m as rows and the MA order as columns:

        MA 0     MA 1     MA 2     MA 3    …
AR 0    r̂1(0)    r̂2(0)    r̂3(0)    r̂4(0)   …
AR 1    r̂1(1)    r̂2(1)    r̂3(1)    r̂4(1)   …
AR 2    r̂1(2)    r̂2(2)    r̂3(2)    r̂4(2)   …
AR 3    r̂1(3)    r̂2(3)    r̂3(3)    r̂4(3)   …
…

  24. ESACF • For an ARMA(p, q) process, we have the following convergence in probability: for m = 1, 2, … and j = 1, 2, …, r̂j(m) → 0 whenever 0 ≤ m − p ≤ j − q, while r̂j(m) converges to a nonzero constant when m < p or when j − q = m − p − 1; the remaining entries need not converge to any fixed value.

  25. ESACF • Thus, the asymptotic ESACF table for an ARMA(1,1) model becomes:

        MA 0   MA 1   MA 2   MA 3   …
AR 0     X      X      X      X     …
AR 1     X      0      0      0     …
AR 2     *      X      0      0     …
AR 3     *      *      X      0     …
…

where X denotes a nonzero entry, 0 a zero entry, and * an entry with no fixed limit. The zeros form a triangle whose vertex identifies (p, q) = (1, 1).

  26. ESACF • In practice we have finite samples, and r̂j(m) may not be exactly zero. However, we can use Bartlett's approximate formula for the asymptotic variance of r̂j(m). • The orders are tentatively identified by finding a right (maximal) triangular pattern with vertices located at (p + d, q) and (p + d, qmax), in which all elements are insignificant (based on the asymptotic normality of the autocorrelation function). The vertex (p + d, q) identifies the order.

  27. EXAMPLE (R CODE)

> x=arima.sim(list(order = c(2,0,0), ar = c(-0.2,0.6)), n = 200)
> par(mfrow=c(1,2))
> acf(x)
> pacf(x)

  28. EXAMPLE (CONTD.) • After loading package TSA in R:

> eacf(x)
AR/MA 0 1 2 3 4 5 6 7 8 9 10 11 12 13
0     x x x x x x x x x x x  x  x  x
1     x x x x x o o o o o o  o  o  o
2     o o o o o o o o o o o  o  o  o
3     x o o o o o o o o o o  o  o  o
4     x x o o o o o o o o o  o  o  o
5     x o x o o o o o o o o  o  o  o
6     x x o x o o o o o o o  o  o  o
7     x x o x o o o o o o o  o  o  o

The triangle of o's with vertex at row AR 2, column MA 0 points to p = 2 and q = 0, consistent with the simulated AR(2).

  29. MINIMUM INFORMATION CRITERION • The MINIC method tabulates an information criterion, BIC(m, j), for each candidate pair of AR order m and MA order j; the cell with the minimum value gives the tentative (p, q), as in the MINIC table below.

  30. MINIC EXAMPLE • Simulated 100 data from an AR(1) with φ = 0.5. • SAS output:

Minimum Information Criterion

Lags    MA 0       MA 1       MA 2       MA 3       MA 4       MA 5
AR 0    0.366884   0.074617   0.06748    0.083827   0.11816    0.161974
AR 1   -0.03571   -0.00042    0.038633   0.027826   0.064904   0.097701
AR 2   -0.0163     0.021657   0.064698   0.072834   0.107481   0.140204
AR 3    0.001216   0.034056   0.080065   0.118677   0.152146   0.183487
AR 4    0.037894   0.069766   0.115222   0.14586    0.189454   0.229528
AR 5    0.065179   0.099543   0.143406   0.185604   0.230186   0.272322

Error series model: AR(8)
Minimum Table Value: BIC(1,0) = -0.03571
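A rough R analogue (illustrative, not from the original slides; SAS's MINIC normalizes BIC differently, so the numbers differ, but the minimizing cell plays the same role):

set.seed(42)
y   <- arima.sim(list(ar = 0.5), n = 100)
bic <- matrix(NA, 4, 4, dimnames = list(paste0("AR", 0:3), paste0("MA", 0:3)))
for (p in 0:3) for (q in 0:3) {
  fit <- try(arima(y, order = c(p, 0, q), method = "ML"), silent = TRUE)
  if (!inherits(fit, "try-error")) bic[p + 1, q + 1] <- BIC(fit)
}
round(bic, 2)
which(bic == min(bic, na.rm = TRUE), arr.ind = TRUE)   # tentative (p, q)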

  31. NON-STATIONARY TIME SERIES MODELS • Non-constant in mean • Non-constant in variance • Both

  32. NON-STATIONARY TIME SERIES MODELS • Inspection of the ACF serves as a rough indicator of whether a trend is present in a series. A slow decay in the ACF indicates a large characteristic root, a true unit root process, or a trend-stationary process. • Formal tests can help to determine whether a system contains a trend and whether the trend is deterministic or stochastic.

  33. NON-STATIONARITY IN MEAN • Deterministic trend • Detrending • Stochastic trend • Differencing

  34. DETERMINISTIC TREND • A series has a deterministic trend when it trends because it is an explicit function of time. • Using a simple linear trend model, the deterministic (global) trend can be estimated. This approach is very simple and assumes that the pattern represented by the linear trend remains fixed over the observed time span of the series. A simple linear trend model: Yt = α + βt + at.

  35. DETERMINISTIC TREND • The parameter β measures the average change in Yt from one period to the next: E(Yt) − E(Yt−1) = β. • The sequence {Yt} will exhibit only temporary departures from the trend line α + βt. This type of model is called a trend-stationary (TS) model.

  36. EXAMPLE

  37. TREND STATIONARY • If a series has a deterministic time trend, then we simply regress Yt on an intercept and a time trend (t = 1, 2, …, n) and save the residuals. The residuals are the detrended series. If the trend in Yt is stochastic, however, this regression does not necessarily yield a stationary series.

  38. DETERMINISTIC TREND • Many economic series exhibit "exponential trend/growth": they grow over time like an exponential function rather than a linear one. • For such series, we want to work with the log of the series, since if Yt grows exponentially, ln Yt has a linear trend: ln Yt = α + βt + at.

  39. DETERMINISTIC TREND • A standard regression model can be used to describe the phenomenon. If the deterministic trend can be described by a k-th order polynomial of time, the model of the process is Yt = α0 + α1t + α2t² + … + αktᵏ + at. • Estimate the parameters and obtain the residuals; the residuals give you the detrended series.
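A minimal R sketch of polynomial detrending (illustrative, not from the original slides; the series and the order k = 2 are arbitrary):

set.seed(7)
t <- 1:100
y <- 5 + 0.5 * t + 0.01 * t^2 + arima.sim(list(ar = 0.6), n = 100)
fit <- lm(y ~ poly(t, degree = 2, raw = TRUE))   # k-th order polynomial, k = 2
z   <- resid(fit)                                # the detrended series
plot(z, type = "l")                              # should look stationary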

  40. DETERMINISTIC TREND • This model has a short memory. • If a shock hits the series, it goes back to the trend level in a short time; hence, the best forecasts are not affected. • A model like this is rarely useful in practice. A more realistic model involves a stochastic (local) trend.

  41. STOCHASTIC TREND • A more modern approach is to treat the trend in a time series as variable. A variable trend exists when the trend changes in an unpredictable way; it is therefore considered stochastic.

  42. STOCHASTIC TREND • Recall the AR(1) model: Yt = c + φYt−1 + at. • As long as |φ| < 1, everything is fine (OLS is consistent, t-statistics are asymptotically normal, …). • Now consider the extreme case where φ = 1, i.e., Yt = c + Yt−1 + at. • Where is the trend? There is no t term.

  43. STOCHASTIC TREND • Let us recursively substitute for the lagged Yt on the right-hand side: Yt = c + Yt−1 + at = 2c + Yt−2 + at−1 + at = … = Y0 + ct + Σ(i=1..t) ai, where ct is a deterministic trend. • This is what we call a "random walk with drift". If c = 0, it is a "random walk".
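The decomposition can be checked directly in R (an illustrative sketch, not from the original slides):

set.seed(11)
n <- 200; c0 <- 0.1; a <- rnorm(n)
y  <- cumsum(c0 + a)        # random walk with drift, Y0 = 0
dt <- c0 * (1:n)            # deterministic part c*t
st <- cumsum(a)             # accumulated shocks: the stochastic trend
all.equal(y, dt + st)       # TRUE: Y_t = Y0 + c*t + sum(a_1..a_t)
plot(y, type = "l")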

  44. STOCHASTIC TREND • Each ai shock represents a shift in the intercept. Since all values of {ai} have a coefficient of unity, the effect of each shock on the intercept term is permanent. • In the time series literature, such a sequence is said to have a stochastic trend, since each ai shock imparts a permanent and random change in the conditional mean of the series. To describe this situation, we use Autoregressive Integrated Moving Average (ARIMA) models.

  45. DETERMINISTIC VS STOCHASTIC TREND • They might appear similar since they both lead to growth over time, but they are quite different. • To see why, suppose that through some policy you got a bigger Yt because the shock at is big. What will happen next period? – With a deterministic trend, Yt+1 = c + β(t+1) + at+1. The shock at does not affect Yt+1; your policy had a one-period impact. – With a stochastic trend, Yt+1 = c + Yt + at+1 = c + (c + Yt−1 + at) + at+1. The shock at does affect Yt+1; in fact, the policy has a permanent impact.
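The difference in shock persistence is easy to visualize (an illustrative sketch, not from the original slides: a single shock of size 5 at t = 50 and no other noise):

n <- 100
shock <- c(rep(0, 49), 5, rep(0, 50))
ts_y  <- 2 + 0.3 * (1:n) + shock     # trend stationary: a one-period blip
ds_y  <- cumsum(0.3 + shock)         # random walk with drift: permanent shift
matplot(cbind(ts_y, ds_y), type = "l", lty = 1, xlab = "t", ylab = "Y")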

  46. DETERMINISTIC VS STOCHASTIC TREND Conclusions: – When dealing with trending series, we are always interested in knowing whether the growth is a deterministic or a stochastic trend. – There are also economic time series that do not grow over time (e.g., interest rates), but we still need to check whether they behave "similarly" to stochastic trends (φ = 1 instead of |φ| < 1, with c = 0). – A deterministic trend refers to the long-term trend that is not affected by short-term fluctuations in the series. However, some of the random occurrences may have a permanent effect on the trend; therefore, the trend may contain both a deterministic and a stochastic component.

  47. DETERMINISTIC TREND EXAMPLE

Simulate data from, let's say, an AR(1):
> x=arima.sim(list(order = c(1,0,0), ar = 0.6), n = 100)
Simulate data with a deterministic trend:
> y=2+time(x)*2+x
> plot(y)

  48. DETERMINISTIC TREND EXAMPLE

> reg=lm(y~time(y))
> summary(reg)

Call:
lm(formula = y ~ time(y))

Residuals:
     Min       1Q   Median       3Q      Max
-2.74091 -0.77746 -0.09465  0.83162  3.27567

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept) 2.179968   0.250772   8.693 8.25e-14 ***
time(y)     1.995380   0.004311 462.839  < 2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.244 on 98 degrees of freedom
Multiple R-squared: 0.9995, Adjusted R-squared: 0.9995
F-statistic: 2.142e+05 on 1 and 98 DF, p-value: < 2.2e-16

  49. DETERMINISTIC TREND EXAMPLE > plot(y=rstudent(reg),x=as.vector(time(y)), ylab='Standardized Residuals',xlab='Time',type='o')

  50. DETERMINISTIC TREND EXAMPLE

> z=rstudent(reg)
> par(mfrow=c(1,2))
> acf(z)
> pacf(z)

The ACF and PACF of the de-trended series z suggest an AR(1), as simulated.
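To confirm the tentative identification, the AR(1) can be fitted to the detrended series z from the session above (a sketch continuing that session; the estimated AR coefficient should be near the simulated value 0.6):

> fit=arima(z,order=c(1,0,0))
> fit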
