
Long memory or long range dependence

ARMA models are characterized by an exponential decay in the autocorrelation structure as the lag goes to infinity; long memory processes are stationary time series that exhibit a much slower decay in the autocorrelation structure. (K. Ensor, STAT 421)


Presentation Transcript


1. Long memory or long range dependence
• ARMA models are characterized by an exponential decay in the autocorrelation structure as the lag goes to infinity.
• Long memory processes are stationary time series that exhibit a much slower decay in the autocorrelation structure.
• The sum of the autocorrelations over all lags is infinite (stated more precisely below).
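The contrast in the bullets above can be written compactly; this is the standard characterization (not taken verbatim from the slide), using the long memory parameter d that appears on slide 3:

```latex
% Short memory (ARMA): geometrically bounded autocorrelations, summable
\rho(k) \le C\,\phi^{k}, \quad 0 < \phi < 1,
\qquad\Longrightarrow\qquad \sum_{k=0}^{\infty} |\rho(k)| < \infty .

% Long memory: hyperbolic decay, non-summable autocorrelations
\rho(k) \sim C\,k^{2d-1}, \quad 0 < d < \tfrac{1}{2},
\qquad\Longrightarrow\qquad \sum_{k=0}^{\infty} \rho(k) = \infty .
```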

2. Behavior of the spectral density
• The spectral density has a special structure as the frequency approaches 0: for a long memory process it diverges at the origin.
• The Hurst coefficient H is defined through this low-frequency behavior (one standard formulation is sketched below).
• H close to 1 implies longer memory.
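The defining equation on the original slide did not survive the transcript; a standard formulation, chosen to be consistent with the relation d = H − .5 stated on slide 3, is:

```latex
% Spectral density of a long range dependent process near frequency zero
f(\omega) \sim c_f\,|\omega|^{\,1-2H}, \qquad \omega \to 0,
\qquad \tfrac{1}{2} < H < 1, \qquad H = d + \tfrac{1}{2} .
```

So the spectral density blows up at frequency zero, and the closer H is to 1, the stronger the divergence and the longer the memory.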

3. Fractionally integrated models
A class of models that exhibits this same behavior (a simulation sketch follows below):
• |d| > .5 implies r is nonstationary.
• 0 < d < .5 implies r is stationary with long memory; here d = H − .5.
• −.5 < d < 0 implies r is stationary with short memory.
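The model equation on the original slide did not survive the transcript; a minimal simulation sketch, assuming the standard fractionally differenced white-noise model (1 − L)^d r_t = a_t and truncating its MA(∞) representation (the function name and burn-in length are illustrative choices, not part of the slides):

```python
import numpy as np

def simulate_fractional_noise(n, d, burn=500, seed=0):
    """Simulate (1 - L)^d r_t = a_t with a_t iid N(0, 1).

    Uses the MA(infinity) weights psi_0 = 1,
    psi_k = psi_{k-1} * (k - 1 + d) / k, truncated after n + burn terms.
    """
    rng = np.random.default_rng(seed)
    m = n + burn
    psi = np.empty(m)
    psi[0] = 1.0
    for k in range(1, m):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    a = rng.standard_normal(m)
    # r_t = sum_{j=0}^{t} psi_j a_{t-j}; discard the burn-in so truncation
    # of the infinite sum has little effect on the values that are kept
    r = np.convolve(a, psi)[:m][burn:]
    return r

r = simulate_fractional_noise(2000, d=0.3)
```

For 0 < d < .5 the sample autocorrelations of `r` decay slowly with the lag, in line with slide 1.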

4. Prediction
• Write the process as an AR(∞).
• Prediction then requires truncation at p lags,
• or re-estimation of the filter coefficients at lag p for the given long memory covariance structure.
• This can be accomplished using the Durbin-Levinson recursive algorithm, which takes you from the autocorrelations to the prediction coefficients (and vice versa); a sketch follows below.
• Note the similarity to the argument made early on in the autoregressive setting, where the Yule-Walker equations gave the optimal prediction coefficients.
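A minimal sketch of the Durbin-Levinson recursion mentioned above, assuming the autocovariances γ(0), …, γ(p) of the long memory process have already been computed; the function name is illustrative:

```python
import numpy as np

def durbin_levinson(gamma):
    """Map autocovariances gamma[0..p] to AR(p) prediction coefficients.

    Returns (phi, v): phi[j - 1] multiplies r_{t-j} in the best linear
    predictor of r_t from its previous p values, and v is the resulting
    one-step prediction error variance.
    """
    gamma = np.asarray(gamma, dtype=float)
    p = len(gamma) - 1
    phi = np.array([gamma[1] / gamma[0]])
    v = gamma[0] * (1.0 - phi[0] ** 2)
    for k in range(2, p + 1):
        # partial autocorrelation (reflection) coefficient at lag k
        phikk = (gamma[k] - phi @ gamma[k - 1:0:-1]) / v
        phi = np.append(phi - phikk * phi[::-1], phikk)
        v *= 1.0 - phikk ** 2
    return phi, v
```

The intermediate `phikk` values are the partial autocorrelations of the process, which is why the recursion also runs in the reverse direction noted on the slide.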

5. Long memory extended to GARCH and EGARCH models
• Persistence in the volatility can be modeled by extending these long memory constructs to volatility models such as the GARCH and EGARCH (giving the FIGARCH and FIEGARCH models).
• See Zivot's manual for further details.
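For illustration only, and not the S+FinMetrics workflow referenced in Zivot's manual: a hedged sketch assuming a recent version of the Python `arch` package, whose `arch_model` accepts a FIGARCH volatility specification with a fractional parameter d capturing volatility persistence.

```python
import numpy as np
from arch import arch_model

# placeholder return series; in practice use percent returns on an asset
rng = np.random.default_rng(1)
returns = rng.standard_normal(2000)

# FIGARCH(1, d, 1): long memory in volatility via the fractional parameter d
model = arch_model(returns, mean="Constant", vol="FIGARCH", p=1, q=1)
result = model.fit(disp="off")
print(result.params)  # the fitted parameters include an estimate of d
```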

6. Testing for persistence
• R/S statistic: the range of the cumulative deviations from the mean, rescaled by the standard deviation (a sketch follows below).
• When suitably rescaled, and when the r's are iid normal, this statistic converges to the range of a Brownian bridge.
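A minimal sketch of the classical R/S statistic as described in the bullet above; the function name is illustrative:

```python
import numpy as np

def rescaled_range(r):
    """Classical R/S statistic: range of the cumulative deviations from
    the mean, rescaled by the sample standard deviation."""
    r = np.asarray(r, dtype=float)
    partial_sums = np.cumsum(r - r.mean())
    return (partial_sums.max() - partial_sums.min()) / r.std(ddof=1)
```

Dividing the statistic by the square root of the sample size gives the version whose limit, for iid normal r's, is the range of a Brownian bridge.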

7. GPH Test
• Based on the behavior of the spectral density as the frequency approaches 0 (a log-periodogram regression; see the sketch below).
• See sections 8.3 and 8.4 of Zivot's manual.
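A minimal sketch of the GPH (Geweke and Porter-Hudak) log-periodogram regression that implements this idea; the bandwidth m = T^0.5 is a common default rather than something specified on the slide:

```python
import numpy as np

def gph_estimate(r, power=0.5):
    """Estimate d by regressing the log periodogram on
    log(4 sin^2(w_j / 2)) over the first m = T**power Fourier
    frequencies; the slope of the regression estimates -d."""
    r = np.asarray(r, dtype=float)
    T = len(r)
    m = int(T ** power)
    w = 2 * np.pi * np.arange(1, m + 1) / T        # Fourier frequencies near 0
    dft = np.fft.fft(r - r.mean())[1:m + 1]
    log_periodogram = np.log(np.abs(dft) ** 2 / (2 * np.pi * T))
    x = np.log(4 * np.sin(w / 2) ** 2)
    X = np.column_stack([np.ones(m), x])
    beta, *_ = np.linalg.lstsq(X, log_periodogram, rcond=None)
    return -beta[1]                                # estimate of d; H = d + .5
```

A t-test of whether the estimated d differs from zero can then serve as the test for long memory.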
