
Econ 427 lecture 12 slides


Presentation Transcript


  1. Econ 427 lecture 12 slides MA (part 2) and Autoregressive Models

  2. Moving Average (MA) models • Last time we looked at moving average models. • In an MA(1), past shocks (innovations) in the series feed into the succeeding period.
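The MA(1) equation itself appears as an image on the original slide and is missing from this transcript; the standard zero-mean form is presumably:

  y_t = \varepsilon_t + \theta \varepsilon_{t-1}, \qquad \varepsilon_t \sim WN(0, \sigma^2)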

  3. Properties of an MA(1) series • An MA(1) has a “short memory”: only last period’s shock matters for today’s value • We saw this in the shape of the autocorrelation function: • There is one significant bar in the autocorrelation graph
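The autocorrelation function shown on the slide is not reproduced in the transcript; for the MA(1) above it is:

  \rho(1) = \theta / (1 + \theta^2), \qquad \rho(\tau) = 0 \text{ for } \tau \geq 2

which is why only the lag-1 bar is significant.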

  4. MA(q) series • Higher-order MA processes involve additional lags of white noise: • What does the autocorrelation function for an MA(q) look like?
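The MA(q) equation is again an image on the slide; the standard form is presumably:

  y_t = \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + \cdots + \theta_q \varepsilon_{t-q}

Its autocorrelation function cuts off after lag q, i.e. \rho(\tau) = 0 for \tau > q, so an MA(q) shows q significant bars in the autocorrelation graph.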

  5. Autoregressive Models • An autoregressive model relates the current value of a series to its own past lags. An AR(1) is: • How would I write that in lag operator form? • We would like to know what its time-series properties are. How can we figure that out?
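The AR(1) equation on the slide is not in the transcript; the standard zero-mean form is presumably:

  y_t = \phi y_{t-1} + \varepsilon_t, \qquad \varepsilon_t \sim WN(0, \sigma^2)

and, using the lag operator L (where L y_t = y_{t-1}), it can be written as:

  (1 - \phi L) y_t = \varepsilon_t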

  6. Properties of AR(1) Model • Transform it into an expression involving lags of epsilon by “backward substitution”:
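The substitution steps appear as an image on the slide; they presumably run along these lines:

  y_t = \phi y_{t-1} + \varepsilon_t
      = \phi (\phi y_{t-2} + \varepsilon_{t-1}) + \varepsilon_t
      = \varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 y_{t-2}
      = \varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + \phi^3 y_{t-3} = \cdots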

  7. Properties of AR(1) Model • So we can write the series as an infinite sum of past shocks • Or in a compact closed form, as long as |phi| < 1 • The last step comes from the fact that the summation is a geometric series. See http://mathworld.wolfram.com/GeometricSeries.html
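The equations referred to here are not in the transcript; presumably they are the MA(∞) representation and its closed form:

  y_t = \sum_{i=0}^{\infty} \phi^i \varepsilon_{t-i} \qquad \text{or} \qquad y_t = \frac{1}{1 - \phi L}\, \varepsilon_t, \qquad |\phi| < 1

The geometric-series step is \sum_{i=0}^{\infty} \phi^i L^i = 1 / (1 - \phi L), which converges only when |\phi| < 1.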

  8. Autoregressive Models • Notice that you can use algebra on the original AR(1) expression in lag operator form to get this same result • In lag operator form: • Divide both sides by the expression in parentheses (the lag polynomial)
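A plausible reconstruction of the algebra shown on the slide:

  (1 - \phi L) y_t = \varepsilon_t \quad\Longrightarrow\quad y_t = \frac{1}{1 - \phi L}\, \varepsilon_t = (1 + \phi L + \phi^2 L^2 + \cdots)\, \varepsilon_t = \sum_{i=0}^{\infty} \phi^i \varepsilon_{t-i}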

  9. Properties of AR(1) Model • Key properties are (see book, pp. 146-147): • Why is this last result important? Does it look familiar? • The variance will only be finite if |phi| < 1; covariance stationarity requires this. Intuition: if phi = 1, the series can wander infinitely far from its starting point, since any shock is permanent.
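The property list itself is an image on the slide; for the zero-mean AR(1) above, the textbook results are presumably:

  E[y_t] = 0, \qquad \mathrm{var}(y_t) = \frac{\sigma^2}{1 - \phi^2}, \qquad \rho(\tau) = \phi^{\tau}, \ \tau = 0, 1, 2, \ldots

so the autocorrelation function decays geometrically rather than cutting off, in contrast to the MA case.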

  10. AR(p) series • Higher-order AR processes involve additional lags of y: • What do the autocorrelation and partial autocorrelation functions for an AR(p) look like?
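The AR(p) equation is again an image on the slide; the standard form is presumably:

  y_t = \phi_1 y_{t-1} + \phi_2 y_{t-2} + \cdots + \phi_p y_{t-p} + \varepsilon_t

A minimal simulation sketch (assuming Python with numpy and statsmodels, which the slides do not mention) illustrating the usual answer to the question above: the autocorrelation function of an AR(p) decays gradually, while the partial autocorrelation function cuts off after lag p.

  import numpy as np
  from statsmodels.tsa.stattools import acf, pacf

  # Simulate a stationary AR(2): y_t = 0.6*y_{t-1} + 0.25*y_{t-2} + eps_t
  rng = np.random.default_rng(0)
  n, phi1, phi2 = 5000, 0.6, 0.25
  eps = rng.normal(size=n)          # white-noise shocks
  y = np.zeros(n)
  for t in range(2, n):
      y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + eps[t]

  # Sample ACF decays gradually; sample PACF is roughly zero beyond lag 2
  print(np.round(acf(y, nlags=6), 2))
  print(np.round(pacf(y, nlags=6), 2))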
