5. Time series 5.1 Nature of the data

Presentation Transcript


  1. 5. Time series 5.1 Nature of the data
  • Now there is a temporal order: past data can affect future data (but not the other way around).
  • Randomness in cross-sectional data: different samples give different estimates, so the LS estimators are random variables.
  • Randomness in time series: a succession of random variables indexed by time (t), i.e. a stochastic process.
  • We only ever see one “realization” of these random variables, since we cannot go back in time and rerun history under other conditions to obtain different realizations.
  • Population: the collection of all possible realizations. Sample: the number of periods considered.
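
  The point that we only ever observe one realization can be made concrete with a short simulation. The sketch below is not from the slides: it uses NumPy and an arbitrary AR(1) process with coefficient 0.8, and draws three realizations of the same stochastic process; in practice only one of them would ever be observed.

  ```python
  import numpy as np

  rng = np.random.default_rng(0)
  T, n_realizations, phi = 100, 3, 0.8   # illustrative sample size and AR(1) coefficient

  for r in range(n_realizations):
      e = rng.normal(size=T)             # white-noise innovations for this run
      y = np.zeros(T)
      for t in range(1, T):
          y[t] = phi * y[t - 1] + e[t]   # each run is one "realization" of the process
      print(f"realization {r}: mean = {y.mean():.2f}, variance = {y.var():.2f}")
  ```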

  2. Examples of time-series models:
  • Static model
  • Finite distributed lag model
  • Autoregressive model AR(p)
  5.2 Trends and seasonality
  Trends
  • Most series show a trend in t.
  • We may conclude that two variables are related when in reality each merely follows a trend (a spurious regression); see the sketch after this slide.
  • A time trend stands in for unobserved trending factors, and its coefficient measures the change in Y from one period to the next.
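
  As a rough illustration of a spurious regression (the variable names, coefficients, and simulated data below are my own choices, not from the slides): two unrelated series that each follow a deterministic trend look strongly related until the trend is included as a regressor.

  ```python
  import numpy as np
  import statsmodels.api as sm

  rng = np.random.default_rng(1)
  T = 200
  time = np.arange(T)
  x = 0.5 * time + rng.normal(scale=5, size=T)   # trending series, unrelated to y except via the trend
  y = 0.3 * time + rng.normal(scale=5, size=T)

  # Naive regression of y on x alone: the shared trend makes x look highly "significant".
  naive = sm.OLS(y, sm.add_constant(x)).fit()
  print("without trend, t-stat on x:", round(naive.tvalues[1], 2))

  # Adding the time trend as a regressor (the slide's trend term) absorbs the common
  # trending factor, and the apparent effect of x collapses.
  controlled = sm.OLS(y, sm.add_constant(np.column_stack([x, time]))).fit()
  print("with trend,    t-stat on x:", round(controlled.tvalues[1], 2))
  ```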

  3. Seasonality
  • When a variable is observed more than once a year (e.g. quarterly or monthly), remove the seasonal component with seasonal dummies (a sketch follows at the end of this slide).
  5.3 Stationarity
  A definition of “weak” stationarity requires:
  • A constant expected value: E(y_t) = μ for all t
  • A constant variance: Var(y_t) = σ² for all t
  • Autocovariances that depend only on the lag h, not on t: Cov(y_t, y_{t-h}) = γ(h)
  • In addition, ergodicity (“forgetting”): the further apart two observations are in time, the smaller their correlation becomes.
  Ergodicity replaces the random-sampling assumption, securing the law of large numbers and the CLT.
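
  A minimal sketch of deseasonalising with quarterly dummies, as suggested in the seasonality bullet above; the data and seasonal effects are simulated purely for illustration.

  ```python
  import numpy as np
  import statsmodels.api as sm

  rng = np.random.default_rng(2)
  T = 120                                      # 30 years of quarterly observations
  quarter = np.arange(T) % 4
  y = 10 + np.array([0.0, 1.5, -0.5, 2.0])[quarter] + rng.normal(size=T)

  # Dummies for quarters 2-4; quarter 1 is the base category absorbed by the constant.
  D = np.column_stack([(quarter == q).astype(float) for q in (1, 2, 3)])
  fit = sm.OLS(y, sm.add_constant(D)).fit()
  print("estimated seasonal effects:", fit.params[1:].round(2))   # roughly [1.5, -0.5, 2.0]

  y_deseasonalised = y - D @ fit.params[1:]    # series with the estimated seasonal component removed
  ```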

  4. Some examples (stochastic processes); a sketch of these follows at the end of this slide:
  • White noise: a succession of r.v. with E(y) = 0 and constant variance, independent over t
  • Random walk: its first differences are white noise
  • Moving average MA(q): a weighted average of current and past “noises”
  • Autoregressive AR(p): lags of the same series
  • ARMA(p,q): AR(p) + MA(q)
  • ARIMA(p,d,q): NON-stationary (d: the number of times we difference)
  5.4 Box-Jenkins methodology
  Parsimony: to forecast a univariate time series, simpler models tend to produce the best predictions. The methodology has 3 steps:
  • Identification (sample autocorrelation function, SAF, and partial autocorrelation function, PAF)
  • Estimation (Akaike and Schwarz criteria)
  • Diagnosis (what to do now?)
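
  A minimal sketch (my own construction, using NumPy) of the example processes listed above, each built from the same white-noise innovations.

  ```python
  import numpy as np

  rng = np.random.default_rng(3)
  T = 200
  e = rng.normal(size=T)        # white noise: mean 0, constant variance, independent over t

  random_walk = np.cumsum(e)    # y_t = y_{t-1} + e_t, so its first differences are white noise

  ma1 = np.empty(T)             # MA(1): weighted average of current and lagged noise
  ma1[0] = e[0]
  ma1[1:] = e[1:] + 0.5 * e[:-1]

  ar1 = np.zeros(T)             # AR(1): lag of the same series plus fresh noise
  for t in range(1, T):
      ar1[t] = 0.7 * ar1[t - 1] + e[t]
  ```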

  5. In a stationary process the SAF and PAF do not depend on t and decline rapidly towards 0.
  Box-Jenkins step by step (a code sketch follows below):
  1. Calculate the SAF and PAF of the series and check stationarity (if the series is already stationary, skip ahead to step 3).
  2. Transform the series (logs and/or differences) and recalculate the SAF and PAF.
  3. Examine the SAF and PAF and determine a starting specification.
  4. Estimate the alternative (univariate) models.
  5. For each of the estimated models:
  • Check whether the longest lag is significant (if not, reduce the order).
  • Inspect the SAF and PAF of the residuals.
  • Compare the Akaike, Schwarz, and adjusted R² of the models (recall parsimony).
  6. If the model changes, go back to step 4.
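
  A compact sketch of these steps, assuming the statsmodels library and a simulated series; the candidate orders, the 5% threshold, and the data are illustrative choices of mine, not part of the slides.

  ```python
  import numpy as np
  from statsmodels.tsa.stattools import acf, pacf, adfuller
  from statsmodels.tsa.arima.model import ARIMA

  rng = np.random.default_rng(4)
  y = np.cumsum(rng.normal(size=300))      # a non-stationary series just for the demo

  # Steps 1-2: check stationarity (here with an ADF test) and difference if needed.
  if adfuller(y)[1] > 0.05:                # large p-value: cannot reject a unit root
      y = np.diff(y)

  # Step 3: examine the sample ACF and PACF to choose a starting point.
  print("SAF :", acf(y, nlags=5).round(2))
  print("PAF :", pacf(y, nlags=5).round(2))

  # Steps 4-5: estimate alternative low-order models and compare them (parsimony).
  for order in [(1, 0, 0), (0, 0, 1), (1, 0, 1)]:
      res = ARIMA(y, order=order).fit()
      print(order, "AIC:", round(res.aic, 1), "BIC:", round(res.bic, 1),
            "residual ACF at lag 1:", round(acf(res.resid, nlags=1)[1], 2))
  ```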
