
Presentation Transcript


  1. Univariate analysis: Analysis of the properties of X and Y. Bivariate analysis: Analysis of the relationship between X and Y.

  2. Definition of time series: a series of successive observations over time; e.g. daily, monthly, yearly. Output series: Yt. Input series: Xt. One of several methods for analysing time series data: ARIMA (Autoregressive Integrated Moving Average). Box GEP & Jenkins GM. Time Series Analysis: Forecasting and Control. London: Holden-Day, 1976.

  3. Stationarity. Trends may cause spurious relationships: is the series stationary, that is, trend-free? Linear trends can be removed through differencing: ΔYt = Yt - Yt-1. In the case of non-linear trends, logging Y first may help.
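
A minimal Stata sketch of these transformations, assuming a hypothetical yearly series y with a time variable year (the d. difference operator requires the data to be tsset first):
  tsset year                  // declare the data as a yearly time series
  gen ydif = d.y              // first difference Yt - Yt-1: removes a linear trend
  gen ylny = ln(y)            // log first in the case of a non-linear trend
  gen ylndif = d.ylny         // then difference the logged series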

  4. Plot the series: to assess stationarity and to detect extreme values. Plot v1ind, v2ind and v3ind against year for year > 1958 in Stata: twoway (line v1ind v2ind v3ind year) if year>1958. What are v1ind-v3ind; do they seem to be correlated?

  5. Non-linear trend: plot and transformations of the deflator:
  twoway (line deflator year)
  gen deflatordif = d.deflator
  twoway (line deflatordif year)
  gen deflatorln = ln(deflator)
  regress deflatorln year
  twoway (line deflatorln year)
  gen deflatorlndif = d.deflatorln
  twoway (line deflatorlndif year)

  6. Basic time series models in ARIMA:
  • White noise: a series of random shocks: et
  • Random walk (RW): cumulative sum of random shocks: Yt = Yt-1 + et
  • Autoregressive process (AR): weighted sum of random shocks: Yt = φYt-1 + et ; |φ| < 1
  • Moving Average process (MA): Yt = et + θet-1 ; |θ| < 1

  7. White noise: a series of random shocks where the successive observations are uncorrelated with each other. Typical feature: the series jumps up and down in an unsystematic way. The value of et cannot be predicted from et-1.

  8. White noise = the building block: other series can be described as functions of white noise. The residuals from a time series regression should be white noise.

  9. Random walk (RW): cumulative sum of random shocks: Yt = et + et-1 + et-2 + et-3 + … + et-n ; Yt = Yt-1 + et. Typical features: long-term trends, strong inertia. Best prediction of Yt is Yt-1.

  10. Autoregressive process AR(1): weighted sum of random shocks: Yt = et + φet-1 + φ²et-2 + φ³et-3 + … + φⁿet-n ; Yt = φYt-1 + et ; |φ| < 1. The higher the φ, the stronger the trends. Below: φ = 0.9.

  11. Moving Average process MA(1): Yt = et + θet-1 ; |θ| < 1. Most real series can be modelled as RW, AR or MA. The residuals from an ARIMA regression of Y on X should be white noise. Examples of RW: stock market indices, alcohol sales. Implication for prediction: Yt = Yt-1 + et and thus ΔYt = et.

  12. Generate white noise in Stata: gen e1 = uniform()
  Generate a random walk in Stata: gen rw = sum(e1)
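
A minimal sketch extending this to an AR(1) and an MA(1) series, assuming an empty dataset and hypothetical variable names; rnormal() is used here for zero-mean shocks in place of uniform():
  clear
  set obs 200
  set seed 12345
  gen t = _n
  tsset t                            // needed for the lag operator l.
  gen e = rnormal()                  // white noise: uncorrelated random shocks
  gen rw = sum(e)                    // random walk: cumulative sum of the shocks
  gen ar1 = e in 1                   // AR(1): Yt = 0.9*Yt-1 + et
  replace ar1 = 0.9*l.ar1 + e in 2/L
  gen ma1 = e + 0.5*l.e              // MA(1): Yt = et + 0.5*et-1 (first observation missing)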

  13. What model fits the series? Device: the autocorrelation function (ACF). The ACF for Xt is the correlation between Xt and lagged versions of Xt:
  AC(1) = corr between Xt and Xt-1
  AC(2) = corr between Xt and Xt-2
  AC(3) = corr between Xt and Xt-3
  …
  AC(n) = corr between Xt and Xt-n

  14. Definition of partial autocorrelation (PAC): the correlation between Xt and Xt-k after the effect of the intervening lags has been removed. For an AR(1), expected AC(2) = AC(1) * AC(1), so the PAC is close to zero beyond lag 1.
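
In Stata the ACF and PACF can be inspected with corrgram, ac and pac; a minimal sketch, assuming a tsset series stored in a hypothetical variable y:
  corrgram y, lags(20)     // table of autocorrelations and partial autocorrelations
  ac y                     // plot of the autocorrelation function
  pac y                    // plot of the partial autocorrelation function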

  15. ACF and PACF for white noise

  16. ACF and PACF for random walk

  17. ACF and PACF for autoregressive process AR(1)

  18. ACF and PACF for moving average MA(1)

  19. Univariate model estimation. Model notation: (p,d,q); p = order of autoregressive parameters, d = order of differencing, q = order of moving average parameters. The ACF and PACF suggest an AR(1). Estimate the model (1,0,0), that is, estimate φ in: Yt = φYt-1 + et. Diagnostic test: check the ACF of the residuals (et). Are they white noise? If not, modify the model.
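
A minimal sketch of this estimation and diagnostic step in Stata, assuming a tsset series stored in a hypothetical variable y:
  arima y, arima(1,0,0)        // estimate the (1,0,0) model Yt = φYt-1 + et
  predict res, residuals       // one-step-ahead residuals
  ac res                       // ACF of the residuals: should look like white noise
  wntestq res                  // portmanteau (Q) test of the white-noise hypothesis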
