
1. Econometrics Lecture Notes. Hayashi, Chapter 6g: Serial Correlation in GMM

2. The Model
• Consider a single-equation GMM model: y_t = z_t'δ + ε_t.
• The model allows for random regressors, with instruments x_t.
• The model allows for conditional heteroscedasticity and serial correlation.
• Let g_t = x_t ε_t.

3. Serial Correlation
• {g_t} has mean zero: E(g_t) = 0.
• Serial correlation in {ε_t}, and hence in {g_t}: Γ_j = E(g_t g_{t-j}'), j = 0, 1, 2, …
• {g_t} satisfies Gordin's condition.
• The long-run covariance matrix S = Σ_{j=-∞}^{∞} Γ_j = Γ_0 + Σ_{j=1}^{∞} (Γ_j + Γ_j') is nonsingular.

4. Serial Correlation
• Given a positive definite weighting matrix Ŵ, the GMM estimator δ̂_GMM(Ŵ) of δ is consistent and asymptotically normal.
• With a consistent estimate of S, the asymptotic variance of δ̂_GMM(Ŵ) is heteroscedasticity and autocorrelation consistent (HAC).
• The GMM estimator achieves the minimum asymptotic variance when plim Ŵ = S^{-1}.

5. Serial Correlation
• Given a consistent estimate of S, the GMM specification-test statistics, such as the t, W, J, C, and LR statistics, remain valid and retain the same asymptotic distributions in the presence of serial correlation.
• What we need, then, is a consistent estimator of the long-run covariance matrix S.

6. Estimating S
• Consistently estimate the individual autocovariances: Γ̂_j = (1/n) Σ_{t=j+1}^{n} ĝ_t ĝ_{t-j}' (j = 0, 1, …, n-1), where ĝ_t = x_t ê_t and ê_t = y_t - z_t'δ̂_GMM.
• If the lag length q is known, then Γ_j = 0 for j > q. Therefore, Ŝ = Σ_{j=-q}^{q} Γ̂_j = Γ̂_0 + Σ_{j=1}^{q} (Γ̂_j + Γ̂_j'). (A code sketch follows.)
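To make the formulas concrete, here is a minimal numpy sketch. The array ghat (n×K, with t-th row ĝ_t = x_t ê_t) and the function names are illustrative choices of mine, not from the notes:

```python
import numpy as np

def autocov(ghat, j):
    """Sample autocovariance: Gamma_hat_j = (1/n) * sum_{t=j+1}^{n} g_t g_{t-j}'."""
    n = ghat.shape[0]
    return ghat[j:].T @ ghat[:n - j] / n

def S_known_lag(ghat, q):
    """S_hat = Gamma_0 + sum_{j=1}^{q} (Gamma_j + Gamma_j') when the lag length q is known."""
    S = autocov(ghat, 0)
    for j in range(1, q + 1):
        G = autocov(ghat, j)
        S = S + G + G.T
    return S
```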

7. Estimating S
• If q is not known, several kernel-based estimators are available: Ŝ = Σ_{j=-n+1}^{n-1} k(j/q(n)) Γ̂_j, where k(·) is the kernel and q(n) is the bandwidth, which increases with the sample size. (A generic sketch follows.)
• Truncated kernel: k(x) = 1 for |x| ≤ 1; 0 for |x| > 1. The truncated-kernel Ŝ is not guaranteed to be positive semidefinite in finite samples.
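The kernel formula can be coded once and reused for any kernel, folding the j < 0 terms in via Γ̂_{-j} = Γ̂_j'. This sketch reuses autocov from above; the names are again mine:

```python
def S_kernel(ghat, kernel, bandwidth):
    """Kernel estimator: S_hat = sum_{j=-n+1}^{n-1} k(j / q(n)) * Gamma_hat_j."""
    n = ghat.shape[0]
    S = autocov(ghat, 0)  # j = 0 term; k(0) = 1 for the kernels used here
    for j in range(1, n):
        w = kernel(j / bandwidth)
        if w != 0.0:
            G = autocov(ghat, j)
            S = S + w * (G + G.T)  # Gamma_{-j} = Gamma_j' folds into G + G'
    return S

def truncated_kernel(x):
    """k(x) = 1 for |x| <= 1, 0 otherwise."""
    return 1.0 if abs(x) <= 1.0 else 0.0
```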

8. Estimating S
• Bartlett kernel: k(x) = 1 - |x| for |x| ≤ 1; 0 for |x| > 1.
• The Bartlett-kernel Ŝ is called the Newey-West estimator. It can be made nonnegative definite in finite samples. For example, for q(n) = 3, we have Ŝ = Γ̂_0 + (2/3)(Γ̂_1 + Γ̂_1') + (1/3)(Γ̂_2 + Γ̂_2'). (Sketched below.)
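Plugging the Bartlett kernel into the generic S_kernel above yields a Newey-West estimate; with bandwidth 3 the weights are exactly the 2/3 and 1/3 of the example:

```python
def bartlett_kernel(x):
    """k(x) = 1 - |x| for |x| <= 1, 0 otherwise (the Newey-West weights)."""
    return max(0.0, 1.0 - abs(x))

# With q(n) = 3: k(1/3) = 2/3, k(2/3) = 1/3, k(3/3) = 0, so
# S_hat = Gamma_0 + (2/3)(Gamma_1 + Gamma_1') + (1/3)(Gamma_2 + Gamma_2').
# S_nw = S_kernel(ghat, bartlett_kernel, bandwidth=3)
```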

9. Estimating S
• Quadratic spectral (QS) kernel: k(x) = [25/(12π²x²)] [sin(6πx/5)/(6πx/5) - cos(6πx/5)], with k(0) = 1.
• Since k(x) ≠ 0 for |x| > 1 in the QS kernel, all the estimated autocovariances Γ̂_j (j = 0, 1, …, n-1) enter the calculation of Ŝ even if q(n) < n-1.
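A sketch of the QS kernel; the slide's own formula was lost in transcription, so the closed form above and below follows Andrews (1991). Because its weights are nonzero beyond |x| = 1, S_kernel would touch every Γ̂_j:

```python
def qs_kernel(x):
    """Quadratic spectral kernel; k(x) != 0 even for |x| > 1."""
    if x == 0.0:
        return 1.0  # k(0) = 1 by continuity
    a = 6.0 * np.pi * x / 5.0
    return 25.0 / (12.0 * np.pi**2 * x**2) * (np.sin(a) / a - np.cos(a))
```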

10. Conditional Homoscedasticity
• With g_t = x_t ε_t, the autocovariances E(g_t g_{t-j}') = E(ε_t ε_{t-j} x_t x_{t-j}') mix the error autocovariances E(ε_t ε_{t-j}) with the instruments.
• Conditional homoscedasticity with serial correlation: E(ε_t ε_{t-j} | x_t, x_{t-j}) = ω_j (j = 0, 1, 2, …).
• Then, by the law of iterated expectations, Γ_j = E(g_t g_{t-j}') = E(ε_t ε_{t-j} x_t x_{t-j}') = ω_j E(x_t x_{t-j}').

11. Conditional Homoscedasticity
• Estimating Γ_j:
• Let ê_t be the residual estimate of ε_t; ω_j is consistently estimated by ω̂_j = (1/n) Σ_{t=j+1}^{n} ê_t ê_{t-j}.
• E(x_t x_{t-j}') is consistently estimated by (1/n) Σ_{t=j+1}^{n} x_t x_{t-j}'.
• Γ̂_j = [(1/n) Σ_{t=j+1}^{n} ê_t ê_{t-j}] [(1/n) Σ_{t=j+1}^{n} x_t x_{t-j}'] = ω̂_j (1/n) Σ_{t=j+1}^{n} x_t x_{t-j}'.
• Estimating S: Ŝ = Γ̂_0 + Σ_{j=1}^{q} (Γ̂_j + Γ̂_j'). (A sketch follows.)
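Under conditional homoscedasticity Γ̂_j factorizes into a scalar residual autocovariance times a matrix of instrument cross-products, which the following sketch exploits (x is n×K, ehat has length n; names are mine):

```python
def S_cond_homo(x, ehat, q):
    """S_hat under conditional homoscedasticity:
    Gamma_hat_j = omega_hat_j * (1/n) sum_{t=j+1}^{n} x_t x_{t-j}',
    with omega_hat_j = (1/n) sum_{t=j+1}^{n} ehat_t ehat_{t-j}.
    """
    n = x.shape[0]

    def gamma(j):
        omega_j = ehat[j:] @ ehat[:n - j] / n       # scalar residual autocovariance
        return omega_j * (x[j:].T @ x[:n - j]) / n  # times instrument cross-products

    S = gamma(0)
    for j in range(1, q + 1):
        G = gamma(j)
        S = S + G + G.T
    return S
```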

12. Conditional Homoscedasticity
• Let Ω be the n×n autocovariance matrix of {ε_t}, with (t, s) element ω_{|t-s|}.

13. Conditional Homoscedasticity
• If the lag length q is known, then Γ_j = 0 for j > q. Therefore, Ŝ = Γ̂_0 + Σ_{j=1}^{q} (Γ̂_j + Γ̂_j'), with Γ̂_j as on the previous slide.
• If q is not known, for the Bartlett kernel, let Ŝ = Σ_{j=-n+1}^{n-1} k(j/q(n)) Γ̂_j, with k(x) = 1 - |x| for |x| ≤ 1 and 0 otherwise.

14. Conditional Homoscedasticity
• Single-equation GMM under conditional homoscedasticity and serial correlation reduces to 2SLS with serial correlation.
• Let X = [x_t, t = 1, 2, …, n]'; Z and y are the data matrices of the regressors and the dependent variable.

15. Conditional Homoscedasticity
• δ̂_2SLS = [Z'X(X'Ω̂X)^{-1}X'Z]^{-1} Z'X(X'Ω̂X)^{-1} X'y
• The consistent estimate of the asymptotic variance-covariance matrix of δ̂_2SLS is [Z'X(X'Ω̂X)^{-1}X'Z]^{-1}.
• If Z = X, then δ̂_2SLS = (Z'Z)^{-1}Z'y = δ̂_OLS, with the consistent variance-covariance estimate [Z'Z(Z'Ω̂Z)^{-1}Z'Z]^{-1}. (A sketch follows.)
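A direct transcription of these matrix formulas into numpy; Omega stands for the estimated n×n error autocovariance matrix Ω̂, built as on slides 11-13 (the function name is mine):

```python
def gmm_2sls_serial(y, Z, X, Omega):
    """2SLS with serial correlation:
    delta = [Z'X (X'Omega X)^{-1} X'Z]^{-1} Z'X (X'Omega X)^{-1} X'y,
    with variance estimate V = [Z'X (X'Omega X)^{-1} X'Z]^{-1}.
    """
    XOX_inv = np.linalg.inv(X.T @ Omega @ X)
    A = Z.T @ X @ XOX_inv           # Z'X (X'Omega X)^{-1}
    V = np.linalg.inv(A @ X.T @ Z)  # the slide's variance estimate
    delta = V @ (A @ (X.T @ y))
    return delta, V
```

For clarity this forms Ω̂ as a dense n×n matrix; in practice one would exploit its band structure (ω̂_j = 0 for j > q) rather than materializing it.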

16. Conditional Homoscedasticity
• Given a consistent estimator Ω̂ of Ω, δ̂_GLS = (Z'Ω̂^{-1}Z)^{-1} Z'Ω̂^{-1}y.
• The consistency of the GLS estimator is not guaranteed in general.
• However, there is one important special case where GLS is consistent: when the error is a finite-order autoregressive process.
• Next: GLS estimation for the AR(p) error process.

17. GLS for AR(1) Process
• The model:
  y_t = z_t'δ + ε_t
  ε_t = φ ε_{t-1} + u_t
• Autocovariances:
  γ_0 = σ²/(1 - φ²)
  γ_1 = φ γ_0 = σ² φ/(1 - φ²)
  γ_j = φ γ_{j-1} = σ² φ^j/(1 - φ²) for j > 1

18. GLS for AR(1) Process
• Var(ε) = [σ²/(1 - φ²)] V, where V is the n×n matrix with (i, j) element φ^{|i-j|}.
• V^{-1} is proportional to C'C, where C is the Prais-Winsten transformation matrix: the first row of C is (√(1-φ²), 0, …, 0), and row t (t ≥ 2) has -φ in column t-1, 1 in column t, and zeros elsewhere. With this C, C'C = (1 - φ²) V^{-1}, so Cε has variance matrix σ²I.

19. GLS for AR(1) Process
• y* = Cy, Z* = CZ, ε* = Cε = u ~ N(0, σ²I)
• Transformed model: y* = Z*δ + u

20. GLS for AR(1) Process
• The GLS estimator δ̂_GLS is the OLS estimator for the transformed model y* = Z*δ + u:
  δ̂_GLS = (Z*'Z*)^{-1} Z*'y*
• Var(δ̂_GLS) = σ² (Z*'Z*)^{-1}
• Est[Var(δ̂_GLS)] = s² (Z*'Z*)^{-1}, where s² = (y* - Z*δ̂_GLS)'(y* - Z*δ̂_GLS)/(n - L) and L is the number of regressors. (A sketch follows.)
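A sketch of slides 17-20 in code, assuming φ is known; in practice φ is replaced by an estimate (e.g. from regressing residuals on their own lag, iterating if desired). It applies the rows of the Prais-Winsten C directly instead of building the matrix:

```python
def gls_ar1(y, Z, phi):
    """GLS for y_t = z_t' delta + eps_t with AR(1) errors eps_t = phi*eps_{t-1} + u_t."""
    n, L = Z.shape
    c = np.sqrt(1.0 - phi**2)

    # Prais-Winsten transform: scale the first observation, quasi-difference the rest.
    y_star = np.empty(n)
    Z_star = np.empty_like(Z, dtype=float)
    y_star[0] = c * y[0]
    Z_star[0] = c * Z[0]
    y_star[1:] = y[1:] - phi * y[:-1]
    Z_star[1:] = Z[1:] - phi * Z[:-1]

    # OLS on the transformed model y* = Z* delta + u.
    delta, *_ = np.linalg.lstsq(Z_star, y_star, rcond=None)
    resid = y_star - Z_star @ delta
    s2 = resid @ resid / (n - L)                       # s^2 = u_hat'u_hat / (n - L)
    var_delta = s2 * np.linalg.inv(Z_star.T @ Z_star)  # s^2 (Z*'Z*)^{-1}
    return delta, var_delta
```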
