
Basic Econometrics


Presentation Transcript


  1. Basic Econometrics Chapter 7 MULTIPLE REGRESSION ANALYSIS: The Problem of Estimation Prof. Himayatullah

  2. 7-1. The Three-Variable Model: Notation and Assumptions
  • Yi = β1 + β2X2i + β3X3i + ui (7.1.1)
  • β2, β3 are partial regression coefficients
  • With the following assumptions:
  + Zero mean value of ui: E(ui | X2i, X3i) = 0 for each i (7.1.2)
  + No serial correlation: Cov(ui, uj) = 0, i ≠ j (7.1.3)
  + Homoscedasticity: Var(ui) = σ² (7.1.4)
  + Zero covariance between ui and each X: Cov(ui, X2i) = Cov(ui, X3i) = 0 (7.1.5)
  + No specification bias, i.e., the model is correctly specified (7.1.6)
  + No exact collinearity between the X variables (7.1.7): no perfect multicollinearity; if an exact linear relationship exists among them, the X variables are said to be linearly dependent
  + The model is linear in parameters
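As a concrete illustration of these assumptions, here is a minimal Python sketch (not from the text) that simulates data from model (7.1.1) under them; the parameter values β1 = 4, β2 = 1.5, β3 = −0.8 and σ = 2 are hypothetical, chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
X2 = rng.uniform(0, 10, n)        # regressors with no exact linear relation (7.1.7)
X3 = rng.uniform(0, 5, n)
u = rng.normal(0, 2.0, n)         # zero mean (7.1.2), homoscedastic (7.1.4);
                                  # independent draws, so no serial corr. (7.1.3)
                                  # and u is unrelated to X2, X3 (7.1.5)
Y = 4 + 1.5 * X2 - 0.8 * X3 + u   # model (7.1.1), linear in parameters
```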

  3. 7-2. Interpretation of Multiple Regression
  • E(Yi | X2i, X3i) = β1 + β2X2i + β3X3i (7.2.1)
  • (7.2.1) gives the conditional mean or expected value of Y, conditional upon the given or fixed values of X2 and X3

  4. 7-3. The Meaning of Partial Regression Coefficients
  • Yi = β1 + β2X2i + β3X3i + … + βkXki + ui
  • βk measures the change in the mean value of Y per unit change in Xk, holding the remaining explanatory variables constant. It gives the "direct" effect of a unit change in Xk on E(Yi), net of the Xj (j ≠ k); see the sketch below
  • How to control for the "true" effect of a unit change in Xk on Y? (read pages 195-197)
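One way to see this "net of the other regressors" interpretation is partialling out (the Frisch-Waugh-Lovell result, which the slide does not name): the multiple-regression coefficient on X2 equals the coefficient from regressing Y-residuals on X2-residuals, both taken after removing X3. A minimal sketch with hypothetical simulated data:

```python
import numpy as np

def ols(y, X):
    # OLS coefficients of y on X (X already contains any constant column)
    return np.linalg.lstsq(X, y, rcond=None)[0]

rng = np.random.default_rng(1)
n = 200
X2 = rng.normal(size=n)
X3 = 0.5 * X2 + rng.normal(size=n)                 # correlated regressors
Y = 1 + 2.0 * X2 + 3.0 * X3 + rng.normal(size=n)

const = np.ones(n)
b_full = ols(Y, np.column_stack([const, X2, X3]))  # full multiple regression

# "Net of X3": residualize Y and X2 on (const, X3), then regress residuals
Z = np.column_stack([const, X3])
rY = Y - Z @ ols(Y, Z)
rX2 = X2 - Z @ ols(X2, Z)
b2_partial = ols(rY, rX2.reshape(-1, 1))[0]

print(b_full[1], b2_partial)   # identical: both are the partial coefficient on X2
```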

  5. 7-4. OLS and ML Estimation of the Partial Regression Coefficients
  • This section (pages 197-201) provides:
  1. The OLS estimators in the case of the three-variable regression Yi = β1 + β2X2i + β3X3i + ui
  2. Variances and standard errors of the OLS estimators
  3. Eight properties of the OLS estimators (pp. 199-201)
  4. An introduction to the ML estimators
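Equivalently, in matrix form the OLS estimator is β̂ = (X′X)⁻¹X′y with Var(β̂) = σ²(X′X)⁻¹, where σ² is estimated by RSS/(n − k). A minimal sketch, using hypothetical simulated data rather than the book's:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
X2, X3 = rng.normal(size=(2, n))
Y = 1 + 2.0 * X2 - 1.0 * X3 + rng.normal(size=n)

X = np.column_stack([np.ones(n), X2, X3])     # design matrix with intercept
k = X.shape[1]                                # number of parameters

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ Y                  # OLS estimator
resid = Y - X @ beta_hat
sigma2_hat = resid @ resid / (n - k)          # unbiased estimator of sigma^2
se = np.sqrt(sigma2_hat * np.diag(XtX_inv))   # standard errors
print(beta_hat, se)
```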

  6. 7-5. The Multiple Coefficient of Determination R² and the Multiple Coefficient of Correlation R
  • This section provides:
  1. The definition of R² in the context of multiple regression, analogous to r² in the two-variable case
  2. R = √R² is the coefficient of multiple correlation; it measures the degree of association between Y and all the explanatory variables jointly
  3. The variance of a partial regression coefficient: Var(β̂k) = (σ² / Σx²k) · (1 / (1 − R²k)) (7.5.6), where β̂k is the partial regression coefficient of regressor Xk, the x_k are the deviations of Xk from its mean, and R²k is the R² in the regression of Xk on the remaining regressors
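Formula (7.5.6) can be checked numerically: the auxiliary-regression route gives exactly the same Var(β̂k) as the corresponding diagonal element of σ̂²(X′X)⁻¹. A minimal sketch with hypothetical simulated data:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100
X2 = rng.normal(size=n)
X3 = 0.6 * X2 + rng.normal(size=n)
Y = 1 + 2.0 * X2 + 3.0 * X3 + rng.normal(size=n)

X = np.column_stack([np.ones(n), X2, X3])
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
resid = Y - X @ beta_hat
sigma2 = resid @ resid / (n - X.shape[1])

# Auxiliary regression of X2 on the remaining regressors gives R2_k
Z = np.column_stack([np.ones(n), X3])
fit = Z @ np.linalg.lstsq(Z, X2, rcond=None)[0]
R2_k = 1 - np.sum((X2 - fit) ** 2) / np.sum((X2 - X2.mean()) ** 2)

x2 = X2 - X2.mean()                              # deviation form
var_b2 = sigma2 / (x2 @ x2) * (1 / (1 - R2_k))   # formula (7.5.6)
print(var_b2, sigma2 * np.linalg.inv(X.T @ X)[1, 1])   # the two agree
```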

  7. 7-6. Example 7.1: The Expectations-Augmented Phillips Curve for the US (1970-1982)
  • This section provides an illustration of the ideas introduced in the chapter
  • Regression model (7.6.1)
  • The data set is in Table 7.1

  8. 7-7. Simple Regression in the Context of Multiple Regression: Introduction to Specification Bias
  • This section shows what happens when a simple regression is fitted where a multiple regression is appropriate: the omitted variables cause specification bias, which is discussed further in Chapter 13

  9. 7-8. R² and the Adjusted R²
  • R² is a non-decreasing function of the number of explanatory variables: an additional X variable will not decrease R²
  • R² = ESS/TSS = 1 − RSS/TSS = 1 − Σû²i / Σy²i (7.8.1)
  • This creates an incentive to keep adding (possibly irrelevant) variables to the regression, which motivates an adjusted R² (R̄²) that takes the degrees of freedom into account
  • R̄² = 1 − [Σû²i / (n − k)] / [Σy²i / (n − 1)] (7.8.2), or R̄² = 1 − σ̂² / S²Y, where S²Y is the sample variance of Y and k is the number of parameters including the intercept term
  • Substituting (7.8.1) into (7.8.2) gives R̄² = 1 − (1 − R²)(n − 1)/(n − k) (7.8.4); see the sketch below
  • For k > 1, R̄² < R²; thus as the number of X variables increases, R̄² increases by less than R², and R̄² can be negative
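A small sketch of (7.8.4), showing that R̄² penalizes extra regressors and can even be negative; the R² values, n, and k below are hypothetical, chosen only for illustration:

```python
def adjusted_r2(r2, n, k):
    """R2-bar from (7.8.4): 1 - (1 - R2)(n - 1)/(n - k),
    where k counts all parameters including the intercept."""
    return 1 - (1 - r2) * (n - 1) / (n - k)

# Adding a regressor can never lower R2, but it can lower R2-bar:
print(adjusted_r2(0.40, n=20, k=3))   # ~0.329
print(adjusted_r2(0.41, n=20, k=4))   # ~0.299, lower despite the higher R2
# R2-bar can be negative when R2 is small relative to k:
print(adjusted_r2(0.05, n=20, k=5))   # ~-0.203
```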

  10. 7-8. R² and the Adjusted R² (continued)
  • Comparing two R² values: to compare, the sample size n and the dependent variable must be the same
  • Example 7-2: Coffee Demand Function Revisited (page 210)
  • The "game" of maximizing the adjusted R²: choosing the model that gives the highest R̄² may be dangerous, for in regression analysis our objective is not a high R̄² per se but dependable estimates of the true population regression coefficients, and statistical inference about them
  • We should be more concerned with the logical or theoretical relevance of the explanatory variables to the dependent variable, and with their statistical significance

  11. 7-9. Partial Correlation Coefficients
  • This section provides:
  1. An explanation of simple and partial correlation coefficients
  2. The interpretation of simple and partial correlation coefficients (pages 211-214)
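For the three-variable case, the first-order partial correlation between Y (variable 1) and X2 (variable 2), holding X3 (variable 3) constant, follows the standard formula r12.3 = (r12 − r13·r23) / √[(1 − r13²)(1 − r23²)]. A minimal sketch with hypothetical simple-correlation values:

```python
import math

def partial_corr(r12, r13, r23):
    # First-order partial correlation r_{12.3}: the correlation between
    # variables 1 and 2 after removing the linear influence of variable 3
    return (r12 - r13 * r23) / math.sqrt((1 - r13**2) * (1 - r23**2))

print(partial_corr(r12=0.8, r13=0.5, r23=0.6))   # ~0.722
```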

  12. 7-10. Example 7.3: The Cobb-Douglas Production Function; More on Functional Form
  • Yi = β1 · X2i^β2 · X3i^β3 · e^ui (7.10.1)
  • Taking logs of this model: ln Yi = ln β1 + β2 ln X2i + β3 ln X3i + ui = β0 + β2 ln X2i + β3 ln X3i + ui (7.10.2), where β0 = ln β1
  • The data set is in Table 7.3; the results are reported on page 216
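A minimal sketch of estimating (7.10.2) by OLS after the log transform; the data are simulated and the parameter values (β1 = 2, β2 = 0.7, β3 = 0.3) are hypothetical, not those of Table 7.3:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50
X2 = rng.uniform(50, 300, n)                 # e.g. labor input (hypothetical)
X3 = rng.uniform(10, 100, n)                 # e.g. capital input (hypothetical)
u = rng.normal(0, 0.05, n)
Y = 2.0 * X2**0.7 * X3**0.3 * np.exp(u)      # model (7.10.1)

# The log transform makes the model linear in the parameters, as in (7.10.2)
Z = np.column_stack([np.ones(n), np.log(X2), np.log(X3)])
b0, b2, b3 = np.linalg.lstsq(Z, np.log(Y), rcond=None)[0]
print(np.exp(b0), b2, b3)   # recovers beta1 = exp(beta0) and the elasticities
```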

  13. 7-11. Polynomial Regression Models
  • Yi = β0 + β1Xi + β2Xi² + … + βkXi^k + ui (7.11.3)
  • Example 7.4: Estimating the Total Cost Function (see the sketch below)
  • The data set is in Table 7.4
  • The empirical results are on page 221
  --------------------------------------------------------------
  • 7-12. Summary and Conclusions (page 221)
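A minimal sketch of fitting a cubic total cost function by OLS: the model is nonlinear in X but still linear in the β's, so ordinary least squares applies directly. The data here are simulated and hypothetical, not those of Table 7.4:

```python
import numpy as np

rng = np.random.default_rng(5)
output = np.arange(1.0, 11.0)                 # hypothetical output levels
cost = (300 + 40 * output - 6 * output**2
        + 0.7 * output**3 + rng.normal(0, 5, output.size))

# Regressors 1, X, X^2, X^3: a polynomial model is linear in the betas
X = np.column_stack([output**p for p in range(4)])
beta_hat, *_ = np.linalg.lstsq(X, cost, rcond=None)
print(beta_hat)   # estimates of beta0..beta3
```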
