1. Applied Econometrics William Greene
Department of Economics
Stern School of Business
2. Applied Econometrics 3. Linear Least Squares
3. Vocabulary Some terms to be used in the discussion.
Population characteristics and entities vs. sample quantities and analogs
Residuals and disturbances
Population regression line and sample regression
Objective: Learn about the conditional mean function. Estimate β and σ²
First step: Mechanics of fitting a line (hyperplane) to a set of data
4. Fitting Criteria The set of points in the sample
Fitting criteria: what are they?
LAD (least absolute deviations)
Least squares
and so on
Why least squares? (We do not call it ordinary at this point.)
A fundamental result: sample moments are good estimators of their population counterparts.
We will spend the next few weeks using this principle and applying it to least squares computation.
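As a quick illustration of that principle, here is a minimal Python sketch (simulated data; the population values μ = 2 and σ = 1.5 are assumptions chosen for the example) showing sample moments settling on their population counterparts as n grows:

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma = 2.0, 1.5                      # assumed population mean and std. deviation
    for n in (100, 10_000, 1_000_000):
        x = rng.normal(mu, sigma, size=n)
        # the sample mean and variance approach mu and sigma**2
        print(n, x.mean(), x.var())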
5. An Analogy Principle In the population, E[y|X] = Xβ, so
E[y − Xβ|X] = 0
Continuing: E[xi εi] = 0
Summing: Σi E[xi εi] = Σi 0 = 0
Exchanging Σi and E: E[Σi xi εi] = E[X′ε] = 0, i.e.,
E[X′(y − Xβ)] = 0
Choose b, the estimator of β, to mimic this population result: i.e., mimic the population mean with the sample mean.
Find b such that (1/n)X′e = (1/n)X′(y − Xb) = 0.
As we will see, the solution is the least squares coefficient vector.
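A minimal sketch of the analogy principle (Python, simulated data; the coefficient values are illustrative): solving the sample moment condition X′(y − Xb) = 0 yields the least squares vector, and the condition then holds exactly in the sample.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # constant plus two regressors
    beta = np.array([1.0, 0.5, -2.0])                           # assumed population coefficients
    y = X @ beta + rng.normal(size=n)                           # y = X beta + eps

    b = np.linalg.solve(X.T @ X, X.T @ y)  # solve the sample moment condition X'Xb = X'y
    e = y - X @ b
    print(b)        # close to beta
    print(X.T @ e)  # zero to machine precision: X'e = 0 holds exactly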
6. Population and Sample Moments We showed that E[εi|xi] = 0 implies Cov[xi, εi] = 0. If so, and if E[y|X] = Xβ, then
β = (Var[xi])⁻¹ Cov[xi, yi].
This provides a population analog to the statistics we compute with the data.
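A hedged numerical check of this population analog (simulated data; note the formula delivers the slope coefficients, the constant then coming from the sample means):

    import numpy as np

    rng = np.random.default_rng(2)
    n = 100_000
    X = rng.normal(size=(n, 2))              # regressors (no constant column here)
    beta = np.array([0.5, -2.0])
    y = 1.0 + X @ beta + rng.normal(size=n)

    V = np.cov(X, rowvar=False)                               # sample Var[x]
    c = np.cov(np.column_stack([X, y]), rowvar=False)[:2, 2]  # sample Cov[x, y]
    print(np.linalg.solve(V, c))  # sample analog of (Var[x])^-1 Cov[x,y]; close to beta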
8. Least Squares The example will be: regress yi = Gi on
xi = [a constant, PGi, Yi] = [1, PGi, Yi]
The fitted equation will be
ŷi = b1xi1 + b2xi2 + ... + bKxiK.
The fitting criterion is based on the residuals:
ei = yi − ŷi = yi − (b1xi1 + b2xi2 + ... + bKxiK)
Make the ei as small as possible.
Form a criterion and minimize it.
9. Fitting Criteria Sum of residuals: Σi ei
Sum of squares: Σi ei²
Sum of absolute values of residuals: Σi |ei|
Absolute value of sum of residuals: |Σi ei|
We focus on Σi ei² now and Σi |ei| later; the sketch below contrasts the two.
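A small contrast between the two criteria in the simplest possible case (a pure location model with simulated data and one gross outlier; minimizing Σi ei² gives the sample mean, minimizing Σi |ei| gives the sample median):

    import numpy as np

    rng = np.random.default_rng(3)
    y = np.append(rng.normal(10.0, 1.0, size=99), 200.0)  # 99 clean observations + 1 outlier

    # Fitting y_i = m + e_i:
    print(y.mean())      # least squares fit: dragged toward the outlier
    print(np.median(y))  # least absolute deviations fit: barely moved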
10. Least Squares Algebra
11. Least Squares Normal Equations Minimizing e′e = (y − Xb)′(y − Xb) with respect to b gives the normal equations: X′Xb = X′y.
12. Least Squares Solution If X′X is nonsingular, the solution is b = (X′X)⁻¹X′y.
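A short computational sketch (simulated data; np.linalg.lstsq, which avoids forming (X′X)⁻¹ explicitly, is a standard numerical practice the slides do not discuss):

    import numpy as np

    rng = np.random.default_rng(4)
    X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
    y = X @ np.array([1.0, 0.5, -2.0]) + rng.normal(size=50)

    b_normal = np.linalg.solve(X.T @ X, X.T @ y)    # direct solution of the normal equations
    b_lstsq = np.linalg.lstsq(X, y, rcond=None)[0]  # SVD-based, better conditioned
    print(np.allclose(b_normal, b_lstsq))           # True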
13. Second Order Conditions The matrix of second derivatives, ∂²(e′e)/∂b∂b′ = 2X′X, must be positive definite for b to be a minimizer.
14. Does b Minimize e′e? Yes: if X has full column rank, 2X′X is positive definite, so b is the unique minimizer.
15. Sample Moments - Algebra
16. Positive Definite Matrix For any c ≠ 0, c′(X′X)c = (Xc)′(Xc) = Σi vi² ≥ 0 where v = Xc, with equality only if Xc = 0; so if X has full column rank, X′X is positive definite.
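This is easy to confirm numerically (simulated X; a Cholesky factorization exists exactly when the matrix is positive definite):

    import numpy as np

    rng = np.random.default_rng(5)
    X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
    XtX = X.T @ X
    L = np.linalg.cholesky(XtX)     # succeeds, so X'X is positive definite
    print(np.linalg.eigvalsh(XtX))  # all eigenvalues strictly positive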
17. Algebraic Results - 1
18. Residuals vs. Disturbances Disturbances (population): εi = yi − xi′β, unobservable. Residuals (sample): ei = yi − xi′b, computable from the data.
19. Algebraic Results - 2 The residual maker: M = I − X(X′X)⁻¹X′
e = y − Xb = y − X(X′X)⁻¹X′y = My
MX = 0 (This result is fundamental!)
How do we interpret this result in terms of residuals? Regressing any column of X on all of X gives a perfect fit, so the residuals are zero.
(Therefore) My = MXb + Me = Me = e
(You should be able to prove this.)
y = Py + My, where P = X(X′X)⁻¹X′ = I − M.
PM = MP = 0.
Py is the projection of y into the column space of X.
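A numerical illustration of M and P (small simulated X; building the full n×n matrices is purely for exposition, not something done in applied work):

    import numpy as np

    rng = np.random.default_rng(6)
    n = 20
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
    y = rng.normal(size=n)

    P = X @ np.linalg.solve(X.T @ X, X.T)  # projection onto the column space of X
    M = np.eye(n) - P                      # the residual maker

    print(np.allclose(M @ X, 0))           # MX = 0
    print(np.allclose(P @ y + M @ y, y))   # y = Py + My
    print(np.allclose(P @ M, 0))           # PM = 0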
20. The M Matrix M = I − X(X′X)⁻¹X′ is an n×n matrix.
M is symmetric: M = M′.
M is idempotent: MM = M. (Just multiply it out.)
M is singular: M⁻¹ does not exist.
(We will prove this later as a side result in another derivation.)
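The same construction confirms these properties numerically (simulated X with n = 20 and K = 3 columns):

    import numpy as np

    rng = np.random.default_rng(7)
    n = 20
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
    M = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)

    print(np.allclose(M, M.T))       # symmetric
    print(np.allclose(M @ M, M))     # idempotent
    print(np.linalg.matrix_rank(M))  # n - K = 17 < n, so M is singular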
21. Results when X Contains a Constant Term X = [1, x2, ..., xK]
The first column of X is a column of ones.
Since X′e = 0, x1′e = 1′e = Σi ei = 0: the residuals sum to zero.
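A one-line check (simulated data with a constant column):

    import numpy as np

    rng = np.random.default_rng(8)
    X = np.column_stack([np.ones(100), rng.normal(size=(100, 2))])
    y = rng.normal(size=100)
    e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    print(e.sum())  # zero to machine precision because X contains a constant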
22. Least Squares Algebra
23. Least Squares
24. Residuals
25. Least Squares Residuals
26. Least Squares Algebra-3
27. Least Squares Algebra-4