
What Can We Learn from Education Production Studies?


Presentation Transcript


  1. What Can We Learn from Education Production Studies? E. Anthon Eff & Christopher C. Klein Forthcoming Eastern Economic Journal

  2. Abstract • The results of a Becker/Peltzman/Stigler model of local school district decision-making yield biased or inconsistent efficiency measures when some school outputs are not measured. Empirical investigation of data for 95 Tennessee counties in the 1999-2000 academic year finds that DEA efficiency measures, and efficiency rankings based on those measures, are highly sensitive to changes in the number of output measures used. An artifact of the DEA process causes efficiency scores to correlate increasingly with the inverse of per pupil expenditures as more outputs are included. Hence, high-stakes policy initiatives should not be based on such scores.

  3. Education Production Literature • Voluminous literature, but none looks at the effect of omitting outputs on efficiency scores • Pritchett & Filmer [1999] – input distortions • Wenger [2000] – multiple educational outputs • Ruggiero [2003], Bifulco & Bretschneider [2003] – measuring one output with error • DEA literature – effect of omitted inputs • Galagedera and Silvapulle [2003]; Saen, Memariani, and Lotfi [2005]; Pedraja-Chaparro, Salinas-Jiménez, and Smith [1999]

  4. Theoretical Model • School district politicians maximize a political value function V subject to a budget constraint: • max V(Q1, Q2; Z) s.t. C(Q1, Q2; Z) = B(Z) • Q1 and Q2 are educational outputs • Z is the socioeconomic status of the district's students • The cost function C and the budget B may both shift with Z

  5. Theoretical Model • f.o.c. imply VQ1/VQ2 = CQ1/CQ2, i.e., the marginal political value of each output is proportional to its marginal cost (see the derivation sketch below) • Evaluate dQi/dZ • Case I: (dQ1/dZ)/Q1 = (dQ2/dZ)/Q2 (Fig. 1) • Case II: BZ = CZ = 0 (Fig. 2) • Case III: BZ > 0, CZ < 0 (Fig. 3) • Compare efficiency measure when one output is observed to that of two outputs (Table 1)
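
A sketch of how the first-order conditions follow from the model on slide 4, assuming the standard Lagrangian approach; the multiplier λ and the subscript notation are assumptions, since the slide's original equations are not reproduced in the transcript:

```latex
% Constrained maximization: max V(Q_1, Q_2; Z) s.t. C(Q_1, Q_2; Z) = B(Z)
\begin{align*}
\mathcal{L} &= V(Q_1, Q_2; Z) + \lambda\left[B(Z) - C(Q_1, Q_2; Z)\right] \\
\frac{\partial\mathcal{L}}{\partial Q_i} &= V_{Q_i} - \lambda\, C_{Q_i} = 0,
  \qquad i = 1, 2 \\
\Rightarrow\quad \frac{V_{Q_1}}{V_{Q_2}} &= \frac{C_{Q_1}}{C_{Q_2}}
\end{align*}
```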

  6. Figure 1

  7. Figure 2: BZ = CZ = 0

  8. Figure 3: BZ > 0, CZ < 0

  9. Figure 4

  10. Empirical Implications • The mix of school outputs chosen changes as the socioeconomic conditions of students change • Failure to measure all outputs will bias efficiency measures against districts with many unmeasured outputs, making those districts appear less efficient • Efficiency measures and rankings will change as more outputs are measured

  11. DATA: Table 2 • County-level data for Tennessee public schools in the 1999-2000 academic year • 20 initial academic and sports variables are reduced to six factors by applying an oblique rotation to their principal components (see the sketch below) • Six additional variables are constructed from inputs, other outputs, and demographics • Total of 12 consolidated output measures
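
A minimal sketch of the output-reduction step, with synthetic stand-in data. The factor_analyzer package, the oblimin rotation, and the array names are assumptions; the slide states only that an oblique rotation was applied to the principal components:

```python
import numpy as np
from factor_analyzer import FactorAnalyzer  # assumed third-party package

# Synthetic stand-in for the 20 raw academic/sports measures across
# the 95 Tennessee counties (the paper's actual data are not shown).
raw = np.random.default_rng(0).normal(size=(95, 20))

# Principal-components extraction followed by an oblique rotation,
# keeping six factors as in Table 2.
fa = FactorAnalyzer(n_factors=6, method="principal", rotation="oblimin")
fa.fit(raw)
factors = fa.transform(raw)  # (95, 6) consolidated output factors
print(factors.shape)
```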

  12. DEA Efficiency Scores • Maximize the ratio of weighted outputs to weighted inputs by selecting the weights most favorable to each county (see the LP sketch below) • Every nonempty combination of the 12 outputs: 2^12 − 1 = 4,095 output sets • For each of 95 Tennessee counties: 4,095 × 95 = 389,025 efficiency scores
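
A minimal sketch of one way to compute such a score, using the input-oriented CCR multiplier model solved as a linear program. The use of scipy, the function name, and the toy data are assumptions; the slides do not specify which DEA formulation the authors used:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_score(Y, X, j0):
    """DEA efficiency of unit j0 (CCR, input-oriented, multiplier form).
    Y: (n_units, n_outputs) outputs; X: (n_units, n_inputs) inputs."""
    n, s = Y.shape
    m = X.shape[1]
    # Decision variables: output weights u (length s), input weights v (length m).
    c = np.concatenate([-Y[j0], np.zeros(m)])       # maximize u . y_j0
    A_eq = np.concatenate([np.zeros(s), X[j0]])[None, :]
    b_eq = [1.0]                                    # normalize v . x_j0 = 1
    A_ub = np.hstack([Y, -X])                       # u . y_j - v . x_j <= 0, all j
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun                                 # theta in (0, 1]

# Toy data: 95 counties, 12 outputs, one input (per pupil expenditure).
rng = np.random.default_rng(0)
Y = rng.uniform(0.5, 1.0, (95, 12))
X = rng.uniform(5.0, 12.0, (95, 1))
print(dea_ccr_score(Y, X, j0=0))
```

Scoring every nonempty output subset for every county would loop this function over the 4,095 column subsets of Y.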

  13. Second Stage Tobit Procedure • In an attempt to purge the raw DEA scores of the effects of non-discretionary factors, the residuals of a tobit regression are analyzed (see the sketch below) • θi = Ziβ + Xiγ + εi, estimated as a tobit with the scores censored at θ = 1 • Z contains input and cost factors; X contains output demand factors • The results are shown in Table 4
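
A minimal sketch of such a second-stage tobit, right-censored at θ = 1 and estimated by maximum likelihood. The censoring point, the variable names, and the hand-rolled likelihood are assumptions, not the authors' actual procedure:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_negloglik(params, y, W, cens=1.0):
    """Negative log-likelihood of a regression right-censored at `cens`."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = W @ beta
    at_limit = y >= cens
    ll = np.where(
        at_limit,
        norm.logsf((cens - xb) / sigma),            # P(theta* >= 1)
        norm.logpdf((y - xb) / sigma) - log_sigma,  # density of interior scores
    )
    return -ll.sum()

def fit_tobit(theta, W):
    """theta: (n,) DEA scores in (0, 1]; W: (n, k) design matrix with a
    constant plus the cost factors Z and the output demand factors X."""
    start = np.concatenate([np.linalg.lstsq(W, theta, rcond=None)[0], [0.0]])
    res = minimize(tobit_negloglik, start, args=(theta, W), method="BFGS")
    beta = res.x[:-1]
    residuals = theta - W @ beta  # the "purged" scores analyzed in Table 4
    return beta, residuals

# Toy usage with synthetic scores and regressors:
rng = np.random.default_rng(0)
W = np.column_stack([np.ones(95), rng.normal(size=(95, 3))])
theta = np.clip(0.8 + rng.normal(0.0, 0.1, 95), 0.05, 1.0)
beta, resid = fit_tobit(theta, W)
```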

  14. Effects of Omitting Outputs • As the number of outputs increases, the median difference from the 12-output efficiency measure falls and the range of differences shrinks. • As the number of outputs increases, the median change in efficiency rank remains at zero, but the range shrinks. • Even with 11 outputs, ranks for some counties can differ considerably from the ranks given by the 12-output measure. • Rankings change as new outputs are introduced. A newly introduced output is produced heavily in some counties and not in others, so that its inclusion increases efficiency more in some counties than in others.
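
A brief sketch of how the score and rank comparisons reported here could be computed, with synthetic stand-ins for the k-output and 12-output scores (all names and data are illustrative):

```python
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(1)
theta_12 = rng.uniform(0.6, 1.0, 95)   # toy 12-output scores
# Toy k-output scores: dropping outputs can only lower a DEA score.
theta_k = np.clip(theta_12 - rng.uniform(0.0, 0.2, 95), 0.05, 1.0)

diff = theta_12 - theta_k              # differences from 12-output measure
rank_change = rankdata(-theta_12) - rankdata(-theta_k)
print(np.median(diff), (diff.min(), diff.max()))                    # median, range
print(np.median(rank_change), (rank_change.min(), rank_change.max()))
```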

  15. Figure 6

  16. Collinearity of Missing Outputs and Included Outputs • Omitting an output collinear with measured outputs might have little effect on efficiency. • For each county, we compare each of the 4,094 efficiency measures with fewer than 12 outputs to the measure produced by adding exactly one output, for a total of 24,564 comparisons per county. • Doing this for each of the 95 counties, we obtain 24,564 × 95 = 2,333,580 observations. • Table 5 shows the correlation of the R² between the newly entered output and the preexisting outputs with Δθ, the change in efficiency resulting from including the new output (see the sketch below). • All but two show the expected negative correlation, but the magnitude is small relative to the correlation between Δθ and the county's production of the output. • Including an output for which a county has a comparative advantage notably increases its efficiency score, even when that output is collinear with included outputs.
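
A minimal sketch of one such comparison, pairing the R² from regressing a candidate output on the already-included outputs with the efficiency change Δθ. It reuses the hypothetical dea_ccr_score from the earlier sketch; all names are illustrative:

```python
import numpy as np

def r_squared(y, X_incl):
    """R^2 from regressing candidate output y on the included outputs."""
    D = np.column_stack([np.ones(len(y)), X_incl])
    beta, *_ = np.linalg.lstsq(D, y, rcond=None)
    return 1.0 - (y - D @ beta).var() / y.var()

# For a county j0, included-output index list S, and candidate output o:
#   r2 = r_squared(Y[:, o], Y[:, S])
#   dtheta = dea_ccr_score(Y[:, S + [o]], X, j0) - dea_ccr_score(Y[:, S], X, j0)
# Table 5 correlates r2 with dtheta across all 2,333,580 such pairs.
```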

  17. Resources, Outputs, and Efficiency • Contrary to our expectations, the correlation between per pupil expenditure and θ becomes more strongly negative as more outputs are added (see Fig. 6). • In other words, as more outputs are included, money appears to have progressively less effect on outputs, so that the most efficient counties are simply those that spend the least.

  18. An Artifact of DEA? • As one adds outputs to the DEA numerator, the variance of the numerator decreases, but the variance of the denominator is constant. • The variance of the numerator is the sum of the covariances between outputs, weighted by the products of the output weights: Var(Σr ur·yr) = Σr Σs ur·us·Cov(yr, ys). • The output weights ur are less than one and decline as outputs are added, making their products progressively smaller.

  19. An Artifact of DEA? • With θ = u·w, where u is the weighted-output numerator and w is the scaled inverse of per pupil expenditure: Var(θ) = E[u]²·Var(w) + E[w]²·Var(u) + 2·E[u]·E[w]·E11 + 2·E[u]·E12 + 2·E[w]·E21 + E22 − E11², where Eij = E[(Δu)^i(Δw)^j], Δu = u − E[u], and Δw = w − E[w]. • As more outputs are added, Var(u) and E22 become progressively smaller, Var(w) is unchanged, and all else changes very little. • As a result, θ will correlate ever more strongly with w, the scaled inverse of per pupil expenditures, as outputs are added (see the simulation sketch below).
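
A toy simulation (synthetic data, not the paper's) illustrating the mechanism: equal weights 1/s stand in for the declining DEA output weights, so Var(u) shrinks as the number of included outputs s grows, and θ = u·w tracks w ever more closely:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 95                                    # counties
spend = rng.uniform(5.0, 12.0, n)         # toy per pupil expenditure
w = (1.0 / spend) / (1.0 / spend).max()   # scaled inverse expenditure

for s in (1, 3, 6, 12):                   # number of included outputs
    Y = rng.uniform(0.5, 1.0, (n, s))     # toy output levels
    u = Y @ np.full(s, 1.0 / s)           # equal weights proxy for u_r
    theta = u * w
    print(f"s={s:2d}  Var(u)={u.var():.4f}  "
          f"corr(theta, w)={np.corrcoef(theta, w)[0, 1]:.3f}")
```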

  20. An Artifact of DEA? • Thus, the “resources don’t matter” conclusion of the education production literature is an artifact of the DEA model: an artifact that becomes more of a problem the more outputs one attempts to include. • Can we use the tobit procedure to correct efficiency scores based only on academic outputs for the effects of unmeasured outputs? • We still cannot distinguish counties with high demand for expensive outputs from those that are simply inefficient (Fig. 7).

  21. Conclusions • Empirical analysis generally supports the implications of the theoretical model • Different counties emphasize different outputs • Efficiency increases as more outputs are included • Efficiency rankings of counties also change—in unpredictable ways—as new outputs are included. • High SES counties spend the most per student, are likely to have the most unmeasured outputs, and tend to have the lowest efficiency.

  22. Conclusions • Without fully accounting for the output space, one cannot know the true efficiency score, and cannot even know a provider’s true efficiency rank. • Due to a quirk of DEA, adding new outputs is likely to exaggerate the inefficiency of high-spending districts. • A tobit correction for variations in output demand may be misleading if the forces shaping that demand are not well understood.

  23. Conclusions • Efficiency measures are currently too unreliable to use in state funding formulas or in any high-stakes educational policy, as some propose. • Develop more accurate efficiency measures: • How best to control the variances in a DEA measure, so that efficiency is not spuriously correlated with per pupil expenditure? • Understand the demand for outputs within local school districts, so that omitted outputs can be accounted for within a second-stage regression procedure.
