
Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology



  1. Alternatives to Difference Scores: Polynomial Regression and Response Surface Methodology. Jeffrey R. Edwards, University of North Carolina

  2. Outline • Types of Difference Scores • Questions Difference Scores Are Intended To Address • Problems With Difference Scores • An Alternative Procedure • The Matrix Approach to Testing Constraints • Analyzing Quadratic Regression Equations Using Response Surface Methodology • Moderated Polynomial Regression • Mediated Polynomial Regression • Difference Scores As Dependent Variables • Answers to Frequently Asked Questions

  3. Types of Difference Scores: Univariate • Algebraic difference: (X – Y) • Absolute difference: |X – Y| • Squared difference: (X – Y)2

  4. Types of Difference Scores: Multivariate • Sum of algebraic differences: Σ(Xi – Yi) = D1 • Sum of absolute differences: Σ|Xi – Yi| = |D| • Sum of squared differences: Σ(Xi – Yi)2 = D2 • Euclidean distance: the square root of Σ(Xi – Yi)2 • Profile correlation: the correlation between the X and Y profiles, computed across dimensions

  5. Questions Difference Scores are Intended to Address • How well do characteristics of the job fit the needs or desires of the employee? • To what extent do job demands exceed or fall short of the abilities of the person? • Are prior expectations of the employee met by actual job experiences? • What is the degree of similarity between perceptions or beliefs of supervisors and subordinates? • Do the values of the person match the culture of the organization? • Can novices provide performance evaluations that agree with expert ratings?

  6. Data Used for Running Illustration • Data were collected from 373 MBA students who were engaged in the recruiting process. • Respondents rated the actual and desired amounts of various job attributes and the anticipated satisfaction concerning a job for which they had recently interviewed. • The actual and desired measures had three items each and used 7-point response scales ranging from “none at all” to “a very great amount.” The satisfaction measure had three items and used a 7-point response scale ranging from “strongly disagree” to “strongly agree.” • The job attributes used for illustration are autonomy, prestige, span of control, and travel.

  7. Problems with Difference Scores: Reliability • When component measures are positively correlated, difference scores are often less reliable than either component. • The formula for the reliability of an algebraic difference is (Johns, 1981): r(X – Y) = [V(X)rXX + V(Y)rYY – 2C(X,Y)] / [V(X) + V(Y) – 2C(X,Y)], where rXX and rYY are the reliabilities of X and Y, V(X) and V(Y) are their variances, and C(X,Y) is their covariance.

  8. Problems with Difference Scores: Reliability • To illustrate, if X and Y have unit variances, have reliabilities of .75, and are correlated .50, the reliability of X – Y equals: (.75 + .75 – 2 × .50) / (1 + 1 – 2 × .50) = .50 / 1.00 = .50.

  9. Example: Reliability of the Algebraic Difference for Autonomy • For autonomy, the actual amount (X) and desired amount (Y) measures had reliabilities of .89 and .85, variances of 1.16 and 0.88, and a correlation of .51. Hence, the reliability of the algebraic difference (X – Y) is approximately (1.03 + 0.75 – 1.04) / (1.16 + 0.88 – 1.04) = 0.74 / 1.00 = .74. • Note that this reliability is lower than the reliabilities of X and Y.
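
Reproducing the reliability computations above is straightforward. The sketch below implements the Johns (1981) formula in Python using the values reported on the two preceding slides; the function name is illustrative and not part of the original materials.

  import math

  def diff_score_reliability(var_x, var_y, rel_x, rel_y, r_xy):
      """Reliability of the algebraic difference (X - Y), given the component
      variances, component reliabilities, and the X-Y correlation."""
      cov_xy = r_xy * math.sqrt(var_x) * math.sqrt(var_y)
      return (var_x * rel_x + var_y * rel_y - 2.0 * cov_xy) / (
          var_x + var_y - 2.0 * cov_xy)

  # Slide 8: unit variances, reliabilities of .75, correlation of .50 -> .50
  print(round(diff_score_reliability(1.0, 1.0, 0.75, 0.75, 0.50), 2))

  # Slide 9: autonomy example -> about .74, below both component reliabilities
  print(round(diff_score_reliability(1.16, 0.88, 0.89, 0.85, 0.51), 2))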

  10. Reliabilities of Other Types of Difference Scores • Reliabilities of other difference scores can be estimated using procedures for the reliabilities of squares, products, and linear combinations of variables. • For example, a squared difference can be written as a linear combination of X2, XY, and Y2: (X – Y)2 = X2 – 2XY + Y2 • The reliability of this expression can be derived by combining procedures described by Nunnally (1978) and Bohrnstedt and Marwell (1978).

  11. Reliabilities of Other Types of Difference Scores • The reliabilities of profile similarity indices such as D1, |D|, and D2 can also be derived by applying Nunnally (1978) and Bohrnstedt and Marwell (1978). • The reliabilities of the squared and product terms that constitute |X – Y|, (X – Y)2, |D|, and D2 involve the means of X and Y, which are arbitrary for measures that use interval rather than ratio scales. • Profile similarity indices usually collapse conceptually distinct dimensions, which obscures the meaning of their true scores and, thus, their reliabilities.

  12. Problems with Difference Scores: Conceptual Ambiguity • It might seem that component variables are reflected equally in a difference score, given that the components are implicitly assigned the same weight when the difference score is constructed. • However, the variance of a difference score depends on the variances and covariances of the component measures, which are sample dependent. • When one component is a constant, the variance of a difference score is solely due to the other component, i.e., the one that varies. For instance, when P-O fit is assessed in a single organization, the P-O difference solely represents variation in the person scores.

  13. Variance of an Algebraic Difference Score • The variance of an algebraic difference score can be computed using the following formula for the variance of a weighted linear combination of random variables: V(aX + bY) = a2V(X) + b2V(Y) + 2abC(X,Y) • For the algebraic difference score (X – Y), a = +1 and b = –1, which yields: V(X – Y) = V(X) + V(Y) – 2C(X,Y) • Thus, X and Y contribute equally to V(X – Y) only when V(X) and V(Y) happen to be equal.
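
The variance formula above is easy to check numerically. The sketch below simulates correlated X and Y (using, for concreteness, the autonomy variance and covariance values reported on the next slide) and compares the simulated variance of X – Y with the formula; all names are illustrative.

  import numpy as np

  rng = np.random.default_rng(0)
  cov = np.array([[1.16, 0.52],
                  [0.52, 0.88]])           # V(X), V(Y), and C(X,Y)
  x, y = rng.multivariate_normal([0.0, 0.0], cov, size=200_000).T

  a, b = 1.0, -1.0                         # weights for the algebraic difference
  simulated = np.var(a * x + b * y)
  formula = a**2 * cov[0, 0] + b**2 * cov[1, 1] + 2 * a * b * cov[0, 1]
  print(round(simulated, 2), round(formula, 2))   # both close to 1.00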

  14. Example: Variance of the Algebraic Difference for Autonomy • For autonomy, the variance of X is 1.16, and the variance of Y is 0.88. The covariance between X and Y is their correlation multiplied by the product of their standard deviations, which equals .51 × 1.08 × 0.94 = 0.52. Using these quantities, the variance of (X – Y) is: V(X – Y) = 1.16 + 0.88 – 1.04 = 1.00 • V(X – Y) depends more on V(X) than V(Y) and also incorporates C(X,Y). Thus, V(X – Y) does not reflect V(X) and V(Y) in equal proportions.

  15. Variances of Other Types of Difference Scores • Variances of difference scores involving higher-order terms, such as (X – Y)2, can be computed using rules for the variances of products of random variables (Bohrnstedt & Goldberger, 1969; Goodman, 1960). • These formulas involve the means of X and Y, which are arbitrary when X and Y are measured on interval rather than ratio scales. • Nonetheless, it is reasonable to assume that the components do not all contribute equally, particularly when the number of components becomes large.

  16. Problems with Difference Scores: Confounded Effects • Difference scores confound the effects of the components of the difference. • For example, an equation using an algebraic difference as a predictor can be written as: Z = b0 + b1(X – Y) + e • In this equation, b1 can reflect a positive relationship for X, a negative relationship for Y, or some combination thereof.

  17. Problems with Difference Scores: Confounded Effects • Some researchers have attempted to address this confound by controlling for one component of the difference. For an algebraic difference, this yields: Z = b0 + b1X + b2(X – Y) + e • However, controlling for X simply transforms the algebraic difference into a partialled measure of Y (Wall & Payne, 1973): Z = b0 + (b1 + b2)X – b2Y + e • Thus, b2 is not the effect of (X – Y), but instead is the negative of the effect of Y, controlling for X.

  18. Problems with Difference Scores: Confounded Effects • The effects of X and Y are easier to interpret if X and Y are used as separate predictors: Z = b0 + b1X + b2Y + e • The R2 from this equation is the same as that from the equation using (X – Y) and X as predictors, but its interpretation is more straightforward.
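
In practice, the comparison shown on the next three slides can be run with any regression package. Below is a minimal Python sketch using statsmodels; the data frame and its column names (sat, actual, desired) are placeholders for whatever data are at hand, not the variable names in the original data set.

  import pandas as pd
  import statsmodels.formula.api as smf

  def compare_specifications(df: pd.DataFrame) -> None:
      df = df.assign(alg_diff=df["actual"] - df["desired"])

      m_diff  = smf.ols("sat ~ alg_diff", data=df).fit()            # (X - Y) only
      m_xdiff = smf.ols("sat ~ actual + alg_diff", data=df).fit()   # X and (X - Y)
      m_xy    = smf.ols("sat ~ actual + desired", data=df).fit()    # X and Y

      # The last two models are reparameterizations of one another, so their
      # R-squared values are identical; only the coefficients differ.
      for label, m in [("(X - Y)", m_diff), ("X, (X - Y)", m_xdiff), ("X, Y", m_xy)]:
          print(f"{label:12s} R2 = {m.rsquared:.3f}")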

  19. Example: Confounded Effects for the Algebraic Difference for Autonomy • Results using (X – Y):
  Dep Var: SAT   N: 360   Multiple R: 0.339   Squared multiple R: 0.115
  Adjusted squared multiple R: 0.113   Standard error of estimate: 1.082
  Effect      Coefficient   Std Error   Std Coef   Tolerance        t   P(2 Tail)
  CONSTANT          5.937       0.061      0.0             .    97.007       0.000
  AUTALD            0.393       0.058      0.339       1.000     6.825       0.000
  Analysis of Variance
  Source        Sum-of-Squares    df   Mean-Square   F-ratio       P
  Regression            54.589     1        54.589    46.586   0.000
  Residual             419.498   358         1.172

  20. Example: Confounded Effects for the Algebraic Difference for Autonomy • Results using X and (X – Y):
  Dep Var: SAT   N: 360   Multiple R: 0.356   Squared multiple R: 0.127
  Adjusted squared multiple R: 0.122   Standard error of estimate: 1.077
  Effect      Coefficient   Std Error   Std Coef   Tolerance        t   P(2 Tail)
  CONSTANT          5.835       0.077      0.000           .    75.874       0.000
  AUTCA             0.145       0.066      0.134       0.650     2.187       0.029
  AUTALD            0.301       0.071      0.260       0.650     4.235       0.000
  Analysis of Variance
  Source        Sum-of-Squares    df   Mean-Square   F-ratio       P
  Regression            60.133     2        30.067    25.930   0.000
  Residual             413.953   357         1.160

  21. Example: Confounded Effects for the Algebraic Difference for Autonomy • Results using X and Y:
  Dep Var: SAT   N: 360   Multiple R: 0.356   Squared multiple R: 0.127
  Adjusted squared multiple R: 0.122   Standard error of estimate: 1.077
  Effect      Coefficient   Std Error   Std Coef   Tolerance        t   P(2 Tail)
  CONSTANT          5.835       0.077      0.000           .    75.874       0.000
  AUTCA             0.445       0.062      0.413       0.737     7.172       0.000
  AUTCD            -0.301       0.071     -0.244       0.737    -4.235       0.000
  Analysis of Variance
  Source        Sum-of-Squares    df   Mean-Square   F-ratio       P
  Regression            60.133     2        30.067    25.930   0.000
  Residual             413.953   357         1.160

  22. Problems with Difference Scores: Confounded Effects • Other researchers have controlled for X, Y, or (X – Y) when using |X – Y| or (X – Y)2 as predictors. For example: Z = b0 + b1(X – Y) + b2(X – Y)2 + e • Although this approach might seem to provide a conservative test for (X – Y)2, the term b1(X – Y) merely shifts the minimum of the U-shaped curve captured by (X – Y)2, moving the vertex to (X – Y) = –b1/(2b2). Specifically, if b1 is positive, the minimum of the curve is shifted to the left, and if b1 is negative, the minimum is shifted to the right.

  23. Example: Confounded Effects for the Squared Difference for Autonomy • Results using (X – Y)2:
  Dep Var: SAT   N: 360   Multiple R: 0.310   Squared multiple R: 0.096
  Adjusted squared multiple R: 0.093   Standard error of estimate: 1.094
  Effect      Coefficient   Std Error   Std Coef   Tolerance        t   P(2 Tail)
  CONSTANT          5.993       0.067      0.000           .    89.830       0.000
  AUTSQD           -0.183       0.030     -0.310       1.000    -6.162       0.000
  Analysis of Variance
  Source        Sum-of-Squares    df   Mean-Square   F-ratio       P
  Regression            45.463     1        45.463    37.972   0.000
  Residual             428.623   358         1.197

  24. Example: Confounded Effects for the Squared Difference for Autonomy • Plot of (X – Y)2:

  25. Example: Confounded Effects for the Squared Difference for Autonomy • Results using (X – Y) and (X – Y)2:
  Dep Var: SAT   N: 360   Multiple R: 0.364   Squared multiple R: 0.132
  Adjusted squared multiple R: 0.127   Standard error of estimate: 1.074
  Effect      Coefficient   Std Error   Std Coef   Tolerance        t   P(2 Tail)
  CONSTANT          6.003       0.066      0.000           .    91.641       0.000
  AUTALD            0.277       0.072      0.240       0.632     3.863       0.000
  AUTSQD           -0.097       0.037     -0.164       0.632    -2.646       0.008
  Analysis of Variance
  Source        Sum-of-Squares    df   Mean-Square   F-ratio       P
  Regression            62.660     2        31.330    27.185   0.000
  Residual             411.426   357         1.152
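
Using the estimates on the slide above, the shift described on slide 22 can be located directly: the vertex of the fitted curve lies at (X – Y) = –b1/(2b2). A quick check, which is purely arithmetic on the reported coefficients:

  b1, b2 = 0.277, -0.097          # coefficients on (X - Y) and (X - Y)^2 above
  vertex = -b1 / (2 * b2)
  print(round(vertex, 2))         # about 1.43; because b2 < 0 the vertex is a
                                  # maximum, shifted to the right of X - Y = 0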

  26. Example: Confounded Effects for the Squared Difference for Autonomy • Plot of (X – Y) and (X – Y)2:

  27. Problems with Difference Scores: Confounded Effects • Analogously, X and Y have been controlled in equations using |X – Y| as a predictor: Z = b0 + b1X + b2Y + b3|X – Y| + e • Controlling for X and Y does not provide a conservative test of |X – Y|. Rather, it alters the tilt of the V-shaped function indicated by |X – Y|. For example, if b1 = –b2 and b1 = b3, the left side is horizontal and the right side is positively sloped, and if b1 = –b2 and b2 = b3, the right side is horizontal and the left side is negatively sloped.

  28. Example: Confounded Effects for the Absolute Difference for Autonomy • Results using |X – Y|:
  Dep Var: SAT   N: 360   Multiple R: 0.323   Squared multiple R: 0.105
  Adjusted squared multiple R: 0.102   Standard error of estimate: 1.089
  Effect      Coefficient   Std Error   Std Coef   Tolerance        t   P(2 Tail)
  CONSTANT          6.212       0.087      0.000           .    71.122       0.000
  AUTABD           -0.531       0.082     -0.323       1.000    -6.464       0.000
  Analysis of Variance
  Source        Sum-of-Squares    df   Mean-Square   F-ratio       P
  Regression            49.555     1        49.555    41.788   0.000
  Residual             424.532   358         1.186

  29. Example: Confounded Effects for the Absolute Difference for Autonomy • Plot of |X – Y|:

  30. Example: Confounded Effects for the Absolute Difference for Autonomy • Results using (X – Y) and |X – Y|:
  Dep Var: SAT   N: 360   Multiple R: 0.374   Squared multiple R: 0.140
  Adjusted squared multiple R: 0.135   Standard error of estimate: 1.069
  Effect      Coefficient   Std Error   Std Coef   Tolerance        t   P(2 Tail)
  CONSTANT          6.140       0.088      0.000           .    69.983       0.000
  AUTALD            0.265       0.069      0.229       0.669     3.819       0.000
  AUTABD           -0.314       0.099     -0.191       0.669    -3.191       0.002
  Analysis of Variance
  Source        Sum-of-Squares    df   Mean-Square   F-ratio       P
  Regression            66.220     2        33.110    28.981   0.000
  Residual             407.866   357         1.142

  31. Example: Confounded Effects for the Absolute Difference for Autonomy • Plot of (X – Y) and |X – Y|:

  32. Problems with Difference Scores: Untested Constraints • Difference scores impose untested constraints on the coefficients relating X and Y to Z. • The constraints imposed by an algebraic difference can be seen with the following equations: Z = b0 + b1(X – Y) + e • Expansion yields: Z = b0 + b1X – b1Y + e

  33. Problems with Difference Scores: Untested Constraints • Now, consider an equation that uses X and Y as separate predictors: Z = b0 + b1X + b2Y + e • Comparing this equation to the previous equation shows that using (X – Y) as a predictor constrains the coefficients on X and Y to be equal in magnitude but opposite in sign (i.e., b1 = –b2, or b1 + b2 = 0). This constraint should not be imposed on the data but instead should be treated as a hypothesis to be tested.

  34. Example: Constrained and Unconstrained Algebraic Difference for Autonomy • Results using (X – Y):
  Dep Var: SAT   N: 360   Multiple R: 0.339   Squared multiple R: 0.115
  Adjusted squared multiple R: 0.113   Standard error of estimate: 1.082
  Effect      Coefficient   Std Error   Std Coef   Tolerance        t   P(2 Tail)
  CONSTANT          5.937       0.061      0.0             .    97.007       0.000
  AUTALD            0.393       0.058      0.339       1.000     6.825       0.000
  Analysis of Variance
  Source        Sum-of-Squares    df   Mean-Square   F-ratio       P
  Regression            54.589     1        54.589    46.586   0.000
  Residual             419.498   358         1.172

  35. Example: Constrained and Unconstrained Algebraic Difference for Autonomy • Results using X and Y:
  Dep Var: SAT   N: 360   Multiple R: 0.356   Squared multiple R: 0.127
  Adjusted squared multiple R: 0.122   Standard error of estimate: 1.077
  Effect      Coefficient   Std Error   Std Coef   Tolerance        t   P(2 Tail)
  CONSTANT          5.835       0.077      0.000           .    75.874       0.000
  AUTCA             0.445       0.062      0.413       0.737     7.172       0.000
  AUTCD            -0.301       0.071     -0.244       0.737    -4.235       0.000
  Analysis of Variance
  Source        Sum-of-Squares    df   Mean-Square   F-ratio       P
  Regression            60.133     2        30.067    25.930   0.000
  Residual             413.953   357         1.160

  36. Example: Constrained and Unconstrained Algebraic Difference for Autonomy • Constrained and unconstrained results:
                      X        Y       R2
  Constrained      0.39**  -0.39**   .12**
  Unconstrained    0.45**  -0.30**   .13**
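
The constraint from slide 33 can be tested directly, either as a linear restriction (b1 + b2 = 0) in the unconstrained model or as a nested-model comparison of the two equations. A sketch using statsmodels; the column names are placeholders and the constraint string follows patsy's linear-constraint syntax.

  import statsmodels.formula.api as smf
  from statsmodels.stats.anova import anova_lm

  def test_algebraic_difference_constraint(df):
      constrained   = smf.ols("sat ~ I(actual - desired)", data=df).fit()
      unconstrained = smf.ols("sat ~ actual + desired", data=df).fit()

      # F test of the single constraint b1 + b2 = 0 implied by (X - Y)
      print(unconstrained.f_test("actual + desired = 0"))

      # Equivalent nested-model comparison of the constrained and
      # unconstrained R-squared values
      print(anova_lm(constrained, unconstrained))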

  37. Problems with Difference Scores: Untested Constraints • The constraints imposed by an absolute difference can be seen using a piecewise linear equation: Z = b0 + b1(1 – 2W)(X – Y) + e • When (X – Y) is positive or zero, W = 0, and the term (1 – 2W)(X – Y) becomes (X – Y). When (X – Y) is negative, W = 1, and (1 – 2W)(X – Y) equals –(X – Y). Thus, W switches the sign on (X – Y) only when it is negative, producing an absolute value transformation. • Expanding the equation yields: Z = b0 + b1X – b1Y – 2b1WX + 2b1WY + e

  38. Problems with Difference Scores: Untested Constraints • Now, consider a piecewise equation using X and Y: Z = b0 + b1X + b2Y + b3W + b4WX + b5WY + e • Comparing this equation to the previous equation shows that |X – Y| imposes four constraints: • b1 = –b2, or b1 + b2 = 0 • b4 = –b5, or b4 + b5 = 0 • b3 = 0 • b4 = –2b1, or 2b1 + b4 = 0 • These constraints should be treated as hypotheses to be tested empirically.
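
The piecewise variables and the joint test of the four constraints can be set up as follows. This is a sketch with placeholder column names (sat, actual, desired); the constraint strings follow patsy's linear-constraint syntax.

  import statsmodels.formula.api as smf

  def test_absolute_difference_constraints(df):
      d = df["actual"] - df["desired"]
      df = df.assign(
          w=(d < 0).astype(float),              # W = 1 when (X - Y) is negative
          wx=lambda t: t["w"] * t["actual"],    # W * X
          wy=lambda t: t["w"] * t["desired"],   # W * Y
      )

      unconstrained = smf.ols("sat ~ actual + desired + w + wx + wy", data=df).fit()

      # Joint F test of the four constraints implied by |X - Y|
      hypotheses = "actual + desired = 0, wx + wy = 0, w = 0, 2 * actual + wx = 0"
      print(unconstrained.f_test(hypotheses))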

  39. Example: Constrained and Unconstrained Absolute Difference for Autonomy • Results using |X – Y|:
  Dep Var: SAT   N: 360   Multiple R: 0.323   Squared multiple R: 0.105
  Adjusted squared multiple R: 0.102   Standard error of estimate: 1.089
  Effect      Coefficient   Std Error   Std Coef   Tolerance        t   P(2 Tail)
  CONSTANT          6.212       0.087      0.000           .    71.122       0.000
  AUTABD           -0.531       0.082     -0.323       1.000    -6.464       0.000
  Analysis of Variance
  Source        Sum-of-Squares    df   Mean-Square   F-ratio       P
  Regression            49.555     1        49.555    41.788   0.000
  Residual             424.532   358         1.186

  40. Example: Constrained and Unconstrained Absolute Difference for Autonomy • Results using X, Y, W, WX, and WY:
  Dep Var: SAT   N: 360   Multiple R: 0.399   Squared multiple R: 0.159
  Adjusted squared multiple R: 0.147   Standard error of estimate: 1.061
  Effect      Coefficient   Std Error   Std Coef   Tolerance        t   P(2 Tail)
  CONSTANT          6.233       0.152      0.000           .    41.136       0.000
  AUTCA            -0.150       0.184     -0.139       0.082    -0.818       0.414
  AUTCD             0.183       0.188      0.148       0.102     0.970       0.333
  AUTW             -0.349       0.201     -0.148       0.329    -1.737       0.083
  AUTCAW            0.752       0.209      0.490       0.129     3.605       0.000
  AUTCDW           -0.554       0.219     -0.406       0.093    -2.537       0.012
  Analysis of Variance
  Source        Sum-of-Squares    df   Mean-Square   F-ratio       P
  Regression            75.381     5        15.076    13.386   0.000
  Residual             398.705   354         1.126

  41. Example: Constrained and Unconstrained Absolute Difference for Autonomy • Constrained and unconstrained results:
                      X        Y        W       WX       WY       R2
  Constrained     -0.53**   0.53**   0.00    1.06**  -1.06**    .11**
  Unconstrained   -0.15     0.18    -0.35    0.75**  -0.55**    .16**

  42. Problems with Difference Scores: Untested Constraints • The constraints imposed by a squared difference can be seen with the following equations: Z = b0 + b1(X – Y)2 + e • Expansion yields: Z = b0 + b1X2 – 2b1XY + b1Y2 + e • Thus, a squared difference implicitly treats Z as a function of X2, XY, and Y2.

  43. Problems with Difference Scores: Untested Constraints • Now, consider a quadratic equation using X and Y: Z = b0 + b1X + b2Y + b3X2 + b4XY + b5Y2 + e • Comparing this equation to the previous equation shows that (X – Y)2 imposes four constraints: • b1 = 0 • b2 = 0 • b3 = b5, or b3 – b5 = 0 • b3 + b4 + b5 = 0 • Again, these constraints should be treated as hypotheses to be tested empirically, not simply imposed on the data.
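
The same logic gives the quadratic (polynomial) regression used on the next slides: regress Z on X, Y, X2, XY, and Y2, then jointly test the four constraints implied by (X – Y)2. A sketch with placeholder column names, building the higher-order terms explicitly so the constraint strings can refer to them by name:

  import statsmodels.formula.api as smf

  def test_squared_difference_constraints(df):
      df = df.assign(
          x2=df["actual"] ** 2,               # X^2
          xy=df["actual"] * df["desired"],    # X * Y
          y2=df["desired"] ** 2,              # Y^2
      )

      quadratic = smf.ols("sat ~ actual + desired + x2 + xy + y2", data=df).fit()

      # Joint F test of the four constraints implied by (X - Y)^2
      hypotheses = "actual = 0, desired = 0, x2 - y2 = 0, x2 + xy + y2 = 0"
      print(quadratic.f_test(hypotheses))
      print(quadratic.params)                 # coefficients for the response surface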

  44. Example: Constrained and Unconstrained Squared Difference for Autonomy • Results using (X – Y)2:
  Dep Var: SAT   N: 360   Multiple R: 0.310   Squared multiple R: 0.096
  Adjusted squared multiple R: 0.093   Standard error of estimate: 1.094
  Effect      Coefficient   Std Error   Std Coef   Tolerance        t   P(2 Tail)
  CONSTANT          5.993       0.067      0.000           .    89.830       0.000
  AUTSQD           -0.183       0.030     -0.310       1.000    -6.162       0.000
  Analysis of Variance
  Source        Sum-of-Squares    df   Mean-Square   F-ratio       P
  Regression            45.463     1        45.463    37.972   0.000
  Residual             428.623   358         1.197

  45. Example: Constrained and Unconstrained Squared Difference for Autonomy • Results using X, Y, X2, XY, and Y2:
  Dep Var: SAT   N: 360   Multiple R: 0.411   Squared multiple R: 0.169
  Adjusted squared multiple R: 0.157   Standard error of estimate: 1.055
  Effect      Coefficient   Std Error   Std Coef   Tolerance        t   P(2 Tail)
  CONSTANT          5.825       0.083      0.000           .    70.161       0.000
  AUTCA             0.197       0.100      0.182       0.273     1.966       0.050
  AUTCD            -0.293       0.106     -0.238       0.315    -2.754       0.006
  AUTCA2           -0.056       0.047     -0.086       0.444    -1.177       0.240
  AUTCAD            0.276       0.080      0.396       0.178     3.453       0.001
  AUTCD2           -0.035       0.063     -0.054       0.242    -0.553       0.581
  Analysis of Variance
  Source        Sum-of-Squares    df   Mean-Square   F-ratio       P
  Regression            79.951     5        15.990    14.362   0.000
  Residual             394.135   354         1.113

  46. Example: Constrained and Unconstrained Squared Difference for Autonomy • Constrained and unconstrained results:
                      X        Y        X2      XY       Y2       R2
  Constrained      0.00     0.00    -0.18**  0.36**  -0.18**    .10**
  Unconstrained    0.20*   -0.29**  -0.06    0.28**  -0.04      .17**

  47. Problems with Difference Scores: Dimensional Reduction • Difference scores reduce the three-dimensional relationship of X and Y with Z to two dimensions. • The linear algebraic difference function represents a symmetric plane with equal but opposite slopes with respect to the X-axis and Y-axis. • The U-shaped squared difference function represents a symmetric U-shaped surface with its minimum (or maximum) running along the X = Y line. • The V-shaped absolute difference function represents a symmetric V-shaped surface with its minimum (or maximum) running along the X = Y line.

  48. Two-Dimensional Algebraic Difference Function

  49. Three-Dimensional Algebraic Difference Function

  50. Two-Dimensional Absolute Difference Function
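
The figures from the last three slides are not reproduced here, but surfaces of this kind are easy to draw. A sketch using matplotlib's 3D toolkit that plots constrained algebraic, absolute, and squared difference functions over a grid of X and Y values; the unit coefficients and negative signs are arbitrary choices so that the absolute and squared surfaces peak along the X = Y line, as in the deck's satisfaction examples.

  import numpy as np
  import matplotlib.pyplot as plt
  from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 (registers the 3d projection)

  grid = np.linspace(-3, 3, 60)
  X, Y = np.meshgrid(grid, grid)

  surfaces = {
      "Algebraic difference: Z = X - Y": X - Y,
      "Absolute difference: Z = -|X - Y|": -np.abs(X - Y),
      "Squared difference: Z = -(X - Y)^2": -(X - Y) ** 2,
  }

  fig = plt.figure(figsize=(12, 4))
  for i, (title, Z) in enumerate(surfaces.items(), start=1):
      ax = fig.add_subplot(1, 3, i, projection="3d")
      ax.plot_surface(X, Y, Z, cmap="viridis")
      ax.set_xlabel("X")
      ax.set_ylabel("Y")
      ax.set_title(title, fontsize=9)
  plt.tight_layout()
  plt.show()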
