General Linear Models -- #1

Presentation Transcript


  1. General Linear Models -- #1
  • things to remember
  • b weight interpretations
  • 1 quantitative predictor
  • 1 2-group predictor
  • 1 k-group predictor
  • 1 quantitative & a 2-group predictor
  • 1 quantitative & a k-group predictor
  • 2 quantitative predictors
  • 2x2 – main effects
  • 2x2 with interactions
  • 2x3 – main effects
  • 2x3 with interactions

  2. A few important things to remember…
  • we plot and interpret the model of the data, not the data
  • if the model fits the data poorly, then we’re carefully describing and interpreting nonsense
  • the interpretation of regression weights in a main effects model (without interactions) is different than in a model including interactions
  • regression weights reflect “main effects” in a main effects model
  • regression weights reflect “simple effects” in a model including interactions

  3. b weight interpretations
  • Constant: the expected value of y when the value of all predictors = 0
  • Centered quantitative variable: the direction and extent of the expected change in the value of y for a 1-unit increase in that predictor, holding the value of all other predictors constant at 0
  • Dummy-coded binary variable: the direction and extent of the expected mean difference of the Target group from the Comparison group, holding the value of all other predictors constant
  • Dummy-coded k-group variable: the direction and extent of the expected mean difference of the Target group for that dummy code from the Comparison group, holding the value of all other predictors constant

  4. b weight interpretations
  • Interaction between quantitative variables: the direction and extent of the expected change in the slope of the linear relationship between y and one predictor for each 1-unit change in the other predictor, holding the value of all other predictors constant at 0
  • Interaction between a quantitative & a dummy-coded binary variable: the direction and extent of the expected change in the slope of the linear relationship between y and the quantitative variable of the Target group from the slope of the Comparison group, holding the value of all other predictors constant at 0
  • Interaction between a quantitative & a dummy-coded k-group variable: the direction and extent of the expected change in the slope of the linear relationship between y and the quantitative variable of the Target group for that dummy code from the slope of the Comparison group, holding the value of all other predictors constant at 0
  • Interaction between dummy-coded variables: the direction and extent of the expected change in the mean difference between the IVx Target & Comparison groups of the IVz Target group from the mean difference between the IVx Target & Comparison groups of the IVz Comparison group, holding the value of all other predictors constant at 0
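
To make the coding behind these interpretations concrete, here is a minimal NumPy sketch (invented scores, hypothetical variable names, not from the slides) of how a centered quantitative predictor, dummy-coded binary and k-group predictors, and their interaction products are built before fitting any of the models that follow:

```python
# A minimal NumPy sketch (invented scores, hypothetical names) of the coding
# behind the b weight interpretations above.
import numpy as np

x    = np.array([3., 7., 5., 9., 4., 8.])                 # quantitative predictor
grp2 = np.array(['Cx', 'Tx', 'Cx', 'Tx', 'Cx', 'Tx'])     # 2-group predictor
grp3 = np.array(['Cz', 'Tz1', 'Tz2', 'Cz', 'Tz1', 'Tz2']) # k-group (3-group) predictor

x_cen = x - x.mean()                  # centered quantitative variable (0 = the X mean)
d_tx  = (grp2 == 'Tx').astype(float)  # dummy code: Target (Tx) = 1, Comparison (Cx) = 0
d_tz1 = (grp3 == 'Tz1').astype(float) # k-group dummy #1: Tz1 vs. Cz
d_tz2 = (grp3 == 'Tz2').astype(float) # k-group dummy #2: Tz2 vs. Cz

# interaction terms are simply products of the coded predictors
x_by_tx   = x_cen * d_tx              # quantitative x binary
x_by_tz1  = x_cen * d_tz1             # quantitative x k-group (one per dummy code)
tx_by_tz1 = d_tx * d_tz1              # dummy x dummy

# every design matrix starts with a constant column, so b0 is the expected y
# when all coded predictors equal 0
X = np.column_stack([np.ones_like(x), x_cen, d_tx, d_tz1, d_tz2,
                     x_by_tx, x_by_tz1, tx_by_tz1])
print(X)
```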

  5. Single quantitative predictor (X) → bivariate regression
  • X = X – Xmean
  • y’ = b0 + b1X
  • b0 = ht (height) of the line
  • b1 = slp (slope) of the line
  [Figure: the regression line of y on centered X, with b0 and b1 marked]
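
A small sketch of this model, assuming invented scores rather than the data behind the slide: center X, fit by ordinary least squares with NumPy, and read b0 and b1 as described above.

```python
# Bivariate regression y' = b0 + b1*Xcen on invented data.
import numpy as np

y = np.array([12., 19., 26., 30., 43., 50.])
x = np.array([ 2.,  4.,  6.,  8., 10., 12.])

x_cen = x - x.mean()                               # center X at its mean
X = np.column_stack([np.ones_like(x_cen), x_cen])  # [constant, Xcen]

b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"b0 (ht of the line at Xcen = 0, i.e., at the X mean): {b0:.2f}")
print(f"b1 (slp: change in y per 1-unit increase in X):       {b1:.2f}")
```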

  6. 2-group predictor (Tx, Cx) → 2-grp ANOVA
  • X: Tx = 1, Cx = 0 (X = Tx vs. Cx)
  • y’ = b0 + b1X
  • b0 = ht of Cx
  • b1 = htdif (height difference) of Cx & Tx
  [Figure: Cx and Tx group means, with b0 and b1 marked]
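
A comparable sketch for the dummy-coded 2-group model, again with invented scores; the fitted b weights reproduce the Cx mean and the Tx – Cx difference.

```python
# Dummy-coded 2-group model y' = b0 + b1*X (Tx = 1, Cx = 0), invented data.
import numpy as np

y   = np.array([20., 24., 22., 35., 39., 37.])
grp = np.array(['Cx', 'Cx', 'Cx', 'Tx', 'Tx', 'Tx'])

x = (grp == 'Tx').astype(float)                # X = Tx vs. Cx
X = np.column_stack([np.ones_like(x), x])

b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"b0 = ht of Cx (the Cx mean):         {b0:.2f}")
print(f"b1 = htdif of Cx & Tx (Tx - Cx):     {b1:.2f}")
```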

  7. 3-group predictor (Tx1, Tx2, Cx) → k-grp ANOVA
  • X1: Tx1 = 1, Tx2 = 0, Cx = 0 (X1 = Tx1 vs. Cx)
  • X2: Tx1 = 0, Tx2 = 1, Cx = 0 (X2 = Tx2 vs. Cx)
  • y’ = b0 + b1X1 + b2X2
  • b0 = ht of Cx
  • b1 = htdif of Cx & Tx1
  • b2 = htdif of Cx & Tx2
  [Figure: Cx, Tx1, and Tx2 group means, with b0, b1, and b2 marked]
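
The same idea extends to three groups with two dummy codes; the sketch below uses invented scores with Cx as the comparison group.

```python
# Two dummy codes for a 3-group predictor: y' = b0 + b1*X1 + b2*X2.
import numpy as np

y   = np.array([20., 22., 21., 33., 35., 34., 45., 47., 46.])
grp = np.array(['Cx'] * 3 + ['Tx1'] * 3 + ['Tx2'] * 3)

x1 = (grp == 'Tx1').astype(float)              # X1 = Tx1 vs. Cx
x2 = (grp == 'Tx2').astype(float)              # X2 = Tx2 vs. Cx
X  = np.column_stack([np.ones(len(y)), x1, x2])

b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"b0 = ht of Cx:           {b0:.2f}")
print(f"b1 = htdif of Cx & Tx1:  {b1:.2f}")
print(f"b2 = htdif of Cx & Tx2:  {b2:.2f}")
```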

  8. quantitative (X) & 2-group (Tz, Cz) predictors → 2-grp ANCOVA
  • X = X – Xmean
  • Z: Tz = 1, Cz = 0 (Z = Tz vs. Cz)
  • y’ = b0 + b1X + b2Z
  • b0 = ht of Cz line
  • b1 = slp of Cz line
  • b2 = htdif of Cz & Tz
  • Z-lines all have the same slp (no interaction)
  [Figure: parallel Cz and Tz regression lines on centered X, with b0, b1, and b2 marked]
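
A sketch of the 2-group ANCOVA with invented scores: a centered covariate plus one dummy code, and no product term, so both group lines share a slope.

```python
# 2-group ANCOVA y' = b0 + b1*Xcen + b2*Z (Tz = 1, Cz = 0), invented data.
import numpy as np

y   = np.array([18., 24., 30., 36., 30., 36., 42., 48.])
x   = np.array([ 2.,  4.,  6.,  8.,  2.,  4.,  6.,  8.])
grp = np.array(['Cz'] * 4 + ['Tz'] * 4)

x_cen = x - x.mean()
z     = (grp == 'Tz').astype(float)            # Z = Tz vs. Cz
X = np.column_stack([np.ones(len(y)), x_cen, z])

b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"b0 = ht of the Cz line at Xcen = 0:  {b0:.2f}")
print(f"b1 = shared slp of the Z-lines:      {b1:.2f}")
print(f"b2 = htdif of Cz & Tz:               {b2:.2f}")
```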

  9. quantitative (X) & 2-group (Tz, Cz) predictors w/ interaction
  • X = X – Xmean
  • Z: Tz = 1, Cz = 0 (Z = Tz vs. Cz)
  • XZ = Xcen * Z
  • y’ = b0 + b1X + b2Z + b3XZ
  • b0 = ht of Cz line
  • b1 = slp of Cz line
  • b2 = htdif of Cz & Tz
  • b3 = slpdif (slope difference) of Cz & Tz
  [Figure: nonparallel Cz and Tz regression lines on centered X, with b0, b1, b2, and b3 marked]
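
Adding the product XZ = Xcen * Z gives each group its own slope; in this invented-data sketch, b1 and b2 become simple effects for the Cz (comparison) group.

```python
# ANCOVA with interaction: y' = b0 + b1*Xcen + b2*Z + b3*XZ, invented data.
import numpy as np

y   = np.array([20., 22., 24., 26., 25., 32., 39., 46.])
x   = np.array([ 2.,  4.,  6.,  8.,  2.,  4.,  6.,  8.])
grp = np.array(['Cz'] * 4 + ['Tz'] * 4)

x_cen = x - x.mean()
z     = (grp == 'Tz').astype(float)
xz    = x_cen * z                              # interaction term
X = np.column_stack([np.ones(len(y)), x_cen, z, xz])

b0, b1, b2, b3 = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"b1 = slp of the Cz line:              {b1:.2f}")
print(f"b2 = htdif of Cz & Tz at Xcen = 0:    {b2:.2f}")
print(f"b3 = slpdif of Cz & Tz:               {b3:.2f}")
```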

  10. quantitative (X) & 3-group (Tz1, Tz2, Cz) predictors → 3-grp ANCOVA
  • X = X – Xmean
  • Z1: Tz1 = 1, Tz2 = 0, Cz = 0 (Z1 = Tz1 vs. Cz)
  • Z2: Tz1 = 0, Tz2 = 1, Cz = 0 (Z2 = Tz2 vs. Cz)
  • y’ = b0 + b1X + b2Z1 + b3Z2
  • b0 = ht of Cz line
  • b1 = slp of Cz line
  • b2 = htdif of Cz & Tz1
  • b3 = htdif of Cz & Tz2
  • Z-lines all have the same slp (no interaction)
  [Figure: parallel Cz, Tz1, and Tz2 regression lines on centered X, with b0–b3 marked]
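
The 3-group ANCOVA just combines the two earlier pieces, two Z dummy codes plus the centered covariate; a compact sketch with invented scores:

```python
# 3-group ANCOVA y' = b0 + b1*Xcen + b2*Z1 + b3*Z2, no product terms.
import numpy as np

y   = np.array([18., 24., 30., 28., 34., 40., 38., 44., 50.])
x   = np.array([ 2.,  4.,  6.,  2.,  4.,  6.,  2.,  4.,  6.])
grp = np.array(['Cz'] * 3 + ['Tz1'] * 3 + ['Tz2'] * 3)

x_cen = x - x.mean()
z1 = (grp == 'Tz1').astype(float)              # Z1 = Tz1 vs. Cz
z2 = (grp == 'Tz2').astype(float)              # Z2 = Tz2 vs. Cz
X = np.column_stack([np.ones(len(y)), x_cen, z1, z2])

b0, b1, b2, b3 = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"b1 = shared slp: {b1:.2f},  b2 = Tz1 - Cz: {b2:.2f},  b3 = Tz2 - Cz: {b3:.2f}")
```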

  11. Models with quantitative (X) & 3-group (Tz1, Tz2, Cz) predictors w/ interaction
  • X = X – Xmean
  • Z1: Tz1 = 1, Tz2 = 0, Cz = 0 (Z1 = Tz1 vs. Cz)
  • Z2: Tz1 = 0, Tz2 = 1, Cz = 0 (Z2 = Tz2 vs. Cz)
  • XZ1 = Xcen * Z1
  • XZ2 = Xcen * Z2
  • y’ = b0 + b1Xcen + b2Z1 + b3Z2 + b4XZ1 + b5XZ2
  • b0 = ht of Cz line
  • b1 = slp of Cz line
  • b2 = htdif of Cz & Tz1
  • b3 = htdif of Cz & Tz2
  • b4 = slpdif of Cz & Tz1
  • b5 = slpdif of Cz & Tz2
  [Figure: nonparallel Cz, Tz1, and Tz2 regression lines on centered X, with b0–b5 marked]
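
With the two product terms added, Tz1 and Tz2 can differ from Cz in slope as well as height; a compact invented-data sketch:

```python
# Full model y' = b0 + b1*Xcen + b2*Z1 + b3*Z2 + b4*XZ1 + b5*XZ2.
import numpy as np

y   = np.array([18., 20., 22., 24., 30., 36., 30., 40., 50.])
x   = np.array([ 2.,  4.,  6.,  2.,  4.,  6.,  2.,  4.,  6.])
grp = np.array(['Cz'] * 3 + ['Tz1'] * 3 + ['Tz2'] * 3)

x_cen = x - x.mean()
z1, z2   = (grp == 'Tz1').astype(float), (grp == 'Tz2').astype(float)
xz1, xz2 = x_cen * z1, x_cen * z2              # one product per dummy code
X = np.column_stack([np.ones(len(y)), x_cen, z1, z2, xz1, xz2])

b = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"b1 = slp of the Cz line: {b[1]:.2f}")
print(f"b4 = slpdif of Cz & Tz1: {b[4]:.2f},  b5 = slpdif of Cz & Tz2: {b[5]:.2f}")
```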

  12. 2 quantitative predictors → multiple regression
  • X = X – Xmean
  • Z = Z – Zmean
  • y’ = b0 + b1X + b2Z
  • b0 = ht of Zmean line
  • b1 = slp of Zmean line
  • b2 = htdifs among Z-lines
  • Z-lines all have the same slp (no interaction)
  [Figure: parallel regression lines of y on centered X plotted at –1 std, mean, and +1 std of Z, with b0, b1, and b2 marked]
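
A sketch of the two-predictor regression with both predictors mean-centered; the data are simulated with arbitrary true values, so the printed b weights only illustrate the interpretations above.

```python
# Two centered quantitative predictors: y' = b0 + b1*Xcen + b2*Zcen.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 20, 60)
z = rng.uniform(0, 10, 60)
y = 30 + 1.5 * (x - x.mean()) + 2.0 * (z - z.mean()) + rng.normal(0, 2, 60)

x_cen, z_cen = x - x.mean(), z - z.mean()
X = np.column_stack([np.ones(60), x_cen, z_cen])

b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"b0 = ht of the Zmean line at Xcen = 0:      {b0:.2f}")
print(f"b1 = slp of the Zmean line:                 {b1:.2f}")
print(f"b2 = htdif among Z-lines per 1-unit of Z:   {b2:.2f}")
```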

  13. 2 quantitative predictors w/ interaction
  • Xcen = X – Xmean
  • Zcen = Z – Zmean
  • XZ = Xcen * Zcen
  • y’ = b0 + b1Xcen + b2Zcen + b3XZ
  • b0 = ht of Zmean line
  • b1 = slp of Zmean line
  • b2 = htdifs among Z-lines
  • b3 = slpdifs among Z-lines
  [Figure: nonparallel regression lines of y on centered X plotted at –1 std, mean, and +1 std of Z, with b0–b3 marked]
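
The same simulation with the product Xcen * Zcen added, so the slope of y on X changes across Z-lines:

```python
# Quantitative-by-quantitative interaction: y' = b0 + b1*Xcen + b2*Zcen + b3*XZ.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 20, 60)
z = rng.uniform(0, 10, 60)
x_cen, z_cen = x - x.mean(), z - z.mean()
y = 30 + 1.5 * x_cen + 2.0 * z_cen + 0.4 * x_cen * z_cen + rng.normal(0, 2, 60)

xz = x_cen * z_cen                             # interaction term
X = np.column_stack([np.ones(60), x_cen, z_cen, xz])

b0, b1, b2, b3 = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"b1 = slp of the Zmean line:                  {b1:.2f}")
print(f"b3 = slpdif among Z-lines per 1-unit of Z:   {b3:.2f}")
```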

  14. 2-group (Tx, Cx) & 2-group (Tz, Cz) predictors: Main Effects Model
  • X: Tx = 1, Cx = 0 (X = Tx vs. Cx)
  • Z: Tz = 1, Cz = 0 (Z = Tz vs. Cz)
  • y’ = b0 + b1X + b2Z
  • b0 = mean of CxCz
  • b1 = htdif of CxCz & TxCz
  • b2 = htdif of CxCz & CxTz
  • with no interaction term, these also equal the simple effects
  [Figure: 2x2 design (cells CxCz, TxCz, CxTz, TxTz); cell means plotted against X with parallel Z-lines, b0, b1, and b2 marked]
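
A sketch of the 2x2 main-effects model with invented cell data (two scores per cell and no interaction in the data, so the model reproduces the cell means):

```python
# 2x2 main-effects model y' = b0 + b1*X + b2*Z with dummy codes for both factors.
import numpy as np

y  = np.array([20., 22., 30., 32., 35., 37., 45., 47.])
gx = np.array(['Cx', 'Cx', 'Tx', 'Tx', 'Cx', 'Cx', 'Tx', 'Tx'])
gz = np.array(['Cz', 'Cz', 'Cz', 'Cz', 'Tz', 'Tz', 'Tz', 'Tz'])

x = (gx == 'Tx').astype(float)                 # X = Tx vs. Cx
z = (gz == 'Tz').astype(float)                 # Z = Tz vs. Cz
X = np.column_stack([np.ones(len(y)), x, z])

b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"b0 = mean of CxCz (as modeled):  {b0:.2f}")
print(f"b1 = htdif of CxCz & TxCz:       {b1:.2f}")
print(f"b2 = htdif of CxCz & CxTz:       {b2:.2f}")
```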

  15. Models with 2-group (Tx, Cx) & 2-group (Tz, Cz) predictors → 2x2 ANOVA
  • X: Tx = 1, Cx = 0 (X = Tx vs. Cx)
  • Z: Tz = 1, Cz = 0 (Z = Tz vs. Cz)
  • XZ = X * Z
  • y’ = b0 + b1X + b2Z + b3XZ
  • b0 = mean of CxCz
  • b1 = htdif of CxCz & TxCz
  • b2 = htdif of CxCz & CxTz
  • b3 = dif of the htdifs: (CxCz – TxCz) vs. (CxTz – TxTz)
  [Figure: cell means for CxCz, TxCz, CxTz, and TxTz plotted against X with nonparallel Z-lines, b0–b3 marked]
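
Adding the XZ product turns this into the full 2x2 model; in this invented-data sketch, b3 is the difference of the Tx – Cx differences across the two Z levels.

```python
# Full 2x2 model y' = b0 + b1*X + b2*Z + b3*XZ with an interaction in the data.
import numpy as np

y  = np.array([20., 22., 30., 32., 35., 37., 55., 57.])
gx = np.array(['Cx', 'Cx', 'Tx', 'Tx', 'Cx', 'Cx', 'Tx', 'Tx'])
gz = np.array(['Cz', 'Cz', 'Cz', 'Cz', 'Tz', 'Tz', 'Tz', 'Tz'])

x  = (gx == 'Tx').astype(float)
z  = (gz == 'Tz').astype(float)
xz = x * z                                     # XZ = X * Z
X  = np.column_stack([np.ones(len(y)), x, z, xz])

b0, b1, b2, b3 = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"b1 = htdif of CxCz & TxCz (Tx effect at Cz):        {b1:.2f}")
print(f"b3 = dif of the htdifs (Tx effect at Tz vs. at Cz): {b3:.2f}")
```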

  16. Models with 2-group (Tx, Cx) & 3-group (Tz1, Tz2, Cz) predictors → ME model
  • X: Tx = 1, Cx = 0 (X = Tx vs. Cx)
  • Z1: Tz1 = 1, Tz2 = 0, Cz = 0 (Z1 = Tz1 vs. Cz)
  • Z2: Tz1 = 0, Tz2 = 1, Cz = 0 (Z2 = Tz2 vs. Cz)
  • y’ = b0 + b1X + b2Z1 + b3Z2
  • b0 = mean of CxCz
  • b1 = htdif of CxCz & TxCz
  • b2 = htdif of CxCz & CxTz1
  • b3 = htdif of CxCz & CxTz2
  • with no interaction terms, these also equal the simple effects
  [Figure: 2x3 design (cells CxCz, CxTz1, CxTz2, TxCz, TxTz1, TxTz2); cell means plotted against X with parallel Z-lines, b0–b3 marked]
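
The 2x3 main-effects model simply uses one dummy for the 2-group factor and two for the 3-group factor; a compact sketch with one invented score per cell:

```python
# 2x3 main-effects model y' = b0 + b1*X + b2*Z1 + b3*Z2.
import numpy as np

y  = np.array([20., 30., 40., 35., 45., 55.])
gx = np.array(['Cx', 'Cx', 'Cx', 'Tx', 'Tx', 'Tx'])
gz = np.array(['Cz', 'Tz1', 'Tz2', 'Cz', 'Tz1', 'Tz2'])

x  = (gx == 'Tx').astype(float)                # X = Tx vs. Cx
z1 = (gz == 'Tz1').astype(float)               # Z1 = Tz1 vs. Cz
z2 = (gz == 'Tz2').astype(float)               # Z2 = Tz2 vs. Cz
X  = np.column_stack([np.ones(len(y)), x, z1, z2])

b0, b1, b2, b3 = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"b0 = mean of CxCz: {b0:.2f},  b1 = TxCz - CxCz: {b1:.2f}")
print(f"b2 = CxTz1 - CxCz: {b2:.2f},  b3 = CxTz2 - CxCz: {b3:.2f}")
```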

  17. Models with 2-group (Tx, Cx) & 3-group (Tz1, Tz2, Cz) predictors → 2x3 ANOVA
  • X: Tx = 1, Cx = 0 (X = Tx vs. Cx)
  • Z1: Tz1 = 1, Tz2 = 0, Cz = 0 (Z1 = Tz1 vs. Cz)
  • Z2: Tz1 = 0, Tz2 = 1, Cz = 0 (Z2 = Tz2 vs. Cz)
  • XZ1 = X * Z1
  • XZ2 = X * Z2
  • y’ = b0 + b1X + b2Z1 + b3Z2 + b4XZ1 + b5XZ2
  • b0 = mean of CxCz
  • b1 = htdif of CxCz & TxCz
  • b2 = htdif of CxCz & CxTz1
  • b3 = htdif of CxCz & CxTz2
  • b4 = dif of the htdifs: (CxCz – TxCz) vs. (CxTz1 – TxTz1)
  • b5 = dif of the htdifs: (CxCz – TxCz) vs. (CxTz2 – TxTz2)
  [Figure: cell means for the 2x3 design plotted against X with nonparallel Z-lines, b0–b5 marked]
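
The full 2x3 model adds the two products; with one invented score per cell, the fit reproduces the cell means exactly, so the printed b weights match the definitions above.

```python
# Full 2x3 model y' = b0 + b1*X + b2*Z1 + b3*Z2 + b4*XZ1 + b5*XZ2.
import numpy as np

y  = np.array([20., 30., 40., 35., 55., 70.])
gx = np.array(['Cx', 'Cx', 'Cx', 'Tx', 'Tx', 'Tx'])
gz = np.array(['Cz', 'Tz1', 'Tz2', 'Cz', 'Tz1', 'Tz2'])

x  = (gx == 'Tx').astype(float)
z1 = (gz == 'Tz1').astype(float)
z2 = (gz == 'Tz2').astype(float)
X  = np.column_stack([np.ones(len(y)), x, z1, z2, x * z1, x * z2])

b = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"b1 = TxCz - CxCz (Tx effect at Cz):                    {b[1]:.2f}")
print(f"b4 = dif of htdifs for Tz1: {b[4]:.2f},  b5 = dif of htdifs for Tz2: {b[5]:.2f}")
```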
