
Using Differential Item Functioning Analyses to Enhance the Curriculum

Dr Juho Looveer. ACSPRII, Sydney, December 2006.



Presentation Transcript


  1. Using Differential Item Functioning Analyses to Enhance the Curriculum Dr Juho Looveer ACSPRII Sydney December 2006

  2. Using Modern Psychometric Theory to Identify Differential Item Functioning in Polytomously Scored Constructed Response Items; Linking Results from Differential Item Functioning Analyses to the Curriculum

  3. BIAS • Where one group has an unfair advantage over another “Educational or psychological tests are biased if the test scores of equally able test takers are systematically different between racial, ethnic, cultural, and other similar sub-groups.” (Kelderman, 1989, p. 681) “When a test item unfairly favours one group of students compared to another, the item is biased.” (Gierl, Rogers and Klinger, 1999, p. 2)

  4. Impact • Where one group performs differently from another group “a between-group difference in test performance caused by group ability differences on the valid skill (e.g., the differences between the proportion correct for two groups of interest on a valid item)” (Ackerman, 1994, p. 109)

  5. Differential Functioning - Differential Item Functioning (DIF) “When persons from one group answer an item correctly more often than equally knowledgeable persons from another group, the item exhibits DIF.” (Ackerman, 1994, p. 142) “DIF refers to differences in item functioning after groups have been matched with respect to the ability or attribute that the item purportedly measures.” (Dorans and Holland, 1993, p. 37)

  6. Previous Methodology (1) • Test Level • Comparing Group Means • Meta Analyses • Item Level • Correlations • ANOVA • Factor Analyses • Other multivariate techniques • Most studies were based on unmatched samples

  7. Previous Methodology (2) • Item Level with matched samples • Transformed Item Difficulty Index (TID-DIF) - Angoff (1972; 1982) • Contingency Table methods - eg Standardisation Method (Dorans & Kulick, 1986) • Chi-Square methods - eg Mantel-Haenszel (Holland & Thayer, 1988) • Logistic Regression
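The contingency-table and chi-square approaches on this slide share one core idea: stratify examinees by matched ability (usually total score), build a 2×2 group-by-correctness table per stratum, and pool the odds ratios. A minimal sketch of the Mantel-Haenszel common odds ratio follows; all stratum counts are invented for illustration, and the ETS delta rescaling (-2.35 ln α) is shown alongside it.

```python
import math

def mantel_haenszel_dif(strata):
    """Mantel-Haenszel common odds ratio across ability strata.

    Each stratum is (ref_correct, ref_wrong, focal_correct, focal_wrong).
    alpha near 1.0 suggests no DIF; the ETS delta scale is -2.35 * ln(alpha),
    with negative delta indicating the item favours the reference group.
    """
    num = den = 0.0
    for a, b, c, d in strata:
        t = a + b + c + d
        if t == 0:
            continue
        num += a * d / t
        den += b * c / t
    alpha = num / den
    return alpha, -2.35 * math.log(alpha)

# Hypothetical counts for one item in five total-score strata
strata = [
    (40, 60, 30, 70),
    (55, 45, 45, 55),
    (70, 30, 60, 40),
    (85, 15, 75, 25),
    (95,  5, 90, 10),
]
alpha, delta = mantel_haenszel_dif(strata)
```

With these invented counts the pooled odds ratio exceeds 1, i.e. the reference group does better than matched focal-group examinees on this item.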

  8. Previous Methodology (3) Previous methods used the simple sum of scores as a measure of ability. With classical test theory . . . “. . . perhaps the most important shortcoming is that examinee characteristics and test characteristics cannot be separated: each can be interpreted only in the context of the other.” (Hambleton, Swaminathan and Rogers, 1991, p. 2) The results from one test cannot be directly compared to the results from another test or another group of examinees.

  9. Item level methods with students matched on ability (IRT/Rasch) • Comparative plots • Simple parameter designs • Model comparison measures • Item Characteristic Curves (ICCs) • Area between ICCs
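The "area between ICCs" index in the list above can be sketched numerically: fit the item separately for each group, then integrate the absolute gap between the two item characteristic curves over the ability range. Under the Rasch model this unsigned area (over an unbounded range) equals the absolute difference between the two group difficulty estimates; the difficulties and integration range below are invented for illustration.

```python
import math

def icc(theta, b):
    """Rasch item characteristic curve: P(correct | ability theta, difficulty b)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def unsigned_area(b_ref, b_focal, lo=-4.0, hi=4.0, n=2000):
    """Unsigned area between two group ICCs, via the trapezoid rule
    on a grid of n intervals over [lo, hi]."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        theta = lo + i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * abs(icc(theta, b_ref) - icc(theta, b_focal)) * h
    return total

# Hypothetical group difficulties: -0.5 for one group, +0.3 for the other.
# Over an infinite range the Rasch area would be exactly |(-0.5) - 0.3| = 0.8;
# truncating at +/-4 clips a little from each tail.
area = unsigned_area(-0.5, 0.3)
```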

  10. Methods for Identifying DIF in Polytomous Items • Group means (Garner & Engelhard, 1999); • Standardised mean differences, correlations and covariance analyses (Pomplun and Capps, 1999); • Factor analysis (Wang, 1998); • Logistic discriminant function analysis (Hamilton & Snow, 1998; Miller & Spray, 1993); • Polynomial loglinear model (Hanson & Feinstein, 1997); • Mantel-Haenszel test and the generalised Mantel-Haenszel test (Cohen, Kim, & Wollack, 1998; Hamilton & Snow, 1998; Henderson, 2001); • Lord’s (1980) Chi-Square test (Cohen, Kim, & Wollack, 1998); • likelihood ratio test (Cohen, Kim, & Wollack, 1998; Kim, Cohen, Di Stefano & Kim, 1998); • Separate calibration and comparative plots, and between-fit (Smith, 1994; Smith 1996); • Poly-SIBTEST (Chang, Mazzeo & Roussos, 1996; Henderson, 2001; Zwick, Theyer & Mazzeo, 1997); and • Raju’s (1988; 1990) signed and unsigned area tests.

  11. Using RUMM to Identify DIF in Polytomous Items (1) • RUMM produces separate Expected Value Curves (EVCs) for each group being considered • EVCs are based on mean scores for sub-groups (from the actual data) partitioned according to ability
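For a polytomous item the model-based expected value curve can be computed from Rasch's partial credit model, E[X | θ] = Σ k·P_k(θ). The sketch below is the model EVC only, not RUMM's group-specific observed class-interval means, and the thresholds are invented for illustration.

```python
import math

def pcm_expected_value(theta, thresholds):
    """Expected score on a polytomous item under the partial credit model.

    thresholds: ordered category threshold locations delta_1..delta_m.
    Category weights: psi_0 = 1, psi_k = exp(sum_{j<=k} (theta - delta_j));
    P_k = psi_k / sum(psi), and the expected value is sum_k k * P_k.
    """
    psis = [1.0]
    cum = 0.0
    for d in thresholds:
        cum += theta - d
        psis.append(math.exp(cum))
    total = sum(psis)
    return sum(k * psi for k, psi in enumerate(psis)) / total

# A three-category item (scores 0, 1, 2) with invented thresholds -0.5 and +0.5:
# by symmetry the expected score at theta = 0 is exactly 1.0
ev = pcm_expected_value(0.0, [-0.5, 0.5])
```

Plotting this curve separately for each group (with each group's own item parameter estimates) gives the pair of EVCs whose divergence signals DIF.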

  12. Using RUMM to Identify DIF in Polytomous Items (2) Using the same data points on which the EVCs are based, RUMM calculates an Analysis of Variance (ANOVA) to assess the amount of DIF. [Figure: Expected Value Curves showing DIF for gender]
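The ANOVA behind this slide operates on person-item residuals (observed score minus model expectation), grouped by ability class interval and by the person factor of interest (e.g. gender). Below is a deliberately simplified one-way sketch for a dichotomous item, testing only the group main effect on residuals; the data are invented, and RUMM's actual test additionally includes class intervals and a class-by-group interaction.

```python
import math

def expected_score(theta, b):
    """Rasch expected score for a dichotomous item (= P(correct))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def dif_anova(persons, b):
    """One-way ANOVA F statistic for a group effect on item residuals.

    persons: list of (theta, group, score) with score in {0, 1}.
    A large F suggests one group systematically out- or under-performs
    its model expectation on this item, i.e. uniform DIF.
    """
    residuals = {}
    for theta, group, score in persons:
        residuals.setdefault(group, []).append(score - expected_score(theta, b))
    all_res = [r for rs in residuals.values() for r in rs]
    grand = sum(all_res) / len(all_res)
    ss_between = sum(len(rs) * (sum(rs) / len(rs) - grand) ** 2
                     for rs in residuals.values())
    ss_within = sum((r - sum(rs) / len(rs)) ** 2
                    for rs in residuals.values() for r in rs)
    df_b = len(residuals) - 1
    df_w = len(all_res) - len(residuals)
    return (ss_between / df_b) / (ss_within / df_w)

# Invented extreme case: males always succeed, females always fail,
# on an item of difficulty 0 -- a large group effect should appear.
persons = [(-1.0, "M", 1), (0.0, "M", 1), (1.0, "M", 1),
           (-1.0, "F", 0), (0.0, "F", 0), (1.0, "F", 0)]
f_stat = dif_anova(persons, b=0.0)
```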

  13. Using RUMM to Identify DIF in Polytomous Items (3) Extract from Analysis of Variance Results for DIF – ITEM 1 [I0001:Q21a i]

  14. Context of this study • New South Wales Higher School Certificate (HSC) • Mathematics in Society (MIS) examination • N = 2630 (from a total of 22,828 candidates in MIS): 1130 males, 1481 females

  15. MIS Exam paper

  16. Example of a Question in MIS

  17. Classifying Mathematical Skills, Knowledge and Understandings

  18. Sample of identifying Skills necessary for deriving correct answers

  19. Number Of Students Attempting Each Question

  20. Analyses of Data • Data for 71 items were analysed using RUMM 2010 • Item locations ranged from -2.909 to +2.246 • 9 items showed poor fit to the model (based on residuals and chi-square values)
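The residual- and chi-square-based fit check mentioned here can be sketched for a dichotomous item: partition persons into ability class intervals and compare each interval's observed proportion correct with the Rasch expectation. All counts and proportions below are invented, and RUMM's actual fit statistics differ in detail.

```python
import math

def rasch_p(theta, b):
    """Rasch probability of a correct response."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_fit_chisq(class_intervals, b):
    """Approximate item-fit chi-square over ability class intervals.

    class_intervals: list of (mean theta, n persons, observed proportion
    correct). Each term is a squared standardised residual; a large total
    relative to the degrees of freedom flags misfit to the model.
    """
    chisq = 0.0
    for theta, n, obs_prop in class_intervals:
        p = rasch_p(theta, b)
        var = p * (1 - p) / n  # variance of the observed proportion
        chisq += (obs_prop - p) ** 2 / var
    return chisq, len(class_intervals) - 1

# Invented class intervals for a well-fitting item of difficulty 0.0:
# observed proportions track the model curve closely, so chi-square is small
intervals = [(-1.5, 100, 0.19), (-0.5, 100, 0.38),
             (0.5, 100, 0.63), (1.5, 100, 0.81)]
chisq, df = item_fit_chisq(intervals, 0.0)
```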

  21. EVCs for Item with DIF

  22. EVCs for Item with no DIF

  23. Using EVCs to understand Item Functioning (1)

  24. Using EVCs to understand Item Functioning (2)

  25. Using EVCs to understand Item Functioning (3)

  26. Summary of ANOVA for Items exhibiting DIF

  27. Part Questions Exhibiting DIF by Content Area

  28. Topic Areas Which Appear Easier By Gender

  29. Skills According To DIF

  30. Items Involving the Skill Of Substituting

  31. Future Directions . . . • Verifying actual results: • Are these results consistent? • Across time & other cohorts • Across other mathematics courses • Across other states • What are the causes of the DIF? • When and where do these differences first appear? • Are they due to teaching strategy or inherent weaknesses / differences?

  32. General Comments • Identification of items where DIF is evident can be linked to actual curriculum areas. • Identifying skills which lead to DIF can indicate where students need more support. • The methodology demonstrated can be used for polytomous items and constructed response items in any subject area.

  33. Questions . . .

  34. Comparative Plot for DIF
