
Diagnostic testing that just might make a difference





1. Diagnostic testing that just might make a difference
Dylan Wiliam
Institute of Education, University of London
www.dylanwiliam.net

2. Diagnostic Items in Mathematics and Science: Project rationale
• Traditional testing deals with individuals, but teachers mostly deal with groups
• Data-push vs. decision-pull
  • Data-push
    • Quality control at the end of an instructional sequence
    • Monitoring assessment
    • Identifies that remediation is required, but not what
    • Requires new routines to utilize the information
  • Decision-pull
    • Starts with the decisions teachers make daily
    • Supports teachers' "on-the-fly" decisions
• If a 30-item test provides useful information on an individual, then responses from 30 individuals on a single item might provide useful information on a class
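The 30-responses-to-one-item idea can be sketched in code. This is a hedged illustration, not part of the DIMS materials: the item choices, counts, and rule labels below are all hypothetical, and the point is only the shape of the "decision-pull" move, i.e. tallying a class's answers to a single question and reading off which conception each popular response suggests.

```python
from collections import Counter

def summarize_item_responses(responses, distractor_rules):
    """Tally one class's answers to a single diagnostic item and pair
    each response with the cognitive rule believed to produce it.

    responses: list of answer choices, e.g. ["A", "C", "C", ...]
    distractor_rules: dict mapping each choice to a (hypothetical)
    rule label; unknown choices map to "unknown".
    """
    counts = Counter(responses)
    return [(choice, n, distractor_rules.get(choice, "unknown"))
            for choice, n in counts.most_common()]

# A hypothetical class of 30 responses to one four-option item,
# where "C" is the key and the other choices encode misconceptions.
responses = ["B"] * 14 + ["C"] * 10 + ["A"] * 4 + ["D"] * 2
rules = {"A": "unit confusion", "B": "additive strategy",
         "C": "correct", "D": "place-value error"}
summary = summarize_item_responses(responses, rules)
for choice, n, rule in summary:
    print(f"{choice}: {n} students -> {rule}")
```

Here the teacher's real-time decision is immediate: 14 of 30 students appear to share one misconception, so the next few minutes of the lesson can target it.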

3. Premises of the DIMS project
• Single, well-developed questions provide a reasonably strong basis for real-time instructional decision-making
• Questions with principled and interpretable incorrect answers can be used to elicit evidence of student thinking
• Such questions, when used with all-student response systems and followed by discussion, can advance student learning
• Such questions can deepen teacher knowledge (both content knowledge and pedagogical content knowledge)

4. The DIMS project in practice
• Focuses on 4th- and 8th-grade mathematics and science
• Includes a bank of approximately 150 diagnostic questions per subject/grade level
• The question bank is a complementary resource for any curriculum, not a curriculum replacement
• Items are to be used one at a time, within the flow of day-to-day lessons
• Focuses on collecting evidence about student learning in order to adapt instruction to meet students' learning needs in real time

5. Item development
• Review of state standards
• Review of relevant literature on student conceptions
• Identification of significant (mis-)conceptions
• Item construction
  • appears to be very difficult for many item-writers
  • a distractor-stem-key approach worked for some
• Item review and editing
• Expert review
• Piloting
• Final review and editing

6. DIMS meets traditional psychometrics
• Concept-based distractors increase difficulty
• Most IRT models assume
  • Monotonicity
  • Equivalence of incorrect responses
• The need to analyze single items makes most theories difficult to apply, or irrelevant
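For concreteness, a standard IRT model makes both assumptions visible. In the two-parameter logistic model (a textbook formula, not one given on the slide), the probability that student with ability $\theta$ answers item $i$ correctly is

```latex
P(X_i = 1 \mid \theta) = \frac{1}{1 + e^{-a_i(\theta - b_i)}}
```

With discrimination $a_i > 0$ this curve is monotone increasing in $\theta$, and the model scores each response only as $X_i \in \{0, 1\}$, collapsing every distractor into the single category $X_i = 0$. Those are exactly the two assumptions that concept-based distractors, whose choice of wrong answer carries information, are designed to violate.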

7. Semi-dense items (Bart, Post, Lesh & Behr, 1994)
Five properties:
• Response interpretability
• Response discrimination
• Rule discrimination
• Exhaustive rule set usage
• Semi-density

8. Response interpretability
Each response is interpretable by at least one cognitive rule.

9. Response discrimination
Each response is interpretable by exactly one cognitive rule.

10. Rule discrimination
Response discrimination + uniqueness of the cognitive rules that interpret a response.

11. Exhaustive rule set usage
Response discrimination + every cognitive rule interprets at least one response to the item.

12. Semi-density
Exactly one cognitive rule interprets each response to the item, and each cognitive rule interprets exactly one response.
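The five properties can all be read as conditions on the relation between an item's responses and the cognitive rules that interpret them. A minimal sketch, under the assumption that this relation is recorded as a dict (the response and rule labels below are illustrative, not from the DIMS bank):

```python
def check_item_properties(interpretation, all_rules):
    """Check semi-density-style properties of a diagnostic item.

    interpretation: dict mapping each response to the set of cognitive
    rules believed to interpret it.
    all_rules: every cognitive rule under consideration for the item.
    """
    used_rules = set().union(*interpretation.values())
    one_rule_each = all(len(rules) == 1 for rules in interpretation.values())
    # Count how many responses each rule interprets.
    counts = {}
    for rules in interpretation.values():
        for rule in rules:
            counts[rule] = counts.get(rule, 0) + 1
    return {
        # Each response interpretable by at least one rule.
        "response_interpretability": all(len(r) >= 1
                                         for r in interpretation.values()),
        # Each response interpretable by exactly one rule.
        "response_discrimination": one_rule_each,
        # Every rule interprets at least one response.
        "exhaustive_rule_set_usage": used_rules == set(all_rules),
        # Semi-density: a bijection between responses and the rule set.
        "semi_density": (one_rule_each
                         and used_rules == set(all_rules)
                         and all(c == 1 for c in counts.values())),
    }

# A four-option item where each response pairs with exactly one rule.
item = {"9": {"rule_1"}, "12": {"rule_2"}, "15": {"rule_3"},
        "8+g": {"correct"}}
props = check_item_properties(item, {"rule_1", "rule_2", "rule_3", "correct"})
print(props)
```

An item whose responses each map to exactly one rule, with no rule left unused, satisfies semi-density; adding a response that two rules can produce breaks response discrimination immediately.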

13. Discriminating incorrect cognitive rules (Hart, 1981)
Version 1: If e+f = 8, then e+f+g = ?
  Responses: 9, 12, 15, 8+g
Version 2: If f+g = 8, then f+g+h = ?
  Responses: 9, 12, 15, 16, 8+h

14. Discriminating correct cognitive rules (Bart et al., 1994)
Ann and Kathy each bought the same kind of bubble gum at the same store. Ann bought two pieces of gum for six cents. If Kathy bought eight pieces of gum, how much did she pay?
Version 1 responses: 12, 24
Version 2 responses:
• 24 cents, because 8×3=24, or 8 pieces × 3 cents per piece = 24 cents.
• 24 cents, because 4×6=24.
• 24 cents, because 2/6=8/x and 2x=48, for which x=24.
• 24 cents, because 2/6=8/x and 2/6 × 4/4 = 8/24.
• 24 cents, because 2/6=4/12=6/18=8/24.
• 24 cents, because 2/6=4/12=8/24.
• 12 cents, because 2+4=6 implies that 8+4=12.

15. Discriminating between incorrect and correct cognitive rules
Version 1: There are two flights per day from Newtown to Oldtown. The first flight leaves Newtown each day at 9:20 and arrives in Oldtown at 10:55. The second flight from Newtown leaves at 2:15. At what time does the second flight arrive in Oldtown? Show your work.
Version 2: There are two flights per day from Newtown to Oldtown. The first flight leaves Newtown each day at 9:05 and arrives in Oldtown at 10:55. The second flight from Newtown leaves at 2:15. At what time does the second flight arrive in Oldtown? Show your work.
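One way to see why changing 9:20 to 9:05 matters: a plausible incorrect rule (my reconstruction, not stated on the slide) is to manipulate clock times as if they were ordinary decimal numbers. In Version 1 that faulty rule happens to yield the correct arrival time, so the item cannot separate it from correct time arithmetic; in Version 2 the final addition carries across an hour boundary and the two rules diverge:

```python
def correct_rule(dep_h, dep_m, arr_h, arr_m, dep2_h, dep2_m):
    """Correct rule: flight-1 duration in minutes, added to flight 2."""
    duration = (arr_h * 60 + arr_m) - (dep_h * 60 + dep_m)
    total = dep2_h * 60 + dep2_m + duration
    return f"{total // 60}:{total % 60:02d}"

def decimal_rule(dep_h, dep_m, arr_h, arr_m, dep2_h, dep2_m):
    """Hypothetical incorrect rule: treat h:mm as the decimal h.mm."""
    duration = round((arr_h + arr_m / 100) - (dep_h + dep_m / 100), 2)
    total = round((dep2_h + dep2_m / 100) + duration, 2)
    return f"{int(total)}:{round((total - int(total)) * 100):02d}"

# Version 1: first flight 9:20 -> 10:55, second flight leaves 2:15.
print(correct_rule(9, 20, 10, 55, 2, 15))  # 3:50
print(decimal_rule(9, 20, 10, 55, 2, 15))  # 3:50 -- same response
# Version 2: first flight 9:05 -> 10:55, second flight leaves 2:15.
print(correct_rule(9, 5, 10, 55, 2, 15))   # 4:05
print(decimal_rule(9, 5, 10, 55, 2, 15))   # 3:65 -- impossible time
```

Only Version 2 assigns the two rules different responses, so only Version 2 lets the teacher tell them apart from the answer (and the tell-tale "3:65" is itself interpretable).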

16. Discriminating between incorrect and correct cognitive rules (2)
[Two multiple-choice items with options A–D, shown as images in the original slides]

17. Conclusion
[Diagram contrasting correct and incorrect cognitive rules, shown as an image in the original slides]

18. Conclusion (2)
• For an item to support instructional decision-making, the key requirement is that in no case do incorrect and correct cognitive rules map onto the same response.
• If this property is met, then the semi-density properties are less important.
• If this property is not met, then the semi-density properties are irrelevant.
• The discovery of new incorrect cognitive rules that interpret item keys leads to item improvement.
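The key requirement in the first bullet can be checked mechanically. A small sketch (the rule and response labels are hypothetical): an item fails the moment any single response is interpretable by both a correct and an incorrect rule.

```python
def supports_decisions(interpretation, correct_rules):
    """interpretation: dict {response: set of cognitive rules}.
    correct_rules: the subset of those rules that are correct.
    Returns False if some response is interpreted by both a correct
    and an incorrect rule; True otherwise.
    """
    for rules in interpretation.values():
        has_correct = any(r in correct_rules for r in rules)
        has_incorrect = any(r not in correct_rules for r in rules)
        if has_correct and has_incorrect:
            return False
    return True

# Hypothetical: response "24" reachable both by a correct proportional
# rule and by an incorrect rule -> the item cannot discriminate.
bad_item = {"24": {"proportional", "lucky_guess_rule"},
            "12": {"additive"}}
good_item = {"24": {"proportional"}, "12": {"additive"}}
print(supports_decisions(bad_item, {"proportional"}))   # False
print(supports_decisions(good_item, {"proportional"}))  # True
```

This also matches the last bullet: discovering a new incorrect rule that interprets the key flips an item from True to False, which is precisely the signal that the item needs revision.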

19. Conclusion (3)
• Data on the impact on student achievement are not yet available
• Evidence of impact on:
  • Teachers' practice
  • Student engagement
  • Student affect

20. Questions? Comments?
