
Embedded formative assessment: still more rhetoric than reality

Embedded formative assessment: still more rhetoric than reality. National Conference of The Schools Network 2011 Dylan Wiliam. www.dylanwiliam.net. Origins and antecedents. Feedback (Wiener, 1948) Developing range-finders for anti-aircraft guns



Presentation Transcript


  1. Embedded formative assessment: still more rhetoric than reality National Conference of The Schools Network 2011 Dylan Wiliam www.dylanwiliam.net

  2. Origins and antecedents • Feedback (Wiener, 1948) • Developing range-finders for anti-aircraft guns • Effective action requires a closed system within which • Actions taken within the system are evaluated • Evaluation of the actions leads to modification of future actions • Two kinds of loops • Positive (bad: leads to collapse or explosive growth) • Negative (good: leads to stability) • “Feedback is information about the gap between the actual level and the reference level of a system parameter which is used to alter the gap in some way” (Ramaprasad, 1983 p. 4) • Feedback and instructional correctives (Bloom)
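The closed loop described above can be sketched in a few lines of code. This is an illustrative toy model, not anything from the talk: the function, levels, and gains are invented to show why negative feedback leads to stability while positive feedback leads to explosive growth.

```python
def run(level, reference, gain, steps=20):
    """Repeatedly feed the gap between the actual level and the
    reference level back into the system (Ramaprasad's sense of feedback).

    gain < 0: negative feedback, the gap shrinks each step (stability)
    gain > 0: positive feedback, the gap is amplified (explosive growth)
    """
    for _ in range(steps):
        gap = level - reference      # actual minus reference level
        level = level + gain * gap   # the action taken alters the gap
    return level

stable = run(level=30.0, reference=20.0, gain=-0.5)    # settles near 20
unstable = run(level=30.0, reference=20.0, gain=0.5)   # runs away from 20
```

Each step multiplies the gap by (1 + gain), so any gain between -2 and 0 closes the gap, while any positive gain amplifies it.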

  3. What’s wrong with the feedback metaphor? In education, feedback is usually: • any information given to the student about their current performance (in engineering terms, that’s just data) • or, at best, information that compares current performance with desired performance (that’s just a thermostat) • Much rarer is information that can be used by learners to improve (that’s a feedback system)

  4. Feedback has complex effects • 264 low- and high-ability grade 6 students in 12 classes in 4 schools; analysis of 132 students at the top and bottom of each class • Same teaching, same aims, same teachers, same classwork • Three kinds of feedback: scores, comments, scores + comments Butler (1988) Br. J. Educ. Psychol., 58, 1-14

  5. Responses What do you think happened for the students given both scores and comments? • Gain: 30%; Attitude: all positive • Gain: 30%; Attitude: high scorers positive, low scorers negative • Gain: 0%; Attitude: all positive • Gain: 0%; Attitude: high scorers positive, low scorers negative • Something else

  6. Students and grades

  7. Feedback is not always effective • 200 grade 5 and 6 Israeli students • Divergent thinking tasks • 4 matched groups • experimental group 1 (EG1): comments • experimental group 2 (EG2): grades • experimental group 3 (EG3): praise • control group (CG): no feedback • Achievement • EG1>(EG2≈EG3≈CG) • Ego-involvement • (EG2≈EG3)>(EG1≈CG) Butler (1987) J. Educ. Psychol., 79, 474-482

  8. Feedback should feed forward • 80 Grade 8 Canadian students learning to write major scales in Music • Experimental group 1 (EG1) given • written praise • list of weaknesses • workplan • Experimental group 2 (EG2) given • oral feedback • nature of errors • chance to correct errors • Control group (CG) given • no feedback • Achievement: EG2>(EG1≈CG) Boulet et al. (1990) J. Educational Research, 84, 119-125

  9. …and should leave learning with the learner • ‘Peekability’ (Simmonds & Cope, 1993) • Pairs of students, aged 9-11 • Angle and rotation problems • class 1 worked on paper • class 2 worked on a computer, using Logo • Class 1 outperformed class 2 • ‘Scaffolding’ (Day & Cordón, 1993) • 2 grade 3 classes • class 1 given ‘scaffolded’ response • class 2 given solution when stuck • Class 1 outperformed class 2

  10. Effects of feedback • Kluger & DeNisi (1996) • Review of 3,000 research reports • Excluding those: • without adequate controls • with poor design • with fewer than 10 participants • where performance was not measured • without details of effect sizes • left 131 reports, 607 effect sizes, involving 12,652 individuals • On average, feedback does improve performance, but • effect sizes were very different in different studies • 40% of effect sizes were negative
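The last two bullets can be made concrete with a small calculation. The effect sizes below are invented for illustration (they are not Kluger and DeNisi's data); the point is that a positive average effect can coexist with a substantial share of negative effects.

```python
# Invented effect sizes for ten hypothetical feedback studies
effects = [0.8, 0.6, 0.5, 0.4, 0.3, 0.2, -0.1, -0.2, -0.3, -0.4]

# A clearly positive average effect...
mean_effect = sum(effects) / len(effects)

# ...even though a large share of individual effects are negative
share_negative = sum(1 for d in effects if d < 0) / len(effects)
```

Here the mean effect is 0.18 standard deviations, yet 40% of the individual effects are negative, mirroring the pattern the review found.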

  11. Getting feedback right is hard

  12. Feedback practice audit How often do students receive ‘feedback’ in the form of scores, levels, sub-levels, or grades? For each of Key Stages 1 to 3, Key Stage 4, and Key Stage 5: • Every week • Every two or three weeks • Every month or half-term • Termly/twice a year • Annually

  13. Kinds of feedback (Nyquist, 2003) • Weaker feedback only • Knowledge of results (KoR) • Feedback only • KoR + clear goals or knowledge of correct results (KCR) • Weak formative assessment • KCR + explanation (KCR+e) • Moderate formative assessment • (KCR+e) + specific actions for gap reduction • Strong formative assessment • (KCR+e) + activity

  14. Effects of formative assessment (HE)

  15. Feedback practice audit 2 In your school, what proportion of feedback events involve students in responding to the feedback provided immediately, and in class? • Less than 10% • 10% to 30% • 30% to 70% • 70% to 90% • More than 90%

  16. Unfortunately, humans are not machines… • Attribution (Dweck, 2000) • Personalization (internal v external) • Permanence (stable v unstable) • Essential that students attribute both failures and successes to internal, unstable causes (it’s down to you, and you can do something about it)

  17. Mindset • Views of ‘ability’ • fixed (IQ) • incremental (untapped potential) • Essential that teachers inculcate in their students a view that ‘ability’ is incremental rather than fixed (by working, you’re getting smarter)

  18. Force-field analysis (Lewin, 1954) • What are the forces that will support or drive the adoption of formative assessment practices in your school/authority? (+) • What are the forces that will constrain or prevent the adoption of formative assessment practices in your school/authority? (−)

  19. “Flow” • A dancer describes how it feels when a performance is going well: “Your concentration is very complete. Your mind isn’t wandering, you are not thinking of something else; you are totally involved in what you are doing. … Your energy is flowing very smoothly. You feel relaxed, comfortable and energetic.” • A rock climber describes how it feels when he is scaling a mountain: “You are so involved in what you are doing [that] you aren’t thinking of yourself as separate from the immediate activity. … You don’t see yourself as separate from what you are doing.” • A mother who enjoys the time spent with her small daughter: “Her reading is the one thing she’s really into, and we read together. She reads to me and I read to her, and that’s a time when I sort of lose touch with the rest of the world, I’m totally absorbed in what I’m doing.” • A chess player tells of playing in a tournament: “… the concentration is like breathing—you never think of it. The roof could fall in and, if it missed you, you would be unaware of it.” (Csikszentmihalyi, 1990, pp. 53–54)

  20. Motivation: cause or effect? [Figure: Csikszentmihalyi’s (1990) flow model, plotting challenge (low to high) against competence (low to high), with regions labelled anxiety, arousal, flow, control, worry, relaxation, apathy, and boredom]

  21. Providing feedback that moves learning on • Key idea: feedback should: • Cause thinking • Provide guidance on how to improve • Comment-only marking • Focused marking • Explicit reference to mark schemes/scoring guide • Suggestions on how to improve: • Not giving complete solutions • Re-timing assessment: • e.g., three-quarters-of-the-way-through-a-unit test

  22. A blossoming of research reviews… • Fuchs & Fuchs (1986) • Natriello (1987) • Crooks (1988) • Bangert-Drowns, et al. (1991) • Dempster (1991, 1992) • Elshout-Mohr (1994) • Kluger & DeNisi (1996) • Black & Wiliam (1998) • Nyquist (2003) • Brookhart (2004) • Allal & Lopez (2005) • Köller (2005) • Brookhart (2007) • Wiliam (2007) • Hattie & Timperley (2007) • Shute (2008)

  23. Effects of formative assessment Standardized effect size: differences in means, measured in population standard deviations
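The definition on this slide is the familiar standardized effect size (Cohen's d). A minimal sketch, with invented scores, assuming a pooled standard deviation in the denominator:

```python
import statistics

def cohens_d(treatment, control):
    """Difference in group means, measured in pooled standard deviations."""
    n1, n2 = len(treatment), len(control)
    v1, v2 = statistics.variance(treatment), statistics.variance(control)
    # Pool the two sample variances, weighted by degrees of freedom
    pooled_sd = (((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

# Invented test scores, for illustration only
treatment = [14, 15, 16, 17, 18]   # e.g. taught with formative assessment
control = [12, 13, 14, 15, 16]     # e.g. taught without
```

With these numbers the means differ by 2 points against a pooled standard deviation of about 1.58, giving an effect size of roughly 1.26.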

  24. Problems with effect sizes • Restriction of range • Sensitivity to instruction • Ambiguous comparisons
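The first of these problems, restriction of range, is easy to demonstrate numerically. In this invented example the raw gain is identical, but measuring it against a restricted sample, which has a smaller standard deviation, inflates the standardized effect size:

```python
import statistics

def effect_size(gain, scores):
    """Mean gain divided by the standard deviation of the comparison scores."""
    return gain / statistics.stdev(scores)

full_range = [5, 8, 10, 12, 15, 18, 20, 22, 25]   # scores across the whole population
restricted = [10, 11, 12, 13, 14, 15, 16]         # e.g. a single age band only

d_full = effect_size(2.0, full_range)        # about 0.30
d_restricted = effect_size(2.0, restricted)  # about 0.93: same gain, bigger 'effect'
```

The same 2-point gain looks three times larger when the spread of the comparison group is restricted, which is one reason effect sizes from different studies are hard to compare.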

  25. Definitions of formative assessment • “We use the general term assessment to refer to all those activities undertaken by teachers—and by their students in assessing themselves—that provide information to be used as feedback to modify teaching and learning activities. Such assessment becomes formative assessment when the evidence is actually used to adapt the teaching to meet student needs” (Black & Wiliam, 1998, p. 140) • “the process used by teachers and students to recognise and respond to student learning in order to enhance that learning, during the learning” (Cowie & Bell, 1999, p. 32) • “assessment carried out during the instructional process for the purpose of improving teaching or learning” (Shepard et al., 2005, p. 275)

  26. • “Formative assessment refers to frequent, interactive assessments of students’ progress and understanding to identify learning needs and adjust teaching appropriately” (Looney, 2005, p. 21) • “A formative assessment is a tool that teachers use to measure student grasp of specific topics and skills they are teaching. It’s a ‘midstream’ tool to identify specific student misconceptions and mistakes while the material is being taught” (Kahl, 2005, p. 11)

  27. • “Assessment for Learning is the process of seeking and interpreting evidence for use by learners and their teachers to decide where the learners are in their learning, where they need to go and how best to get there” (Broadfoot et al., 2002, pp. 2-3) • “Assessment for learning is any assessment for which the first priority in its design and practice is to serve the purpose of promoting students’ learning. It thus differs from assessment designed primarily to serve the purposes of accountability, or of ranking, or of certifying competence. An assessment activity can help learning if it provides information that teachers and their students can use as feedback in assessing themselves and one another and in modifying the teaching and learning activities in which they are engaged. Such assessment becomes ‘formative assessment’ when the evidence is actually used to adapt the teaching work to meet learning needs.” (Black et al., 2004, p. 10)

  28. Which of these is formative? • A science adviser uses test results to plan professional development workshops for teachers • Teachers doing item-by-item analysis of KS2 math tests to review their curriculum • A school tests students every 10 weeks to predict which students are “on course” to pass a big test • “Three fourths” of the way through a unit test • Exit pass question: “What is the difference between mass and weight?” • “Sketch the graph of y = 1/(1 + x²) on your mini-dry-erase boards.”

  29. What does formative assessment form?

  30. Formative assessment: a new definition • “An assessment functions formatively to the extent that evidence about student achievement elicited by the assessment is interpreted and used to make decisions about the next steps in instruction that are likely to be better, or better founded, than the decisions that would have been taken in the absence of that evidence.” (Wiliam, 2009) • Formative assessment involves the creation of, and capitalization upon, moments of contingency in the regulation of learning processes.

  31. Unpacking formative assessment • Key processes • Establishing where the learners are in their learning • Establishing where they are going • Working out how to get there • Participants • Teachers • Peers • Learners

  32. Unpacking formative assessment • Teacher: clarifying, sharing and understanding learning intentions (where the learner is going); engineering effective discussions, tasks, and activities that elicit evidence of learning (where the learner is); providing feedback that moves learners forward (how to get there) • Peer: activating students as learning resources for one another • Learner: activating students as owners of their own learning

  33. Five “key strategies”… • Clarifying, sharing, and understanding learning intentions • curriculum philosophy • Engineering effective classroom discussions, tasks and activities that elicit evidence of learning • classroom discourse, interactive whole-class teaching • Providing feedback that moves learners forward • feedback • Activating students as learning resources for one another • collaborative learning, reciprocal teaching, peer-assessment • Activating students as owners of their own learning • metacognition, motivation, interest, attribution, self-assessment Wiliam & Thompson (2007)

  34. Unpacking formative assessment • Teacher: clarifying, sharing and understanding learning intentions (where the learner is going); using evidence of achievement to adapt what happens in classrooms to meet learner needs (where the learner is, and how to get there)

  35. Clarifying, sharing, and understanding learning intentions

  36. Sharing learning intentions • 3 teachers, each teaching 4 Year 8 science classes in two US schools • 14-week experiment • 7 two-week projects, each scored 2-10 • All teaching the same, except: • for a part of each week • two of each teacher’s classes discussed their likes and dislikes about the teaching (control) • the other two classes discussed how their work would be assessed White & Frederiksen, Cognition & Instruction, 16(1), 1998

  37. Sharing learning intentions

  38. Outcomes • Who will benefit most from the reflective assessment? • Higher achievers • Average achievers • Lower achievers • All students will benefit equally

  39. Sharing learning intentions

  40. Sharing learning intentions

  41. Sharing learning intentions • Explain learning intentions at start of lesson/unit: • Learning intentions • Success criteria • Consider providing learning intentions and success criteria in students’ language. • Use posters of key words to talk about learning: • e.g., describe, explain, evaluate • Use planning and writing frames judiciously • Use annotated examples of different standards to “flesh out” assessment rubrics (e.g., lab reports) • Provide opportunities for students to design their own tests

  42. Engineering effective discussions, activities, and classroom tasks that elicit evidence of learning

  43. Eliciting evidence • Key idea: questioning should • cause thinking • provide data that informs teaching • Improving teacher questioning • generating questions with colleagues • closed v open • low-order v high-order • appropriate wait-time

  44. Medicine Hat Tigers • A major junior (ice) hockey team playing in the Central Division of the Eastern Conference of the Western Hockey League in Canada • Players are aged from 15 to 20 • 15-year-olds are only allowed to play five games until their own season has ended • Each team is allowed only three 20-year-olds • Total roster: 25 players

  45. Medicine Hat Tigers

  46. Eliciting evidence • Getting away from I-R-E • basketball rather than serial table-tennis • ‘No hands up’ (except to ask a question) • ‘Hot Seat’ questioning • All-student response systems • ABCD cards, Mini white-boards, Exit passes

  47. Nothing new under the sun…

  48. Eliciting evidence practice audit In what proportion of lessons in your school would a teacher use an ‘all student response’ system at least every 30 minutes? • Less than 10% • 10% to 30% • 30% to 70% • 70% to 90% • More than 90%

  49. Hinge questions • A hinge question is based on the important concept in a lesson that is critical for students to understand before you move on in the lesson. • The question should fall about midway during the lesson. • Every student must respond to the question within two minutes. • You must be able to collect and interpret the responses from all students in 30 seconds
