
Using assessment to support learning: why, what and how?


Presentation Transcript


  1. Using assessment to support learning: why, what and how? Dylan Wiliam, Institute of Education www.dylanwiliam.net

  2. Overview of presentation • Why raising achievement is important • Why investing in teachers is the answer • Why assessment for learning should be the focus • Why teacher learning communities should be the mechanism • How we can put this into practice

  3. Raising achievement matters • For individuals • Increased lifetime salary • Improved health • For society • Lower criminal justice costs • Lower health-care costs • Increased economic growth

  4. Where’s the solution? • Structure • Small high schools • K-8 schools • Alignment • Curriculum reform • Textbook replacement • Governance • Charter schools • Vouchers • Technology

  5. School effectiveness • 3 generations of effectiveness research • Raw results approaches • Different schools get different results • Conclusion: Schools make a difference • Demographic-based approaches • Demographic factors account for most of the variation • Conclusion: Schools don’t make a difference • Value-added approaches • School-level differences in value-added are relatively small • Classroom-level differences in value-added are large • Conclusion: An effective school is little more than a school full of effective classrooms

  6. It’s the classroom • Variability at the classroom level is up to 4 times greater than at school level • It’s not class size • It’s not the between-class grouping strategy • It’s not the within-class grouping strategy • It’s the teacher

  7. Teacher quality • A labor force issue with 2 solutions • Replace existing teachers with better ones? • No evidence that more pay brings in better teachers • No evidence that there are better teachers out there deterred by certification requirements • Improve the effectiveness of existing teachers • The “love the one you’re with” strategy • It can be done • We know how to do it, but at scale? Quickly? Sustainably?

  8. Cost/effect comparisons

  9. Learning power environments • Key concept: • Teachers do not create learning • Learners create learning • Teaching as engineering learning environments • Key features: • Create student engagement (pedagogies of engagement) • Well-regulated (pedagogies of contingency)

  10. Why pedagogies of engagement? • Intelligence is partly inherited • So what? • Intelligence is partly environmental • Environment creates intelligence • Intelligence creates environment • Learning environments • High cognitive demand • Inclusive • Obligatory

  11. Why pedagogies of contingency? • Several major reviews of the research • Natriello (1987) • Crooks (1988) • Kluger & DeNisi (1996) • Black & Wiliam (1998) • Nyquist (2003) • All find consistent, substantial effects

  12. Types of formative assessment • Long-cycle • Span: across units, terms • Length: four weeks to one year • Medium-cycle • Span: within and between teaching units • Length: one to four weeks • Short-cycle • Span: within and between lessons • Length: • day-by-day: 24 to 48 hours • minute-by-minute: 5 seconds to 2 hours

  13. Effects of formative assessment • Long-cycle • Student monitoring • Curriculum alignment • Medium-cycle • Improved, student-involved, assessment • Improved teacher cognition about learning • Short-cycle • Improved classroom practice • Improved student engagement

  14. Kinds of feedback (Nyquist, 2003) • Weaker feedback only • Knowledge of results (KoR) • Feedback only • KoR + clear goals or knowledge of correct results (KCR) • Weak formative assessment • KCR + explanation (KCR+e) • Moderate formative assessment • (KCR+e) + specific actions for gap reduction • Strong formative assessment • (KCR+e) + activity

  15. Effect of formative assessment (HE)

  16. Formative assessment • Classroom assessment is not (necessarily) formative assessment • Formative assessment is not (necessarily) classroom assessment

  17. Formative assessment Assessment for learning is any assessment for which the first priority in its design and practice is to serve the purpose of promoting pupils’ learning. It thus differs from assessment designed primarily to serve the purposes of accountability, or of ranking, or of certifying competence. An assessment activity can help learning if it provides information to be used as feedback, by teachers, and by their pupils, in assessing themselves and each other, to modify the teaching and learning activities in which they are engaged. Such assessment becomes ‘formative assessment’ when the evidence is actually used to adapt the teaching work to meet learning needs. Black et al., 2002

  18. Feedback and formative assessment “Feedback is information about the gap between the actual level and the reference level of a system parameter which is used to alter the gap in some way” (Ramaprasad, 1983 p. 4) • Three key instructional processes • Establishing where learners are in their learning • Establishing where they are going • Establishing how to get there

  19. Aspects of formative assessment

  20. Five Key Strategies …

  21. …and one big idea • Use evidence about learning to adapt instruction to meet student needs

  22. Keeping Learning on Track (KLT) • A pilot guides a plane or boat toward its destination by taking constant readings and making careful adjustments in response to wind, currents, weather, etc. • A KLT teacher does the same: • Plans a carefully chosen route ahead of time (in essence building the track) • Takes readings along the way • Changes course as conditions dictate

  23. Questioning

  24. Kinds of questions: Israel • Which fraction is the smallest? Success rate 88% • Which fraction is the largest? Success rate 46%; 39% chose (b) • [The fraction options shown on the original slide are not reproduced in this transcript] [Vinner, PME conference, Lahti, Finland, 1997]

  25. Misconceptions

  26. Misconceptions • 3a = 24 • a + b = 16

  27. Molecular structure of water?

  28. Feedback

  29. Kinds of feedback: Israel • 264 low and high ability grade 6 students in 12 classes in 4 schools; analysis of 132 students at top and bottom of each class • Same teaching, same aims, same teachers, same classwork • Three kinds of feedback: scores, comments, scores+comments
Feedback | Gain | Attitude
scores | none | top +ve, bottom -ve
comments | 30% | all +ve
[Butler (1988) Br. J. Educ. Psychol., 58, 1-14]

  30. Responses
Feedback | Gain | Attitude
scores | none | top +ve, bottom -ve
comments | 30% | all +ve
What do you think happened for the students given both scores and comments? • A: Gain: 30%; Attitude: all +ve • B: Gain: 30%; Attitude: top +ve, bottom -ve • C: Gain: 0%; Attitude: all +ve • D: Gain: 0%; Attitude: top +ve, bottom -ve • E: Something else [Butler (1988) Br. J. Educ. Psychol., 58, 1-14]

  31. Kinds of feedback: Israel (2) • 200 grade 5 and 6 Israeli students • Divergent thinking tasks • 4 matched groups • experimental group 1 (EG1): comments • experimental group 2 (EG2): grades • experimental group 3 (EG3): praise • control group (CG): no feedback • Achievement • EG1 > (EG2 ≈ EG3 ≈ CG) • Ego-involvement • (EG2 ≈ EG3) > (EG1 ≈ CG) [Butler (1987) J. Educ. Psychol., 79, 474-482]

  32. Effects of feedback • Kluger & DeNisi (1996) • Review of 3000 research reports • Excluding those: • without adequate controls • with poor design • with fewer than 10 participants • where performance was not measured • without details of effect sizes • left 131 reports, 607 effect sizes, involving 12652 individuals • Average effect size 0.4, but • Effect sizes very variable • 40% of effect sizes were negative

  33. Feedback • Formative assessment requires • data on the actual level of some measurable attribute; • data on the reference level of that attribute; • a mechanism for comparing the two levels and generating information about the ‘gap’ between the two levels; • a mechanism by which the information can be used to alter the gap. • Feedback is therefore formative only if the information fed back is actually used in closing the gap.
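The four requirements above map directly onto a simple control loop. A minimal sketch (the function and its messages are my own illustration, not from the slides) showing gap detection followed by a prompt to act:

```python
def formative_feedback(actual, reference):
    """Compare the actual level of an attribute with the reference level,
    and turn the resulting gap into information that can be acted on."""
    gap = reference - actual  # mechanism for comparing the two levels
    if gap <= 0:
        return "goal met: no action needed"
    # The information is formative only if it is used to alter the gap;
    # reporting a score alone would be mere knowledge of results.
    return f"gap of {gap}: plan specific steps to close it"

print(formative_feedback(3, 5))
```

The point of the sketch is the final step: without a mechanism that turns the gap into action, the loop stays open and, in Ramaprasad's terms, the information is not feedback at all.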

  34. Formative assessment • Frequent feedback is not necessarily formative • Feedback that causes improvement is not necessarily formative • Assessment is formative only if the information fed back to the learner is used by the learner in making improvements • To be formative, assessment must include a recipe for future action

  35. How do students make sense of this? • Attribution (Dweck, 2000) • Personalization (internal v external) • Permanence (stable v unstable) • Essential that students attribute both failures and successes to internal, unstable causes. (It’s down to you, and you can do something about it.) • Views of ‘ability’ • Fixed (IQ) • Incremental (untapped potential) • Essential that teachers inculcate in their students a view that ‘ability’ is incremental rather than fixed (by working, you’re getting smarter)

  36. Sharing learning intentions

  37. Sharing criteria with learners • 3 teachers each teaching 4 grade 7 science classes in two US schools • 14-week experiment • 7 two-week projects, scored 2-10 • All teaching the same, except: • For a part of each week • Two of each teacher’s classes discuss their likes and dislikes about the teaching (control) • The other two classes discuss how their work will be assessed [Frederiksen & White, AERA conference, Chicago, 1997]

  38. Sharing criteria with learners • Iowa Test of Basic Skills scores by group:
Group | Low | Middle | High
Likes and dislikes | 4.6 | 5.9 | 6.6
Reflective assessment | 6.7 | 7.2 | 7.4

  39. Peer- and self-assessment

  40. Self-assessment: Portugal • Teachers studying for MA in Education • Group 1 do regular programme • Group 2 work on self-assessment for 2 terms (20 weeks) • Teachers matched in age, qualifications and experience, using the same curriculum scheme for the same amount of time • Pupils tested at beginning of year, and again after two terms • Group 1 pupils improve by 7.8 points • Group 2 pupils improve by 15 points [Fontana & Fernandes (1994) Br. J. Educ. Psychol., 64, 407-417]

  41. Putting it into practice

  42. Eliciting evidence of student achievement by engineering effective classroom discussions, questions and learning tasks

  43. Practical techniques: questioning • Key idea: questioning should • cause thinking • provide data that informs teaching • Improving teacher questioning • generating questions with colleagues • closed v open • low-order v high-order • appropriate wait-time • Getting away from I-R-E • basketball rather than serial table-tennis • ‘No hands up’ (except to ask a question) • class polls to review current attitudes towards an issue • ‘Hot Seat’ questioning • All-student response systems • ABCD cards, mini whiteboards, exit passes

  44. Questioning in math: discussion Look at the following sequence: 3, 7, 11, 15, 19, …. Which is the best rule to describe the sequence? • n + 4 • 3 + n • 4n - 1 • 4n + 3
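The intended answer can be checked mechanically. A quick sketch (not part of the original slides; the rule names mirror the options above) testing each candidate as an nth-term formula for n = 1 to 5:

```python
sequence = [3, 7, 11, 15, 19]

# Each option read as an nth-term rule, with n = 1, 2, 3, ...
rules = {
    "n + 4":  lambda n: n + 4,
    "3 + n":  lambda n: 3 + n,
    "4n - 1": lambda n: 4 * n - 1,
    "4n + 3": lambda n: 4 * n + 3,
}

matches = [name for name, rule in rules.items()
           if [rule(n) for n in range(1, 6)] == sequence]
print(matches)  # only "4n - 1" reproduces the sequence
```

The discussion value of the item is that "n + 4" is correct when read as a recurrence (each term is the previous term plus 4) but wrong as an nth-term formula, so students defending different options surface exactly that distinction.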

  45. Questioning in math: diagnosis In which of these right triangles is a² + b² = c²? [Six triangles labelled A-F, with sides marked a, b and c in varying positions — diagrams not reproduced]

  46. Questioning in science: discussion Ice-cubes are added to a glass of water. What happens to the level of the water as the ice-cubes melt? • The level of the water drops • The level of the water stays the same • The level of the water increases • You need more information to be sure

  47. Questioning in science: diagnosis The ball sitting on the table is not moving. It is not moving because: • no forces are pushing or pulling on the ball • gravity is pulling down, but the table is in the way • the table pushes up with the same force that gravity pulls down • gravity is holding it onto the table • there is a force inside the ball keeping it from rolling off the table [Wilson & Draney, 2004]

  48. Dinosaur extinction Why did dinosaurs become extinct? • A) Humans destroyed their habitat • B) Humans killed them all for food • C) There was a major change in climate
