
Research perspectives and formative assessment

ASME Conference: Researching Medical Education, November 2009, RIBA, London. Dylan Wiliam.



Presentation Transcript


  1. Research perspectives and formative assessment ASME Conference: Researching Medical Education, November 2009: RIBA, London Dylan Wiliam

  2. Overview
  • The nature of educational research
    • What should educational research try to do?
    • How should it try to do it?
  • Formative assessment
    • Definitions
    • Implementations
  • Researching formative assessment

  3. Pasteur’s quadrant

  4. Educational research
  • “An elusive science” (Lagemann, 2000)
    • A search for disciplinary foundations
  • Making social science matter (Flyvbjerg, 2001)
    • Contrast between analytic rationality and value-rationality
    • Physical science succeeds when it focuses on analytic rationality
    • Social science
      • fails when it focuses on analytic rationality, but
      • succeeds when it focuses on value-rationality

  5. Research methods 101: causality
  • Does X cause Y?
    • In the presence of X, Y happened (factual)
    • Problem: post hoc ergo propter hoc
  • Desired inference: if X had not happened, Y would not have happened (counterfactual)
    • Problem: X did happen
  • So we need to create a parallel world where X did not happen
    • Same group, different time (baseline measurement)
      • Need to assume stability over time
    • Different group, same time (control group)
      • Need to assume groups are equivalent
      • Randomized controlled trial
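The counterfactual logic above can be illustrated with a small simulation (a hypothetical data-generating process, not data from the talk): when abler students self-select into the treatment, the naive difference in group means conflates the treatment effect with the ability gap; random assignment removes the confound and recovers the true effect.

```python
import random

random.seed(1)

def outcome(ability, treated):
    # Hypothetical process: the true effect of X on Y is +5,
    # but abler students score higher regardless of treatment.
    return 50 + 10 * ability + (5 if treated else 0) + random.gauss(0, 2)

students = [random.random() for _ in range(10000)]  # latent ability

# Self-selection: abler students opt into the treatment (confounding)
self_selected = [(a, a > 0.5) for a in students]
# Randomization: treatment assigned by coin flip, independent of ability
randomized = [(a, random.random() < 0.5) for a in students]

def naive_effect(groups):
    """Difference in mean outcomes between treated and untreated."""
    treated = [outcome(a, True) for a, t in groups if t]
    control = [outcome(a, False) for a, t in groups if not t]
    return sum(treated) / len(treated) - sum(control) / len(control)

biased = naive_effect(self_selected)    # roughly 10: true effect + ability gap
unbiased = naive_effect(randomized)     # close to 5, the true effect
print(round(biased, 1), round(unbiased, 1))
```

The randomized comparison warrants the counterfactual inference precisely because assignment is independent of everything else that drives the outcome.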

  6. Plausible rival hypotheses
  • Example: smoking cigarettes causes lung cancer
    • Randomized controlled trial not possible
    • Have to rely on other methods
  • Logic of inference-making
    • Establish the warrant for chosen inferences
    • Establish that plausible rival interpretations are less warranted

  7. Knowledge
  • Not justified-true-belief
  • Discriminability (Goldman, 1976)
    • Elimination of plausible rival hypotheses
  • Building knowledge involves:
    • marshalling evidence to support the desired inference
    • eliminating plausible rival interpretations
  • ‘Plausible’ determined by reference to a theory, a community of practice, or a dominant discourse

  8. Inquiry systems (Churchman, 1971)
  System      Evidence
  Leibnizian  Rationality
  Lockean     Observation
  Kantian     Representation
  Hegelian    Dialectic
  Singerian   Values, ethics and practical consequences

  9. Inquiry systems The Lockean inquirer displays the ‘fundamental’ data that all experts agree are accurate and relevant, and then builds a consistent story out of these. The Kantian inquirer displays the same story from different points of view, emphasising thereby that what is put into the story by the internal mode of representation is not given from the outside. But the Hegelian inquirer, using the same data, tells two stories, one supporting the most prominent policy on one side, the other supporting the most promising story on the other side (Churchman, 1971 p. 177).

  10. Singerian inquiry systems The ‘is taken to be’ is a self-imposed imperative of the community. Taken in the context of the whole Singerian theory of inquiry and progress, the imperative has the status of an ethical judgment. That is, the community judges that to accept its instruction is to bring about a suitable tactic or strategy [...]. The acceptance may lead to social actions outside of inquiry, or to new kinds of inquiry, or whatever. Part of the community’s judgement is concerned with the appropriateness of these actions from an ethical point of view. Hence the linguistic puzzle which bothered some empiricists—how the inquiring system can pass linguistically from “is” statements to “ought” statements— is no puzzle at all in the Singerian inquirer: the inquiring system speaks exclusively in the “ought,” the “is” being only a convenient façon de parler when one wants to block out the uncertainty in the discourse. (Churchman, 1971: 202).

  11. Educational research
  • …can be characterised as a never-ending process of assembling evidence that:
    • particular inferences are warranted on the basis of the available evidence;
    • such inferences are more warranted than plausible rival inferences;
    • the consequences of such inferences are ethically defensible.
  • The basis for warrants, the other plausible interpretations, and the ethical bases for defending the consequences, are themselves constantly open to scrutiny and question.

  12. Effective learning environments
  • A prevalent, mistaken, view
    • Teachers create learning
    • The teacher’s job is to do the learning for the learner
  • A not-so-prevalent, not quite so mistaken, but equally dangerous view
    • Only learners can create learning
    • The teacher’s job is to “facilitate” learning
  • A difficult-to-negotiate middle path
    • Teaching as the engineering of effective learning environments
    • Key features:
      • Create student engagement (pedagogies of engagement)
      • Well-regulated (pedagogies of contingency)
      • Develop habits of mind (pedagogies of formation)

  13. Formative assessment: a definition
  • “An assessment functions formatively to the extent that evidence about student achievement elicited by the assessment is interpreted and used to make decisions about the next steps in instruction that are likely to be better, or better founded, than the decisions that would have been taken in the absence of that evidence.
  • Formative assessment therefore involves the creation of, and capitalization upon, moments of contingency (short, medium and long cycle) in instruction with a view to regulating learning (proactive, interactive, and retroactive).” (Wiliam, 2009)

  14. The formative assessment hi-jack…
  • Long-cycle
    • Span: across units, terms
    • Length: four weeks to one year
    • Impact: student monitoring; curriculum alignment
  • Medium-cycle
    • Span: within and between teaching units
    • Length: one to four weeks
    • Impact: improved, student-involved assessment; teacher cognition about learning
  • Short-cycle
    • Span: within and between lessons
    • Length:
      • day-by-day: 24 to 48 hours
      • minute-by-minute: 5 seconds to 2 hours
    • Impact: classroom practice; student engagement

  15. Unpacking assessment for learning
  • Key processes
    • Establishing where the learners are in their learning
    • Establishing where they are going
    • Working out how to get there
  • Participants
    • Teachers
    • Peers
    • Learners

  16. Five “key strategies”…
  • Clarifying, understanding, and sharing learning intentions
    • curriculum philosophy
  • Engineering effective classroom discussions, tasks and activities that elicit evidence of learning
    • classroom discourse, interactive whole-class teaching
  • Providing feedback that moves learners forward
    • feedback
  • Activating students as learning resources for one another
    • collaborative learning, reciprocal teaching, peer-assessment
  • Activating students as owners of their own learning
    • metacognition, motivation, interest, attribution, self-assessment
  (Wiliam & Thompson, 2007)

  17. …and one big idea • Use evidence about learning to adapt instruction to better meet learner needs

  18. A model for professional change
  • Content
    • Evidence
    • Ideas
  • Process
    • Choice
    • Flexibility
    • Small steps
    • Accountability
    • Support

  19. KMO Formative Assessment Project
  • 24 teachers, each developing their practice in individual ways
  • Different outcome variables
  • No possibility of standardized controls
  • “Polyexperiment” with “local design”
  • Synthesis by standardized effect size
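Synthesis by standardized effect size means expressing each class's result as the mean difference between groups divided by their pooled standard deviation, so that outcomes measured on different scales become comparable. A minimal sketch of that computation (the scores are illustrative, not the project's data):

```python
from statistics import mean, stdev

def cohens_d(experimental, control):
    """Standardized mean difference: (mean_e - mean_c) / pooled SD."""
    n_e, n_c = len(experimental), len(control)
    s_e, s_c = stdev(experimental), stdev(control)
    # Pool the two sample variances, weighted by degrees of freedom
    pooled = (((n_e - 1) * s_e**2 + (n_c - 1) * s_c**2) / (n_e + n_c - 2)) ** 0.5
    return (mean(experimental) - mean(control)) / pooled

# Illustrative test scores for one class comparison
print(round(cohens_d([62, 71, 68, 75, 70], [60, 65, 63, 69, 64]), 2))  # → 1.22
```

Because each comparison yields a dimensionless number, effect sizes from classes with different outcome measures can be averaged in a single synthesis.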

  20. Jack-knife estimate of mean effect size: 0.32; 95% C.I. [0.16, 0.48]
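The jack-knife estimate above can be sketched as follows: recompute the mean effect size with each class left out in turn, then use the spread of those leave-one-out means to estimate the standard error and a normal-approximation 95% interval. The effect sizes below are illustrative, not the project's:

```python
from statistics import mean

def jackknife_ci(effect_sizes, z=1.96):
    """Leave-one-out jackknife mean with a normal-approximation 95% CI."""
    n = len(effect_sizes)
    # Mean recomputed with each observation left out in turn
    loo = [mean(effect_sizes[:i] + effect_sizes[i + 1:]) for i in range(n)]
    jack_mean = mean(loo)
    # Jackknife variance of the estimator
    var = (n - 1) / n * sum((m - jack_mean) ** 2 for m in loo)
    se = var ** 0.5
    return jack_mean, (jack_mean - z * se, jack_mean + z * se)

# Illustrative per-class effect sizes (not the project's data)
m, (lo, hi) = jackknife_ci([0.1, 0.2, 0.3, 0.4, 0.5, 0.6])
print(round(m, 2), round(lo, 2), round(hi, 2))  # → 0.35 0.2 0.5
```

For the plain mean the jackknife reproduces the usual standard error; its value in a "polyexperiment" is that the same recipe works for estimators with no simple closed-form error.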

  21. Effect size by comparison type
  I  Parallel set taught by same teacher in same year
  S  Similar set taught by same teacher in previous year
  P  Parallel set taught by different teacher in same year
  L  Similar set taught by different teacher in previous year
  D  Non-parallel set taught by different teacher in same year
  N  National norms

  22. Summary
  • Educational research is a never-completed process of assembling evidence that:
    • particular inferences are warranted on the basis of the available evidence;
    • such inferences are more warranted than plausible rival inferences;
    • the consequences of such inferences are ethically defensible.
  • The basis for each of these is constantly open to scrutiny and question.
