
Assessment for learning in science: Issues in learning to interpret student work


Presentation Transcript


1. Assessment for learning in science: Issues in learning to interpret student work
• Center for the Assessment and Evaluation of Student Learning (CAESL)
• University of California, Berkeley
• University of California, Los Angeles
• Lawrence Hall of Science

2. UCLA: Shaunna Clark, Joan Herman, Sam Nagashima, Ellen Osmundson, Terry Vendlinski
WestEd: Diane Carnahan, Karen Cerwin, Kathy DiRanna, Jo Topps
Lawrence Hall of Science: Lynn Barakos, Craig Strang
U.C. Berkeley: Maryl Gearhart, Jennifer Pfotenhauer, Cheryl Schwab

3. Presentation
• Program and participants
• Research framework
• Design and methods
• Selected findings
• Implications

4. Impetus for program
Situation
• assessments in materials of variable quality
• teachers lack expertise to revise
• professional practices not well established
Argument
• science education reform (NRC/NSES)
• known benefits of classroom assessment (e.g., Black & Wiliam, 1998; Sloane & Wilson, 2000)
• value of reflective practice and long-term collaboration (Garet et al., 2001)

5. CAESL Leadership Academy, 7/03 - 12/04
Principles
• integrated with practice
• long term
• collaborative
• reflective practice
Core strategy
• assessment portfolio

6. Academy participants

7. Program organization
Interwoven structures
• district vertical teams w/ administrators
• cross-district grade-level teams
• independent classroom implementation
Series of portfolios
• repeated opportunities to build expertise

8. Portfolio: I. Plan
Establish learning goals
• analyze ‘conceptual flow’ of materials
• align with standards
Select assessments
• choose key assessments to track progress: pre -> junctures -> post
• identify the concepts assessed
• anticipate ‘expected student responses’
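As an illustrative aside (not part of the original slides), the plan stage amounts to a small data structure mapping each assessment point, pre -> junctures -> post, to a task and the concepts it assesses. A minimal Python sketch, with all task and concept names hypothetical:

```python
# Hypothetical sketch of a unit assessment plan; every name here is invented
assessment_plan = {
    "pre":        {"task": "pre-unit probe",         "concepts": ["states of matter"]},
    "juncture 1": {"task": "mid-unit journal entry", "concepts": ["evaporation"]},
    "juncture 2": {"task": "investigation write-up", "concepts": ["condensation"]},
    "post":       {"task": "end-of-unit test",       "concepts": ["evaporation", "condensation", "water cycle"]},
}

# 'Expected student responses' anticipated per concept, from naive to target ideas
expected_responses = {
    "evaporation": ["the water disappears (naive)", "the water becomes vapor (target)"],
}
```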

9. Portfolio: II. Implementation
Interpret student work
• refine assessment criteria
• score
• chart and identify trends
Use evidence
• document instructional follow-up and feedback

10. Portfolio: III. Evaluate & revise
Evaluate using student work
• alignment with goals and instruction
• quality of tasks and criteria
• methods of analysis
Revise and strengthen

11. Portfolio Completion
Rated for completeness:
• Complete: I, II (some student work), III
• Partial: I or III, II (some)
• Minimal: I or III only
• None (but participating)
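For illustration only, the completeness rule above can be written as a small decision function. A minimal Python sketch, assuming each portfolio is simply checked for its three parts; edge cases the slide leaves open (e.g., I and III present but no II) are resolved one plausible way:

```python
def rate_completion(has_plan: bool, has_student_work: bool, has_evaluation: bool) -> str:
    """Rate a portfolio's completeness per the rule on the slide above.

    Parts: I = Plan, II = Implementation (student work), III = Evaluate & revise.
    """
    if has_plan and has_student_work and has_evaluation:
        return "Complete"   # I, II (some student work), III
    if (has_plan or has_evaluation) and has_student_work:
        return "Partial"    # I or III, II (some)
    if has_plan or has_evaluation:
        return "Minimal"    # I or III only
    return "None"           # participating, but no portfolio parts

# e.g., a plan plus some student work, but no evaluation section:
print(rate_completion(True, True, False))  # -> Partial
```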

12. Portfolio Completion

13. Study Focus
• Growth in understanding and practice
• Supports and barriers
Longitudinal, nested design
• 18 months = 3 portfolios
• Cohort: surveys, focus groups, portfolios
• Cases: interviews and observations

14. Framework for classroom assessment expertise
• Understanding of assessment concepts
• Facility with assessment practices

15. UNDERSTANDING ASSESSMENT CONCEPTS (diagram): quality goals for student learning and progress; quality tools; quality use

16. UNDERSTANDING ASSESSMENT CONCEPTS (diagram): quality goals for student learning and progress; quality tools; quality use; sound interpretation

17. CLASSROOM ASSESSMENT PRACTICES (cycle diagram): establish goals & design assessment plan; revise assessments; provide instruction; use information to guide teaching & learning; assess; interpret student work

18. USING CONCEPTS TO GUIDE PRACTICE (cycle diagram, highlighting "interpret student work")
Soundness of interpretation:
• criteria capture student understanding?
• scoring consistent?
• interpretation appropriate to purpose?

19. USING CONCEPTS TO GUIDE PRACTICE (identical to slide 18)

20. Selected findings
Portfolios: patterns of change
• assessment criteria
• analysis of whole-class patterns
• alignment
Exit survey and focus groups
• perceived growth
• supports, barriers, needs

21. Patterns in portfolios
Source
• series of 2 or 3 portfolios (n ≈ 10)
Issues & constraints
• burden of documentation
• paper & pencil assessments
• professional choice

22. Assessment criteria
• from global/holistic to more specific, differentiated, and assessable
• from focus on surface features to efforts to capture student understanding
• from dichotomous (right/wrong) to attention to qualitative levels of understanding
• but … quality variable despite teacher interest (example: reliance on content analysis or notes)

23. Whole-class analysis
• from few analyses to efforts at systematic analysis using charting or content analysis
• from global patterns toward more differentiated analysis (item analysis, item clustering) and efforts to coordinate group & individual patterns
• efforts to analyze progress (especially pre-post)
• but … information often unintegrated, inferences unsystematic, comparisons inappropriate
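To make "charting" and "item analysis" concrete: a minimal Python sketch of a whole-class tally, not drawn from the study's data; the concepts and 0-3 score levels are hypothetical:

```python
from collections import Counter

# Hypothetical whole-class scores: one dict per student, mapping each
# assessed concept to a qualitative level of understanding (0-3)
class_scores = [
    {"evaporation": 3, "condensation": 1},
    {"evaporation": 2, "condensation": 2},
    {"evaporation": 3, "condensation": 0},
    {"evaporation": 1, "condensation": 1},
]

# Item analysis: distribution of levels per concept, so whole-class
# patterns stand out (here, condensation lags evaporation)
items = sorted({concept for scores in class_scores for concept in scores})
for item in items:
    distribution = Counter(scores[item] for scores in class_scores if item in scores)
    print(item, dict(sorted(distribution.items())))
```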

24. Alignment
• efforts to align interpretations with learning goals, tasks, and criteria
• efforts to revise criteria to strengthen alignment
• fewer inferences about ancillary skills not assessed
• but … problematic alignment of assessments and inferences to track progress

25. Exit survey
• Understanding of CAESL: 1 (none) <--> 5 (full)
• Implementation of CAESL: 1 (none) <--> 4 (full) <--> 5 (beyond)

26. UNDERSTANDING

27. IMPLEMENTATION

28. Exit focus groups

29. PRACTICES STRENGTHENED MOST? (cycle diagram: establish goals & design assessment plan; revise assessments; provide instruction; use information to guide teaching & learning; assess; interpret student work)

30. PRACTICES STRENGTHENED LEAST? (same cycle diagram)

31. PRACTICES MOST IMPORTANT? (same cycle diagram)

32. Supports
Portfolio
• establishing goals
• revision and re-application
• resource for collaboration
Resources
• CAESL framework
• articles and books
• grade-level teams & facilitators
• long-term program

33. Barriers
Portfolio
• assessment development
• only paper and pencil
• focus on performance assessments
• time
Resources
• weak assessments
• limited frameworks
• no clear models for progress
• gaps in teacher knowledge
Context
• standards
• testing
• school & district

34. Barriers (repeats slide 33), plus:
Unnamed challenges
• inquiry
• ancillary skills

35. Requests
Resources
• embedded assessments
• handbook
• conceptual development
• grade-level collaboration
• coaching and facilitation
Portfolio … if:
• streamline
• focus on goals, interpretation, and use
• refinement not development
• expand assessment types
Context
• align with district and state assessments

36. Implications
• Strengthen materials & resources
• Expand to unit assessment systems
• Align assessment content and quality
• Modify program organization
