
Proficiency in Science: Assessment Challenges & Opportunities Jim Pellegrino


Presentation Transcript


  1. Proficiency in Science: Assessment Challenges & Opportunities Jim Pellegrino

  2. Chinese Curse? • There is a Chinese curse which says 'May he live in interesting times.' Like it or not we live in interesting times. They are times of danger and uncertainty; but they are also more open to the creative energy of men than any other time in history. • Robert Kennedy, 1966.

  3. New Definition of Competence • The NRC Science Framework & NGSS describe student competence as the intersection of knowledge involving: • important disciplinary practices, • core disciplinary ideas, and • crosscutting concepts, • with performance expectations representing the intersection of the three. • They view competence as something that develops over time and increases in sophistication and power as the product of coherent curriculum & instruction.

  4. Goals for Teaching & Learning (diagram: Practices, Core Ideas, Crosscutting Concepts) • Coherent investigations of core ideas across multiple years of schooling • More seamless blending of practices with core ideas • Performance expectations that require reasoning with core disciplinary ideas • explain, justify, predict, model, describe, prove, solve, illustrate, argue, etc.

  5. Aligning Curriculum, Instruction & Assessment (diagram: Standards)

  6. Challenges for Assessment • To develop assessment tasks that integrate the three dimensions. • To develop tasks that can assess where a student can be placed along a sequence of progressively more complex understandings of a given core idea, and successively more sophisticated applications of practices and crosscutting concepts. • To develop assessment tasks that measure the connections between the different strands of disciplinary core ideas (e.g., using understandings about chemical interactions from physical science to explain phenomena in biological contexts).

  7. 5 Things for Us to Talk About • assessment contexts, purposes and uses • the nature of assessment and the importance of research on learning • assessment design processes • affordances of technology • systems of assessment

  8. Contexts and Purposes • Educational assessment typically occurs in multiple contexts: • Small-scale: individual classrooms • Intermediate-scale: districts • Large-scale: states, nations • Within and across contexts it can be used to accomplish differing purposes: • Assist learning (formative) • Measure individual achievement (summative) • Evaluate programs (accountability)

  9. A Multiplicity of Actors & Needs • The reason we have so many different forms and types of assessment is that “one size does not fit all” • Educators at different levels of the system need different information at different time scales • They have differing priorities, they operate under different constraints, & there are tradeoffs in terms of time, money, and type of information needed

  10. 5 Things for Us to Talk About • assessment contexts, purposes and uses • the nature of assessment and the importance of research on learning • assessment design processes • affordances of technology • systems of assessment

  11. Assessment as a Process of Reasoning from Evidence (diagram: the assessment triangle, with cognition, observation, and interpretation at its corners) • cognition: theory, models, and data about how students represent knowledge & develop competence in the domain • observation: tasks or situations that allow one to observe students’ performance • interpretation: methods for making sense of the data • The three must be coordinated!

  12. Why Models of the Development of Domain Knowledge Are Critical • They tell us which aspects of knowledge are the important ones to assess • They give us strong clues as to how such knowledge can be assessed • They can lead to assessments that yield more instructionally useful information • diagnostic & prescriptive • They can guide the development of systems of assessments intended to cohere • across grades & contexts of use

  13. The Goldilocks Problem • Not too big, not too little, just right!!!! • Not too many things, not too few things, just the right grain size!!

  14. 5 Things for Us to Talk About • assessment contexts, purposes and uses • the nature of assessment and the importance of research on learning • assessment design processes • affordances of technology • systems of assessment

  15. Issues of Assessment Design & Development • Assessment design spaces vary tremendously & involve multiple dimensions • Type of knowledge and skill and levels of sophistication • Time period over which knowledge is acquired • Intended use and users of the information • Availability of detailed theories & data in the domain • Distance from instruction and assessment purpose • We need a principled process that can help structure the move from theory, data, and/or speculation to an operational assessment • Evidence-Centered Design

  16. Evidence-Centered Design (diagram: claim space → evidence → task) • Claim: Exactly what knowledge do you want students to have, and how do you want them to know it? • Evidence: What will you accept as evidence that a student has the desired knowledge? How will you analyze and interpret the evidence? • Task: What task(s) will the students perform to communicate their knowledge? • Need to consider what this might mean when it comes to performance expectations integrating Disciplinary Core Ideas & Practices
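
A minimal sketch of how these three design objects could be captured in software, to make the chain concrete (hypothetical Python; the class and field names are illustrative, not part of any ECD toolkit):

```python
from dataclasses import dataclass, field

@dataclass
class Claim:
    # What knowledge students should have, and how they should know it.
    statement: str

@dataclass
class Evidence:
    # What observable work counts as evidence, and how it is interpreted.
    description: str
    scoring_rule: str

@dataclass
class Task:
    # A situation that elicits the evidence supporting the claim.
    prompt: str
    evidence: list[Evidence] = field(default_factory=list)

# Illustrative example, loosely based on the food-web simulation on slide 25.
claim = Claim("Students can use a food-web model to predict the effect "
              "of changing one population on the rest of an ecosystem.")
task = Task(
    prompt="Remove the alewife from the simulated lake and predict what "
           "happens to the plankton population.",
    evidence=[Evidence(
        description="The prediction cites the feeding relationship "
                    "represented in the food web.",
        scoring_rule="Full credit if the direction of change is justified "
                     "by the model.",
    )],
)
```

The point of writing the objects down in this order is that the task is derived from the evidence, and the evidence from the claim, rather than the other way around.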

  17. AP Content & Science Practices (diagram: Core Ideas × Science Practices) • Use representations and models to communicate scientific phenomena and solve scientific problems. • Use mathematics appropriately. • Engage in scientific questioning to extend thinking or to guide investigations within the context of the AP course. • Plan and implement data collection strategies in relation to a particular scientific question. • Perform data analysis and evaluation of evidence. • Work with scientific explanations and theories. • Connect and relate knowledge across various scales, concepts, and representations in and across domains.

  18. Illustrative Claims and Evidence: AP Biology • Big Idea 1: The process of evolution drives the diversity and unity of life. • EU 1A: Change in the genetic makeup of a population over time is evolution. • L3 1A.3: Evolutionary change is driven by genetic drift and artificial selection. • Skill 6.4: The student can make claims and predictions about natural phenomena based on scientific theories and models. • The Claim: The student can make predictions about the effects of natural selection versus genetic drift on the evolution of both large and small populations of organisms. • The Evidence: The work will include a prediction of the effects of either natural selection or genetic drift on two populations of the same organism but of different sizes; the prediction includes a description of the change in the gene pool of a population; the work shows correctness of the connections made between the model and the prediction, and between the model and the phenomena (e.g., genetic drift may not happen in a large population of organisms; both natural selection and genetic drift result in the evolution of a population). • Suggested Proficiency Level: 4

  19. Connecting the Domain Model to Curriculum, Instruction, & Assessment

  20. Sample AP Bio Task

  21. 5 Things for Us to Talk About • assessment contexts, purposes and uses • the nature of assessment and the importance of research on learning • assessment design processes • affordances of technology • systems of assessment

  22. A Claim We Need to Embrace • Much of what is new, different, and important in the NRC Science Framework and NGSS cannot be adequately assessed by conventional methods, items, and measurement models • The capacity to engage in the practices as connected to important domain-specific ideas and understandings • We are interested in the processes of thinking as well as the products of those processes

  23. Advantages of Technology for Science Assessment • Present authentic, rich, dynamic environments • Present phenomena difficult or impossible to observe and manipulate in classrooms • Represent temporal, causal, dynamic relationships “in action” • Allow multiple representations of stimuli and their simultaneous interactions (e.g., data generated during a process) • Allow overlays of representations and symbols • Allow student manipulations/investigations and multiple trials • Allow student control of pacing, replay, and reiteration • Capture student responses during research, design, and problem solving

  24. SimScientists: Levels of Analysis and Understanding

  25. Life Science Simulation • Students view an animation to observe relationships among organisms, then draw a food web illustrating those relationships.

  26. Life Science Simulation In the experiment that you just analyzed, the amount of alewife was set to 20 at the beginning. Another student hypothesized that the result might be very different if she started with a larger or smaller amount of alewife at the beginning. Run three experiments to test that hypothesis. At the end of each experiment record your data by taking pictures of the resulting graphs. After three runs, you will be shown your results and asked if it makes any difference if the beginning amount of alewife is larger or smaller than 20.
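
To make the experimental design concrete, here is a minimal sketch of the kind of parameter-variation runs the task asks for (hypothetical Python; the toy growth model and its parameter values are illustrative stand-ins, not the actual SimScientists model):

```python
# Toy discrete-time predator-prey run: vary the starting alewife amount
# and compare outcomes. Not the actual SimScientists ecosystem model.

def run_experiment(alewife0: float, plankton0: float = 100.0,
                   steps: int = 50) -> float:
    """Advance a toy ecosystem model; return the final alewife amount."""
    alewife, plankton = alewife0, plankton0
    for _ in range(steps):
        eaten = 0.02 * alewife * plankton          # predation on plankton
        plankton += 0.30 * plankton - eaten        # growth minus predation
        alewife += 0.10 * eaten - 0.20 * alewife   # food gained minus mortality
        plankton, alewife = max(plankton, 0.0), max(alewife, 0.0)
    return alewife

# Three runs testing the hypothesis that the starting amount matters.
for start in (5, 20, 80):
    print(f"start={start:3d} -> final alewife ~ {run_experiment(start):.1f}")
```

Comparing the three printed outcomes is the analogue of the student's three recorded graphs: if the final values differ qualitatively, the starting amount matters.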

  27. Embedded Assessments with Formative Feedback

  28. 5 Things for Us to Talk About • assessment contexts, purposes and uses • the nature of assessment and the importance of research on learning • assessment design processes • affordances of technology • systems of assessment

  29. Desired end product is a multilevel system • Each level fulfills a clear set of functions and has a clear set of intended users of the assessment information • The assessment tools are designed to serve the intended purpose • Formative, summative, or accountability • Design is optimized for the function served • The levels are articulated and conceptually coherent • They share the same underlying conception of what the targets of learning are at a given grade level and what the evidence of attainment should be. • They provide information at a “grain size” and on the “time scale” appropriate for translation into action.

  30. What Such a System Might Look Like (diagram: a multilevel assessment system) • An integrated system • Coordinated across levels • Unified by common learning goals • Synchronized by unifying progress variables

  31. The Key Design Elements of Such a Comprehensive System • The system is designed to track progress over time • At the individual student level • At the aggregate group level • The system uses tasks, tools, and technologies appropriate to the desired inferences about student achievement • Doesn’t force everything into a fixed testing/task model • Uses a range of tasks: performances, portfolios, projects, fixed- and open-response tasks as needed

  32. Example 1: Aggregating Information Across Levels

  33. Example 2: Multi-level Task Correspondence

  34. Five Questions for Us to Ponder • What are some “Grand Challenges” for science assessment? • Where do we stand in meeting them? • What’s left to do? • Will we have tests worth teaching to? • Can we avoid the sins of the past?

  35. Science Assessment: Grand Challenges

  36. Where Do We Stand in Meeting These Challenges? • We have a much better sense of what the development of competence should mean and of the possible implications for designing coherent science education • We have examples of thinking through in detail the juxtaposition of disciplinary practices and core content knowledge to guide the design of assessment • AP Redesign Project • Multiple Assessment R&D Projects

  37. Connecting the Domain Model to Curriculum, Instruction, & Assessment

  38. Structure of the AP Biology Curriculum Framework • 4 Big Ideas • Enduring Understandings • Science Practices: Science Inquiry & Reasoning • Essential Knowledge • Learning Objectives
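
One way to read this structure, following the slide-18 example: a learning objective pairs a piece of essential knowledge with a science practice. A minimal sketch under that reading (hypothetical Python; the class names are illustrative, not College Board terminology for any software):

```python
from dataclasses import dataclass

@dataclass
class EssentialKnowledge:
    code: str       # e.g., "1A.3"
    statement: str

@dataclass
class SciencePractice:
    code: str       # e.g., "6.4"
    statement: str

@dataclass
class LearningObjective:
    # A learning objective crosses content with a practice.
    knowledge: EssentialKnowledge
    practice: SciencePractice

    def describe(self) -> str:
        return (f"Apply practice {self.practice.code} to content "
                f"{self.knowledge.code}.")

# The pairing behind the slide-18 claim.
lo = LearningObjective(
    EssentialKnowledge("1A.3", "Evolutionary change is driven by genetic "
                               "drift and artificial selection."),
    SciencePractice("6.4", "Make claims and predictions about natural "
                           "phenomena based on scientific theories and models."),
)
print(lo.describe())
```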

  39. Curriculum Framework: Big Ideas • The unifying concepts or Big Ideas increase coherence both within and across disciplines. There are a total of four Big Ideas: • Big Idea 1: The process of evolution drives the diversity and unity of life. • Big Idea 2: Biological systems utilize energy and molecular building blocks to grow, reproduce, and maintain homeostasis. • Big Idea 3: Living systems retrieve, transmit, and respond to information essential to life processes. • Big Idea 4: Biological systems interact, and these interactions possess complex properties.

  40. Immediate Impacts of AP Biology Changes • ~10,000-12,000 high school biology teachers across the country all changed the way they taught AP Biology…at the same time • Many teachers had to replace all of their laboratory investigations. All of them had to incorporate inquiry activities throughout the course, not just use inquiry in a few labs. For some, incorporating mathematical skills is a challenge. • In May, ~180,000 students took the new AP Biology exam

  41. ECD and the New AP Exam • No test items will focus on low cognitive level/declarative knowledge/recall • For each exam item, students will either produce the evidence (CR) or engage with the evidence (SR/MC): • explain • justify • predict • evaluate • describe • analyze • pose scientific questions • construct explanations • construct models • represent graphically • solve problems • select and apply mathematical routines

  42. What’s the Impact of Curriculum Changes on the New AP Biology Exam? • Because of the use of Big Ideas: in 2008, 12% of questions had something to do with evolution; on the new exam, 35% of questions do • Because of the emphasis on science practices and mathematical skills: new types of questions are being asked, e.g., grid-ins • Because of the use of evidence: the number of multiple-choice questions was reduced from 100 on last year’s exam to 63 on this year’s exam

  43. Lessons Learned from the AP Redesign Project • No pain, no gain: this is hard work! • Backwards Design and Evidence-Centered Design are challenging to execute & sustain • Requires multidisciplinary teams • Requires sustained effort and negotiation • Requires time, money & patience • Value added: validity is “designed in” from the start as opposed to “grafted on” • Elements of a validity argument are contained in the process and the products

  44. What’s Left to Do? – A LOT!!! • We need to translate the standards into effective models, methods, and materials for curriculum, instruction, and assessment. • Need clear performance expectations • Need precise claims & evidence statements • Need task models & templates • We need to use what we already know to evaluate and improve the assessments that are part of current practice, e.g., classroom assessments, large-scale exams, etc.
