
Aligning Science Assessment to Content Standards

Aligning Science Assessment to Content Standards. George DeBoer, Arhonda Gogos, Cari Herrmann Abell, Kristen Lennon, An Michiels, Tom Regan, Jo Ellen Roseman, Paula Wilson Center for Curriculum Materials in Science Knowledge Sharing Institute Ann Arbor, Michigan July 10-12, 2006



Presentation Transcript


  1. Project 2061: Student Assessment Aligning Science Assessment to Content Standards George DeBoer, Arhonda Gogos, Cari Herrmann Abell, Kristen Lennon, An Michiels, Tom Regan, Jo Ellen Roseman, Paula Wilson Center for Curriculum Materials in Science Knowledge Sharing Institute Ann Arbor, Michigan July 10-12, 2006 This work is funded by the National Science Foundation ESI 0352473

  2. Project 2061: Student Assessment Thanks to: • Abigail Burrows for organizing the pilot testing with schools. • Ed Krafsur for developing the assessment database. • Brian Sweeney for developing illustrations for test items.

  3. Project 2061: Student Assessment Strand 6: Part I Examining the Project 2061 Criteria for Aligning Middle School Assessment Items to Learning Goals

  4. Project 2061: Student Assessment Aligning Student Assessment to Content Standards What We Are Doing: Project Background • Creating a bank of middle and early high school science assessment items that are precisely aligned with national content standards • Providing resources to support the creation and use of assessment items aligned to content standards • Developing a database for these resources and a user interface to access the resources • In this session, we will focus on the criteria we use for judging alignment of assessment items to content standards.

  5. Project 2061: Student Assessment Resources We Will Provide • Clarifications of the content standards (elaboration, boundary setting, i.e., what’s in and what’s out). To add precision to the alignment of assessment items. • Summaries of research on student learning (misconceptions and other ideas students hold) related to the ideas in the content standards. To serve as distractors in assessment items. • Assessment maps (which include prerequisite ideas, related ideas, ideas that come later in the learning trajectory). Useful for developing test instruments on a specific topic. Also useful in item development for deciding what knowledge is reasonable to expect students to have (e.g., bedrock).

  6. Project 2061: Student Assessment List of Topics • Atoms, Molecules and States of Matter • Substances, Chemical Reactions and Conservation • Processes that shape the Earth / Plate Tectonics • Weather and Climate • Solar System • Energy Transformations • Force and Motion • Forces of Nature • Sight and Vision • Mathematics: Summarizing Data • Mathematics: Relationships among Variables

  7. Project 2061: Student Assessment List of Topics, Continued • Basic Functions in Humans • Cells and Proteins • Evolution and Natural Selection • Interdependence, Diversity and Survival • Matter and Energy Transformations in Living Systems • Sexual Reproduction, Genes and Heredity • Cross-cutting Themes: Models • Nature of Science: Claims of Causal Relationships • Nature of Science: Inductive Reasoning • Nature of Science: Empirical Validation of Ideas about the World • Nature of Science: Uncertainty and Durability

  8. Project 2061: Student Assessment Examples of: • Clarification statements • Summaries of research on student learning • Assessment maps • How each is used in the item development work.

  9. Project 2061: Student Assessment Idea B: All atoms are extremely small (from BSL 4D/M1a). • Students are expected to know that atoms are much smaller than very small items with which they are familiar, such as dust, blood cells, plant cells, and microorganisms, all of which are made up of atoms. Students should know that the atoms are so small that many millions of them make up these small items with which they are familiar. They should know that this is true for all atoms. The comparison with very small objects can be used to test students’ qualitative understanding of the size of atoms in relation to these objects. Students will not, however, be expected to know the actual size of atoms.

  10. Project 2061: Student Assessment Student Misconceptions Related to the Size of Atoms: • Atoms and/or molecules are similar in size to cells, dust, or bacteria (Lee et al., 1993; Nakhleh et al., 1999; Nakhleh et al., 2005). • Atoms and/or molecules can be seen with magnifying lenses or optical microscopes (Griffiths et al., 1992; Lee et al., 1993).

  11. Project 2061: Student Assessment

  12. Project 2061: Student Assessment Steps in the Item Development Procedure • Select a set of benchmarks and standards to define the boundaries of a topic • Tease apart the benchmarks and standards into a set of key ideas • Create an assessment map showing how the key ideas build on each other conceptually • Review the research on student learning to identify ideas students may have about the ideas • Design items: • using student misconceptions as distractors • using the assessment analysis criteria • following a list of design specifications

  13. Project 2061: Student Assessment Steps in the Item Development Procedure, cont. • Use open-ended interviewing to supplement published research on student learning • Use mini “item camps” to get feedback on items from staff • Revise items • Pilot test items and conduct think-aloud interviews • Analyze pilot test data • Revise items • Conduct formal reviews of approximately 25 items using the assessment analysis criteria • Revise items • Conduct national field test of items

  14. Project 2061: Student Assessment Demonstration of the Database and User Interface: • Items • Misconception List • Topics, key ideas, clarifications • Assessment Maps • Item Specifications
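To make the demonstration concrete, a record in an item bank like the one described above could be sketched as a simple data structure. This is an illustration only; the field names and the example item below are hypothetical, not the project's actual database schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AssessmentItem:
    """One hypothetical item-bank record (field names are illustrative)."""
    item_id: str                 # invented identifier
    key_idea: str                # the targeted key idea, e.g. Idea B below
    stem: str                    # the question text shown to students
    choices: List[str]           # answer choices A-D, in order
    answer: int                  # index of the correct choice
    misconceptions: List[str] = field(default_factory=list)  # distractor codes

# Example record based on the "smallest" item discussed in this session
item = AssessmentItem(
    item_id="AM-001",
    key_idea="Idea B: All atoms are extremely small (BSL 4D/M1a)",
    stem="Which of the following is the smallest?",
    choices=["An atom", "A microorganism", "The width of a hair",
             "A cell in your body"],
    answer=0,
    misconceptions=["Atoms and/or molecules are similar in size to cells, "
                    "dust, or bacteria"],
)
```

Linking each distractor to a documented misconception, as the `misconceptions` field suggests, is what lets the interface trace a wrong answer back to the research on student learning.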

  15. Project 2061: Student Assessment The Project 2061 Assessment Analysis Procedure

  16. Project 2061: Student Assessment There are six parts to the analysis procedure: • Exploring the Learning Goal • Determining Content Alignment • Determining Whether the Task Accurately Reveals What Students Do or Do Not Know • Considering the Task’s Cost Effectiveness • Suggesting Revisions • Assessment Item Rating Form (not included in this version)

  17. Project 2061: Student Assessment Reviewers use the following materials: • Assessment Items • The content standard that is being targeted • Clarification statements • Lists of common student misconceptions and other ideas students may have. • Results of student interviews or field test results if available

  18. Project 2061: Student Assessment I. Exploration Phase • Determining the alignment of an assessment task to a learning goal requires a precise understanding of the meaning of the learning goal and what knowledge and skills are needed to successfully complete the task.

  19. Project 2061: Student Assessment A. The Learning Goal • Reviewers carefully read the clarification statement written for the targeted learning goal (content standard or benchmark). • Reviewers examine the list of misconceptions related to the targeted learning goal.

  20. Project 2061: Student Assessment B. The Assessment Task • Reviewers: • attempt to complete the task themselves. • list the knowledge and skill needed to successfully complete the task. • consider if there are different strategies that can be used to successfully complete the task. • consider which misconceptions might affect student answers.

  21. Project 2061: Student Assessment II. Determining the Content Alignment between the Learning Goal and the Assessment Task

  22. Project 2061: Student Assessment A. Necessity • To be content aligned, knowledge of the ideas described in the learning goal or the clarification statement, or knowledge that certain commonly held misconceptions are not true, must be needed to evaluate each of the answer choices.

  23. Project 2061: Student Assessment Reviewers are told: • If the knowledge in the learning goal is not needed to decide if the answer choices are correct or incorrect, explain how the answer choices can be evaluated using other knowledge.

  24. Project 2061: Student Assessment Applying the Necessity Criterion Which of the following is the smallest? A.  An atom B.  A bacterium C.  The width of a hair D.  A cell in your body

  25. Project 2061: Student Assessment Idea B: All atoms are extremely small (from BSL 4D/M1a). • Students are expected to know that atoms are much smaller than very small items with which they are familiar, such as dust, blood cells, plant cells, and microorganisms, all of which are made up of atoms. Students should know that the atoms are so small that many millions of them make up these small items with which they are familiar. They should know that this is true for all atoms. The comparison with very small objects can be used to test students’ qualitative understanding of the size of atoms in relation to these objects. Students will not, however, be expected to know the actual size of atoms.

  26. Project 2061: Student Assessment Applying the Necessity Criterion: The knowledge in the learning goal is needed to evaluate each answer choice.

  27. Project 2061: Student Assessment An example of an item for which the targeted knowledge is not needed: • Targeted Idea: Substances may react chemically in characteristic ways with other substances to form new substances with different characteristic properties (based on NSES 5-8B:A2a).

  28. Project 2061: Student Assessment Which of the following is an example of a chemical reaction? A.  A piece of metal hammered into a tree. B.  A pot of water being heated and the water evaporates. C.  A spoonful of salt dissolving in a glass of water. D.  An iron railing developing an orange, powdery surface after standing in air.

  29. Project 2061: Student Assessment Applying the Necessity Criterion: The knowledge in the learning goal is not needed. Answer choice D, the correct answer, is a specific instance of a general principle (SIGP). The student can get the item correct by knowing that rusting is a chemical reaction without knowing the general principle that new substances are formed that have different characteristic properties.

  30. Project 2061: Student Assessment B. Sufficiency • To be content aligned, knowledge of the ideas described in the learning goal or the clarification statement, or knowledge that certain commonly held misconceptions are not true, must be “all that is needed” to evaluate each of the answer choices. Students should not need any additional science knowledge.

  31. Project 2061: Student Assessment Reviewers are told: • If the knowledge in the learning goal is not enough to evaluate each of the answer choices, indicate what additional knowledge is needed. (Do not include as additional knowledge those things that can be assumed as general knowledge and ability of students this age.) • An example of additional knowledge might include science or mathematics terminology that students are not expected to know.

  32. Project 2061: Student Assessment Applying the Sufficiency Criterion Which of the following is the smallest? A.  An atom B.  A bacterium (clarification statement says “microorganism”) C.  The width of a hair D.  A cell in your body

  33. Project 2061: Student Assessment Applying the Sufficiency Criterion: • The sufficiency criterion is not met. Students need to know the term “bacterium,” which is additional knowledge. Although a listed misconception includes the word “bacteria,” in pilot testing, 25% of 193 students indicated that they did not know what a bacterium was (even though most knew what bacteria were). The item should say “microorganism” or “bacteria” to match the clarification statement and/or misconception list.

  34. Project 2061: Student Assessment Applying the Sufficiency Criterion Approximately how many carbon atoms placed next to each other would it take to make a line that would cross this dot: • ? A.  6 B.  600 C.  6000 D.  6,000,000 Note: This item assumes a 1 mm dot and a diameter of 1.5 Å for a carbon atom.

  35. Project 2061: Student Assessment Applying the Sufficiency Criterion • The sufficiency criterion is met. Students need to know that like the other small things mentioned in the clarification statement, e.g., dust, plant cells, blood cells, and microorganisms, this small visible dot is also made of millions of atoms. Note: This item assumes a 1 mm dot and a diameter of 1.5 Å for a carbon atom.
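The arithmetic behind the dot item is a quick order-of-magnitude check, using the values stated in the item's note:

```python
# Values taken from the item's note: a 1 mm dot and a 1.5 angstrom carbon atom
dot_width_m = 1e-3            # 1 mm expressed in meters
atom_diameter_m = 1.5e-10     # 1.5 Å expressed in meters

# Number of atoms needed side by side to span the dot
atoms_across = dot_width_m / atom_diameter_m
print(round(atoms_across / 1e6, 1))  # prints 6.7, i.e. about 6.7 million atoms
```

About 6.7 million atoms span the dot, so choice D (6,000,000) is the only option at the right order of magnitude, which is exactly the qualitative understanding the clarification statement calls for.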

  36. Project 2061: Student Assessment Idea B: All atoms are extremely small (from BSL 4D/M1a). (Not included in the workshop packet.) • Students are expected to know that atoms are much smaller than very small items with which they are familiar, such as dust, blood cells, plant cells, and microorganisms, all of which are made up of atoms. Students should know that the atoms are so small that many millions of them make up these small items with which they are familiar. They should know that this is true for all atoms. The comparison with very small objects can be used to test students’ qualitative understanding of the size of atoms in relation to these objects. Students will not, however, be expected to know the actual size of atoms [nor the order-of-magnitude relationships to other objects].

  37. Project 2061: Student Assessment III. Determining Whether the Task Accurately Reveals What Students Do and Do Not Know • It’s a validity issue. Students should choose the correct answer when they know the idea and they should choose an incorrect answer when they do not know the idea. • Getting rid of factors not related to the knowledge being measured (construct irrelevant factors) • Reducing false negatives and false positives
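The false negative/false positive distinction can be made concrete with a toy tally. The response records below are invented purely for illustration; they are not pilot-test data:

```python
# Each record pairs whether a student actually knows the targeted idea
# (e.g. from an interview) with whether they answered the item correctly.
# These six records are fabricated for illustration only.
responses = [  # (knows_idea, answered_correctly)
    (True, True),
    (True, True),
    (True, False),    # knows the idea but missed the item: false negative
    (False, False),
    (False, False),
    (False, True),    # lacks the idea but got it right: false positive
]

false_negatives = sum(1 for knows, correct in responses if knows and not correct)
false_positives = sum(1 for knows, correct in responses if not knows and correct)
print(false_negatives, false_positives)  # prints: 1 1
```

A valid item drives both counts toward zero: comprehensibility problems inflate false negatives, while test-wiseness cues and guessable distractors inflate false positives.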

  38. Project 2061: Student Assessment A. Comprehensibility 1. It is not clear what question is being asked. Explain. 2. The task uses unfamiliar general vocabulary that is not clearly defined. List potentially unfamiliar vocabulary and explain. (Note: This is referring to general language usage, not technical scientific or mathematical terminology, which is addressed under Sufficiency.) 3. The task uses unnecessarily complex sentence structure or ambiguous punctuation that makes the task difficult to comprehend when plain language could have been used. Explain. (Note: Rebecca Kopriva, C-SAVE, Maryland.)

  39. Project 2061: Student Assessment Comprehensibility, Continued: 4. The task uses words and phrases that have unclear, confusing, or ambiguous meanings. This may include commonly used words that have special meaning in the context of science. For example, the word “finding” could be unfamiliar to students when referring to a scientific “finding.” Note all places where words (both general and scientific) do not have clear and straightforward meanings. 5. There is inaccurate information (including what is in the diagrams and data tables) that may be confusing to students who have a correct understanding of the science. Explain. 6. The diagrams, graphs, and data tables may not be clear or comprehensible. (For example, they may include extraneous information, inaccurate or incomplete labeling, inappropriate size or relative size of objects, etc.) Explain. 7. Other. Provide a brief explanation.

  40. Project 2061: Student Assessment Comprehensibility: • An item with comprehensibility issues.

  41. Project 2061: Student Assessment Most sidewalks made out of concrete have [cracks] [every few yards] as shown in the diagram below.  These are called [expansion joints] as labeled in the diagram below.  What happens to the width of the cracks during a hot day in the summer and why? A.  The cracks get wider because the concrete shrinks. B.  The cracks get wider because the concrete gets softer. C.  The cracks get narrower because the concrete expands. D.  The cracks get narrower because the ground underneath the sidewalk shrinks.

  42. Project 2061: Student Assessment Most sidewalks made out of solid concrete have spaces between the sections as shown in the diagram below.  What happens to the width of the spaces during a hot day in the summer and why? A.  The spaces get wider because the concrete shrinks. B.  The spaces get narrower because the concrete expands. C.  The spaces stay the same because the concrete does not shrink or expand. D.  Some spaces get narrower and some get wider because some concrete expands and some concrete shrinks.

  43. Project 2061: Student Assessment B. Appropriateness of Task Context a. The context may be unfamiliar to most students. Explain. b. The context may advantage or disadvantage one group of students because of their interest or familiarity with the context. Explain. c. The context is complicated and not easy to understand so that students might have to spend a lot of time trying to figure out what the context means. Explain.

  44. Project 2061: Student Assessment Appropriateness of Task Context, Continued d. The information and quantities that are used are not reasonable or believable. Explain. e. The context does not accurately represent scientific or mathematical realities or, if idealizations are involved, it is not made clear to students that it is an idealized situation. Explain. f. Other. Explain.

  45. Project 2061: Student Assessment C. Resistance to Test-Wiseness 1. Some of the distractors are not plausible. Explain. 2. One of the answer choices differs in length or contains a different amount of detail from the other answer choices. Explain. 3. One of the answer choices is qualified differently from the other answer choices, using words such as “usually” or “sometimes,” or an answer choice uses different units of measurement. Explain. 4. The use of logical opposites may lead students to eliminate answer choices. Explain.

  46. Project 2061: Student Assessment Resistance to Test-Wiseness, Continued 5. One of the answer choices contains vocabulary at a different level of difficulty from the other answer choices that may make it sound more scientific. Explain. 6. The language in one of the answer choices mirrors the language in the stem. Explain. 7. There are other test-taking strategies that may be used in responding to this task. Explain.

  47. Project 2061: Student Assessment An item with test-wiseness issues: This item is targeted to Idea A from Matter and Energy Transformations in Living Systems: “Food is a source of molecules that serve as fuel and building material for all organisms.” Is the oxygen that animals breathe a kind of food? A.  Yes, because oxygen enters the body. M-A2 B.  Yes, because all animals need oxygen to survive. M-A3 C.  No, because animals do not get energy from oxygen. From clarification of Idea A. D.  No, because oxygen can enter an animal’s body through its nose. M-A1, M-A2.

  48. Project 2061: Student Assessment Misconceptions and other Ideas students may have: Matter and Energy Transformations: Idea A • Many children associate the word food with what they identify as being edible (Driver, 1984; Driver, Squires, Rushworth, & Wood-Robinson, 1994; Lee & Diong, 1999). • Students see food as substances (water, air, minerals, etc.) that organisms take [directly] in from their environment (Anderson, Sheldon, & Dubay, 1990; Simpson & Arnold, 1982). • Some students think that food is what is needed to keep animals and plants alive (Driver et al., 1994).

  49. Project 2061: Student Assessment Analyzing test-wiseness issues: Conclusion: Answer choice D (No, because oxygen can enter an animal’s body through its nose), is not a plausible explanation for why oxygen is not food. The answer choice is likely to be eliminated because of its implausibility, which is one of the factors (C1) used in assessing test-wiseness. (In pilot testing, 5 of 29 students selected this, thinking that the point of entry is what determines if something is food. Many others questioned how the nose is relevant in a question about food.) The answer choice could be improved by changing it to say that oxygen is not food because it is not edible (M-A1) or because it does not enter through an animal’s mouth.

  50. Project 2061: Student Assessment IV. Considering the Task’s Cost Effectiveness • Does the task require an inordinate amount of time to complete? Ask whether the time needed for students to read the question, make calculations, interpret a data table, or read a graph is warranted. Provide a brief explanation of why the task is not cost effective and how the same information might be elicited more efficiently.
