
Unpacking the Expectations for Classroom Assessment and Instruction


Presentation Transcript


  1. Unpacking the Expectations for Classroom Assessment and Instruction Michigan Council for the Social Studies Annual State Professional Development Conference Stan Masters Lenawee ISD February 19, 2008

  2. POP • Purpose • Analyzing the new GLCEs and HSCEs for good classroom assessment and instruction, leading to increased student achievement • Objectives • Differentiate between the purposes of assessment • Unpack expectations into targets • Match targets to methods of assessment • Develop a set of assessments for your classroom • Procedure • PowerPoint slides for presenting information • Practice with the expectations • Use of templates and protocols

  3. Keys to Quality Classroom Assessment • Clear Purposes • Clear Targets • Good Design & Methods • Sound Communication • Student Involvement

  4. Indicators of Sound Classroom Assessment Practice (p.27) Skill in gathering accurate information + Effective use of information and procedures = Sound Classroom Assessment Practice

  5. Keys to Quality Classroom Assessment • Clear Purposes • Clear Targets • Good Design & Methods • Sound Communication • Student Involvement

  6. Deepening our ideas about assessment • What is the distinction between assessment for learning and assessment of learning?

  7. Purposes of Assessments (adapted from Braveman, S. L., Ed Week, March 17, 2004) • Assessment for learning: diagnostic (given before instruction to gather information on where to start) and formative (monitors student progress during instruction) • Assessment of learning: summative (the final task at the end of a unit, a course, or a semester)

  8. Both are needed! • Students need to know…(p.34) • Where they are going • Where they are now • How to close the gap • Teachers need to find balance…(p.35-36) • to improve student achievement • to communicate to various stakeholders

  9. Seven Strategies of Assessment for Learning (p.42) • Where am I going? • Clear targets • Models of work • Where am I now? • Descriptive Feedback • Student self-assessment/goal setting • How can I close the gap? • Lessons that focus on one target at a time • Teaching self-reflection • Student record-keeping

  10. So, do your students know the targets for their learning?

  11. Keys to Quality Classroom Assessment • Clear Purposes • Clear Targets • Good Design & Methods • Sound Communication • Student Involvement

  12. Where does curriculum come from? • National content organizations' documents • State standards documents • Local curriculum is then created from these documents • Organized into units • Built around essential questions and key concepts • Aligned with state accountability assessments

  13. Backward Design Addresses All Three Parts of the Curriculum Triangle: Content, Assessment, Instruction

  14. Problems with Our Curriculum • It sits on a shelf. • We go no further than creating units, activities, and/or projects. • We rely on a textbook. • Teachers disagree on the outcomes. • There are too many outcomes.

  15. Kinds of Learning Targets Stiggins, Arter, Chappuis, and Chappuis. (2006). Classroom Assessment for Student Learning. Portland, OR: ETS. • Knowledge – The facts and concepts we want students to know and understand. • Reasoning – Students use what they know to reason and solve problems • Skills – Students use their knowledge and reasoning to act skillfully • Products – Students use their knowledge, reasoning, and skills to create a concrete product. • Dispositions – Students’ attitudes about school and learning. (p. 75)

  16. Helpful Hints to Targets (p.64) • Knowledge targets are identified in the noun/noun phrase found in the benchmark • Reasoning targets are identified in the verb/verb phrases found in the benchmark • analytical, compare/contrast, synthesis, classification, inference/deduction, evaluative (p.70) • Skill targets always have knowledge targets • Product targets have to be discerned apart from the product tasks we ask students to create • Disposition targets reflect attitudes or feelings

  17. (BUT I WANT THEM TO DEEPLY APPRECIATE THE USEFULNESS OF BAR GRAPHS) Organize data using concrete objects, pictures, tallies, tables, charts, diagrams, and graphs

  18. Practice Unpacking • Choose an outcome (benchmark/expectation) that your students will learn and you will teach in an upcoming unit of instruction. • Write the outcome at the top of your target/method planning sheet. • Complete the left-hand side of the chart. • Knowledge/understanding, reasoning, skills, products, dispositions • Check your understanding of the targets with a partner • As a group: • Dialogue about your interpretation of the identified targets • Determine and note if there are any targets that need to be added, changed, or deleted

  19. Unpacking for the Student • Targets are clearer for the student when they are put into positive “I can” statements. • They may be unpacked to include more concrete understandings.

  20. Create “I Can” Statements • Using your previous unpacked learning outcome, create “I can” statements for your students.

  21. Keys to Quality Classroom Assessment • Clear Purposes • Clear Targets • Good Design & Methods • Sound Communication • Student Involvement

  22. Assessment Study: Donegal School District, Donegal, PA (http://www2.yk.psu.edu/~jlg18/dragon/index.html) • Baseline data for 1999-2000 • Collected 661 tests/assessments during the targeted collection period • Randomly selected 20%, or 142, for a sample

  23. Findings 1. Testing of low-level cognition (understanding and comprehension levels on Bloom's Taxonomy) predominated across all types of testing at all levels (75.5%). 2. Traditional formats of multiple choice, true/false, matching, and fill-in-the-blank predominated over all other formats (80%). 3. Short-answer writing was never scored using a rubric (0%). 4. Essay formats were very rarely used (.05%) and, when used, were rarely scored with a rubric (.02%).

  24. Findings 5. Rubrics that were available were often poorly crafted, with checklist formats sometimes (33%) being represented as rubrics. 6. Problem-solving at any level above comprehension was rarely required (.04%), was never scored with a rubric (0%), and problem-solvers were rarely called upon to write to justify or explain the process or the appropriateness of the answer to the problem posed (.04%). 7. Performance items were most often score sheets for projects where students had a tangible product to be evaluated; rubrics rarely existed for such performances (.14%). 8. Performances never (0%) involved a written explanation of the process used or of anything else.

  25. Plan of Action • Professional development on assessment • Unpacked expectations for assessment • Developed a standards template for designing assessment tasks • Met in teams to analyze assessments

  26. Purposes of Assessments (adapted from Braveman, S. L., Ed Week, March 17, 2004) • Assessment for learning: diagnostic (given before instruction to gather information on where to start) and formative (monitors student progress during instruction) • Assessment of learning: summative (the final task at the end of a unit, a course, or a semester) • Ma and Pa Kettle

  27. Talking Points: Presentation by Jay McTighe, November 30, 2007, Macomb ISD • “Students should be presumed innocent of understanding until convicted by evidence.” • Prior knowledge is like the largest part of the iceberg. • “Think photo album versus snapshot” when it comes to assessment

  28. Formative Assessment Techniques (Source: Fisher, D. and Frey, N. (2007). Checking for Understanding. Alexandria, VA: ASCD, pp. 5-12) • Main points: • Aligns with enduring understandings • Allows for differentiation • Focuses on gap analysis • Leads to precise teaching

  29. Formative Assessment Techniques • Oral Language • Accountable talk, nonverbal cues, value lineups, retellings, think-pair-share, whip around • Questions • Response cards, hand signals, personal response systems, Socratic seminars • Writing • Interactive writing, read-write-pair-share, summary writing, RAFT • Tests • Multiple choice with misconceptions as distracters, short answer with word banks, true-false items with correction for the false items

  30. Methods of Assessment Stiggins, Richard J., Arter, Judith A., Chappuis, Jan, and Chappuis, Stephen. Classroom Assessment for Student Learning. Assessment Training Institute, Inc., Portland, Oregon, 2004, pp. 91-93. • Selected response • one answer is correct; sometimes taken from a list • Extended written response • constructed into sentences; criteria given for quality • Performance assessment • observed product of learning; criteria given for quality • Personal communication • interaction with student; uses checklist or criteria

  31. Organize data using concrete objects, pictures, tallies, tables, charts, diagrams, and graphs [slide shows the unpacked targets of this benchmark matched to assessment methods: selected response, extended written response, performance, and personal communication]

  32. Activity • Individually: • On the right-hand side of your target/method planning sheet, list the methods that would be the best matches for the targets you have identified.

  33. Purposes of Assessments (adapted from Braveman, S. L., Ed Week, March 17, 2004) • Assessment for learning: diagnostic (given before instruction to gather information on where to start) and formative (monitors student progress during instruction) • Assessment of learning: summative (the final task at the end of a unit, a course, or a semester)

  34. Methods of Assessment • Selected response • Extended written response • Performance assessment • Personal communication AUTHENTIC

  35. Authentic Academic Achievement • Construction of Knowledge: producing meaning from prior experiences • Disciplined Inquiry: cognitive work for in-depth understanding • Value Beyond School: meaning apart from documenting competence Newmann, Secada, and Wehlage, “A Guide to Authentic Instruction and Assessment”, 1995

  36. Seven Standards forAssessment Tasks • Organization of Information • Consideration of Alternatives • Disciplinary Content • Disciplinary Process • Elaborated Written Communication • Problem Connected to the World Beyond School • Audience Beyond the School Newmann, Secada, and Wehlage, “A Guide to Authentic Instruction and Assessment”, 1995

  37. Examples of Assessment Tasks • Students will design a poster showing the history of a major city of a U.S. region. • Students will conduct a lab experiment on states of water, recording observations of freezing and thawing points. • Students will tell about three different events in their week, identifying correctly when each occurs. • Students will collect data on the number and type of forest animals and create a graphic representation of the populations. • Students will make a PowerPoint presentation to a younger audience about a tribe of Michigan Native Americans. • Students will write a persuasive essay about a position on a current monetary or fiscal policy that addresses unemployment.

  38. Components of an Authentic Assessment Task • What “new” prompt will you use to trigger “old” learning from prior instruction? • A prompt is the stimulus material given to students at the time of assessment that activates prior knowledge relevant to the task. • While carrying out the assessment task, the student uses the prompt to produce discourse, a performance, or a tangible object. • A prompt could be presented through various media, e.g., print, auditory, or visual. • Prompts might also take various forms, e.g., reading, graphic, motion picture, recording, map, data set, etc.

  39. Example of Prompt Letter from an Immigrant Dear Marta, I hope you received my letter telling you that I am now an American citizen. We have an election for mayor in my city in one month. I will be able to vote for the first time in my life. I have learned as much as I can about the two candidates for mayor. I think that Bonnie Kalinowski is clearly my choice. I wanted to learn more about American history, so I am going to night school. I go two nights a week after work. I must stop for now. I have homework for my class! I will write again soon. Sincerely, Jacob

  40. Components of an Authentic Assessment Task • What directions will you give to the students completing the task? • The students being assessed are the audience for these directions. • These directions should be included just as they would be given to students at the time they are directed to perform the assessment task. • They should include a very clear statement of the product students are expected to generate as a result of performing the assessment task as well as the criteria that will be used to gauge the quality of student work, i.e., the scoring rubric.

  41. Example of Directions • “We have been learning about how important the right to vote is. Jacob, as a new American citizen, is certainly excited about gaining this right. He needs help, however, finding ways to take a more active part in the election. Write Jacob a letter explaining why you think it is important for him to become involved in the election campaign. Then, describe three different ways he could help Ms. Kalinowski become mayor. Make sure to explain your suggestions clearly.”

  42. Components of an Authentic Assessment Task • What procedures will you use as the teacher administering the task? • The steps to be followed by the teacher in conducting the assessment should be listed, and each step should be briefly elaborated. • These procedures should be written so that another teacher, new to the assessment task, could carry them out.

  43. Example of Procedures • Read aloud the prompt with students. Ask the students if there are any questions regarding the reading. Then, go over the directions for the assessment task and the rubric. Finally, provide time for the students to complete the extended response individually.

  44. Components of an Authentic Assessment Task • What scoring rubric will you use to evaluate the quality of the students’ task? • The assessment task should provide for individual student accountability. • The scores are cumulative; each higher score entails the criteria of the lower scores. Each higher score requires that something be added to the quality of student work not required for the next lower score. • The criteria for each score should specify “how good is good enough” for that score to be assigned.

  45. A rubric is… a set of scoring guidelines/criteria that describes a range of possible student responses for a particular assessment task. Adapted from Arter and McTighe (2001), Scoring Rubrics in the Classroom; Nolet and McLaughlin (2000), Accessing the General Curriculum.

  46. A rubric contains… • a scale that indicates the points that will be assigned to a student’s work (different levels of proficiency); and • a set of meaningful descriptors for each point on that scale. (Descriptors establish the continuum of competence along which a learner moves towards proficiency.) Rubrics are frequently accompanied by examples of products or performances illustrating the different score points for proficiency (anchor papers).

  47. Why use a rubric? • Communicate appropriate standards and expectations for students (“what will count”) • Provide feedback to students and parents • Guide and focus instruction • Promote student self-assessment and goal setting • Improve grading consistency: judgments become more objective, consistent, and accurate Stiggins, Richard J., Arter, Judith A., Chappuis, Jan, and Chappuis, Stephen. Classroom Assessment for Student Learning. Assessment Training Institute, Inc., Portland, Oregon, 2004, p. 200.

  48. Features of High-Quality Rubrics • Content: What counts? • “Look fors” (essential traits), quality over quantity • Clarity: Does everyone understand what is meant? • Practicality: Is it easy for teachers and students to use? • Technical quality/fairness: Is it reliable and valid? Stiggins, Richard J., Arter, Judith A., Chappuis, Jan, and Chappuis, Stephen. Classroom Assessment for Student Learning. Assessment Training Institute, Inc., Portland, Oregon, 2004, pp. 201 and 203.

  49. Holistic or Analytical Rubrics? • Holistic Rubric: Gives a single score or rating for the entire product or performance based on an overall impression of a student's work. Used with summative assessments and standardized tests. • Analytical Rubric: Divides a product or performance into essential traits or dimensions (“Look Fors”) so they can be judged separately. Provides a profile of strengths and weaknesses. Used with formative assessments.

  50. Example of Rubric 0 = the criteria for a score of 1 have not been met.
