Assessment Bites Workshop 2
Rubrics 101: Defining Criteria for Quality
Workshop Objectives
Participants will be able to:
• Eat lunch and talk to each other.
• Identify types and components of rubrics.
• Experiment with existing rubrics, or construct a new rubric for use in their courses or for assessment in their department or the Core.
Assessment “The aim of assessment is primarily to educate and improve student performance, not merely to audit it.” (Wiggins, 1998, p. 7)
TTYPA… Discuss any assessment activities that enhance student learning while providing the teacher with feedback about students’ progress.
Typical academic tasks vs. authentic tasks (from Wiggins, 1998, pp. 37-38):
• Make a class presentation vs. Speak to raise funds and get elected
• Write an essay vs. Get a letter to the editor published
• Take a test on percents and compounded interest vs. Choose the most effective investment plan based on several options
• Know the facts of history vs. Design an exhibit for a history museum
• Do a graphics assignment vs. Satisfy a design client
"The challenge is to design tasks that require thoughtful responsiveness, not just plugged-in knowledge and skill."
To evaluate complex tasks or competencies, we need Performance Criteria
• Guidelines by which student work (written responses, products, or performances) is judged
• What to look for to determine quality
• What to look for that reveals true understanding
• The more complex the skill, the more important it is to have clear criteria (and often the more difficult it is to define them)
Rubrics
• are assessment tools for judging complex tasks or competencies
• describe criteria for quality, define levels of performance, and make this information public
• help teachers evaluate work fairly and consistently
• help students understand what constitutes quality and assess their own work
Rubric Components
Task Description: the assignment, such as a Biology Lab Report, Oral Presentation, Poster, or Senior Thesis
Rubric Construction Practice: Developing Descriptors
Task Description: "Cleaning Your Room"
Holistic vs. Analytic Rubrics
Holistic: Assigns a level of performance by looking at multiple criteria in the aggregate. Used for making quick judgments, or for grading complex work that varies widely in form (such as a capstone project or art portfolio).
Analytic: Articulates performance on each criterion. For example, a history research paper might have criteria for organization, historical accuracy, and writing conventions.
Holistic Homework Rubric
High: Most answers correct and all work shown. Reflective synthesis of reading.
Medium: Some answers correct and some work shown. Good summary of reading, but lacking synthesis or reflection.
Low: Few or no answers correct and little or no work shown. No summary of reading, or simply quoting from text.
Task-Specific vs. Generic Rubrics
Task-Specific: Rubrics for one specific assignment. Used to spell out instructions and required elements of the task for students. Specific and relevant, but can be time consuming. Option: a staged rubric with deadlines.
Generic: Rubrics that can be applied to multiple assignments, and possibly to multiple disciplines. For example, oral presentation, senior thesis, portfolio, and critical thinking rubrics.
Rubric Construction Practice: Developing Criteria and Descriptors
Task Description: "Cleaning Your Room"
Scale Size
Rubrics may vary from 3 to 7 score points, or more for developmental rubrics. Scales can be numerical, qualitative, or a combination (e.g., 1, 2, 3 or good, better, best). Ask: "How many points are needed to describe the range of performance I am seeing in student work?"
Larger scales: Often appropriate for more complex tasks. Developmental rubrics may need larger scales. May be harder to get scorer consistency, and harder to distinguish among multiple levels.
Smaller scales: Four points are common for standards, with "3" signifying the standard. Five-point scales may be confused with A-F grades. Fewer than four points may not be enough to distinguish quality. (Arter & McTighe, 2001, p. 31.)
Getting Started with Rubrics
• Start by looking at a variety of sample rubrics.
• Adapt an existing rubric for your purposes, test it by scoring and sorting student work, then fine-tune the rubric based on use.
• Or, start from scratch by listing performance criteria and indicators and by sorting student work into groups by quality. Use this to develop a rubric to test and fine-tune.
• Involve students in the process.
Rubric Development (from Arter & McTighe, 2001)
1. Collect samples of student work for the assessment area of interest.
2. Sort work into groups (low, medium, high) and record reasons in some detail.
3. Cluster reasons into "traits" or criteria.
4. Write a value-neutral definition of each trait and scale.
5. Find examples (anchors) of student work to illustrate each scale point. Provide more than one anchor for each level.
6. Involve students.
7. Continue to refine and revise.
Assessing your own rubrics…
[ ] Does it measure the outcome it is supposed to?
[ ] Do the criteria cover important dimensions of student performance?
[ ] Is there a clear basis for assigning scores at each level?
[ ] Do descriptions fit the criteria and scales they represent?
[ ] Are the scale labels informative but not discouraging?
[ ] Can students understand it, and will it provide them with useful guidance and feedback?
[ ] Is it feasible, practical, and manageable for the assignment?
[ ] Can it be applied consistently by different scorers?
Don’t let the perfect be the enemy of the good.*
*From The Art & Science of Assessing General Education Outcomes (AAC&U).
References
Association of American Colleges and Universities (2005). The Art & Science of Assessing General Education Outcomes. Washington, DC: AAC&U.
Judith Arter and Jay McTighe (2001). Scoring Rubrics in the Classroom: Using Performance Criteria for Assessing and Improving Student Performance. Thousand Oaks, CA: Corwin Press.
Larry Ainsworth and Jan Christinson (1998). Student Generated Rubrics: An Assessment Model to Help All Students Succeed. Orangeburg, NY: Dale Seymour Publications.
Kathleen Montgomery (Winter 2002). Authentic Tasks and Rubrics: Going Beyond Traditional Assessments in College Teaching. College Teaching 50(1): 34-39.
References (Continued)
Dannelle D. Stevens and Antonia J. Levi (2005). Introduction to Rubrics. Sterling, VA: Stylus Publishing.
Barbara E. Walvoord (2004). Assessment Clear and Simple. San Francisco: Jossey-Bass.
Grant Wiggins (1998). Educative Assessment: Designing Assessments to Inform and Improve Student Performance. San Francisco: Jossey-Bass.
Grant Wiggins and Jay McTighe (1998). Understanding by Design. Alexandria, VA: Association for Supervision and Curriculum Development.