
MODULE 3


Presentation Transcript


1. MODULE 3 (title graphic: stages ordered 3rd, 2nd, 1st)

  2. The Backward Design

3. Learning Objectives • What is the purpose of doing an assessment? • How do we determine what kind of evidence to look for? • What kinds of methods can be used, and when? • How do we create assessment tasks and evaluation criteria? • How do we make sure the assessment is valid and reliable?

4. Why assess? The purpose of assessment is to measure understanding, not to generate grades! It provides professors with: • Reliable information from which to make inferences about student learning • Feedback to improve their teaching methods. It provides students with: • Feedback on how well they understand the content • Feedback to improve their learning

5. How to create assessments? Five steps: 1. Assessment Objectives 2. Evidence of Learning 3. Assessment 4. Evaluation Criteria 5. Validity and Reliability

6. Assessment Objectives (Step 1) Content can be prioritized into three levels: • Worth being familiar with (superficial knowledge) • Important to know and do • Big Ideas and Core Concepts

7. Evidence of Learning (Step 2) Evidence is related to the ABILITY to do something: EVIDENCE refers to something that can be DEMONSTRATED! Each priority level calls for a different kind of evidence:
• Worth being familiar with (superficial knowledge) → knowing concepts and definitions; micro, descriptive
• Important to know and do → ability to apply a specified framework to contexts approached in class; micro, domain-specific
• Big Ideas and Core Concepts → ability to transfer knowledge to different contexts; macro, across domains, multi-disciplinary

8. Evidence of Learning (Step 2) The levels of the revised Bloom taxonomy (2001):
• Evaluate: judge the results of applying concepts and make a decision about the quality of the application
• Create: apply concepts to situations different from the ones approached in class; create a new application or interpretation of the concepts
• Analyze: break concepts into parts and understand their relationships
• Apply: apply concepts to situations similar to the ones approached in class
• Understand: summarize ideas, explain concepts
• Remember: recall definitions

9. Assessment (Step 3) Designing the assessment tasks: when to assess, and which method to use?

10. When to assess? (Step 3) A snapshot vs. a photo album: summative assessment alone gives a single snapshot; formative + summative assessment builds a photo album.

11. Formative and Summative Assessment (Step 3) Both are necessary, at least 50% of each!
Formative: • The objective is to give students feedback • Builds learning • Students can adjust
Summative: • More focused on the grade • Comes at the end of the grading period, when there is no opportunity to adjust and show improvement
A combination of both leads to a good result!

12. Continuous Assessment (Step 3) Different moments and different methods!

13. Assessment Tasks (Step 3) Match the task type to the content priority:
• Worth being familiar with (superficial knowledge) → traditional quizzes and tests: paper-and-pencil, multiple-choice, constructed-response
• Important to know and do, Big Ideas and Core Concepts → performance tasks and projects: complex, open-ended, authentic
Adapted from "Understanding by Design", Wiggins and McTighe

14. Assessment Tasks (Step 3) Task types mapped to the levels of Bloom (2001), from most to least demanding:
• Complex performance tasks: judging the result of an analysis and making a decision; pros vs. cons, costs vs. benefits, reflection
• Authentic tasks: application to new contexts and situations; creating an artifact or project
• Analytical tasks: experiments, scenarios, simulations, cases
• Simple performance tasks: straightforward application, exercises
• Open-ended questions
• Quizzes and traditional tests: asking about definitions

15. Authentic Task (Step 3) A task that reflects possible real-world challenges. It is a performance-based assessment! An authentic task: • Is realistic and contextualized • Replicates key challenging real-life situations • Requires judgment and innovation • Asks students to "do" the subject • Assesses students' ability to integrate concepts and ideas • Gives the opportunity to practice and get feedback. It is problem-based, NOT an exercise! From "Understanding by Design", Wiggins and McTighe

16. Authentic Task vs. Exercise (Step 3)
Authentic task: • The question is "noisy" and complicated • Various approaches can be used • Requires integration of concepts and skills • Calls for an appropriate solution • The arguments are what matter • Typically out of class and summative
Exercise: • There is a right approach • There is a right solution and answer • Accuracy is what matters • Typically in class and formative
From "Understanding by Design", Wiggins and McTighe

17. How to formulate an Authentic Task? (Step 3) Use the GRASPS elements:
• Goal: What is the goal of the task? What is the problem that has to be solved?
• Role: What is the student's role? What will students be asked to do?
• Audience: Who is the audience? Who is the client? Whom do students need to convince?
• Situation: What is the situation or the context? What are the challenges involved?
• Performance
• Standards
From "Understanding by Design", Wiggins and McTighe

18. Evaluation Criteria (Step 4) Evaluation criteria must: • Provide feedback for students • Be clear • Be communicated in advance • Consist of independent variables • Focus on the central cause of performance • Focus on the understanding and use of the Big Idea

19. Types of Evaluation Criteria (Step 4) The next slides cover two forms of criteria: check lists and rubrics.

20. Check List (Step 4) There are two types of check lists: 1. A list of questions and their correct answers 2. A list of individual traits with the maximum points associated with each of them

21. Check List: Questions and answers (Step 4) This type is used for multiple-choice, true/false, etc.; in other words, wherever there is a single correct answer. The check list is simply the answer key, e.g.: A, C, D, B, B, D.
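(Aside: grading against this first type of check list is purely mechanical. Below is a minimal Python sketch using the answer key from this slide; the student responses are hypothetical.)

    # Minimal sketch: grade answers against a question/answer check list.
    # The answer key comes from the slide; the student responses are
    # hypothetical examples.
    answer_key = ["A", "C", "D", "B", "B", "D"]
    student = ["A", "C", "B", "B", "B", "D"]

    correct = sum(1 for given, key in zip(student, answer_key) if given == key)
    print(f"{correct}/{len(answer_key)} correct")  # -> 5/6 correct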

22. Check List: Traits and their value (Step 4) Overall performance is broken down into traits (Trait 1, Trait 2, ...), each with a weight (%) or a number of points. Then: Grade = weighted average of the trait scores, or Grade = sum of the points earned.
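(Aside: a minimal Python sketch of the two grading formulas above. The trait names are borrowed from the essay rubric on slide 25; the weights, scores, and point values are hypothetical.)

    # Minimal sketch of the two grading formulas from the slide.
    # Weights, scores, and point values are hypothetical examples.

    # Formula 1: grade as a weighted average of 0-10 trait scores.
    weights = {"Ideas": 0.5, "Organization": 0.3, "Grammar": 0.2}
    scores = {"Ideas": 8.0, "Organization": 6.0, "Grammar": 9.0}
    grade = sum(weights[t] * scores[t] for t in weights)
    print(round(grade, 2))  # 0.5*8.0 + 0.3*6.0 + 0.2*9.0 = 7.6

    # Formula 2: grade as a sum of the points earned per trait.
    points_earned = {"Ideas": 40, "Organization": 25, "Grammar": 18}
    print(sum(points_earned.values()))  # 83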

23. An Analytic Rubric is better (Step 4) • It provides more detailed feedback for students • It provides students with information about how they will be evaluated • It is clearer • It evaluates each characteristic that composes performance independently. On the other hand, a holistic rubric is used when only an overall impression is required.

24. Analytic Rubrics (Step 4) How to create them?

25. How to create Analytic Rubrics? (Step 4) Example: a simple rubric to evaluate an essay. The traits are Ideas, Organization, and Grammar; the levels of achievement for each trait are Excellent, Satisfactory, and Poor.

26. It can be created from a check list! (Step 4) The difference is that each trait is broken down into levels of achievement, each with a detailed description. As in the trait check list, each trait (Trait 1, Trait 2, ...) keeps its weight (%) or points, but the score now depends on the level achieved: Excellent, Acceptable, or Unacceptable. A sketch of this structure follows below.
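(Aside: a minimal Python sketch of that structure. The level descriptions and the mapping of levels to score fractions are hypothetical illustrations, not part of the module.)

    # Minimal sketch: an analytic rubric as a trait check list whose traits
    # are broken into described levels. All names and values are hypothetical.
    rubric = {
        "Ideas": {"weight": 0.5, "levels": {
            "Excellent": "Clear thesis; original, well-supported ideas.",
            "Acceptable": "Thesis present; ideas mostly supported.",
            "Unacceptable": "No clear thesis; ideas unsupported."}},
        "Organization": {"weight": 0.3, "levels": {
            "Excellent": "Logical flow with effective transitions.",
            "Acceptable": "Mostly logical flow; some abrupt transitions.",
            "Unacceptable": "No discernible structure."}},
        "Grammar": {"weight": 0.2, "levels": {
            "Excellent": "Virtually error-free.",
            "Acceptable": "Occasional errors that do not impede reading.",
            "Unacceptable": "Frequent errors that impede reading."}},
    }

    # Hypothetical mapping from achieved level to a fraction of the weight.
    LEVEL_VALUE = {"Excellent": 1.0, "Acceptable": 0.6, "Unacceptable": 0.0}

    def grade(achieved):
        """Weighted average, given the level achieved for each trait."""
        return sum(rubric[t]["weight"] * LEVEL_VALUE[lvl]
                   for t, lvl in achieved.items())

    print(round(grade({"Ideas": "Excellent", "Organization": "Acceptable",
                       "Grammar": "Excellent"}), 2))  # 0.5 + 0.18 + 0.2 = 0.88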

27. How to define traits? (Step 4) Traits can be defined based on experience or on historical data:
1. Get samples of students' previous work
2. Classify the samples into different levels (strong, middle, poor, ...) and write down the reasons
3. Cluster the reasons into traits
4. Write down the definition of each trait
5. Select, from among the samples, the ones that illustrate each trait
6. Continuously refine the traits' definitions
Traits can also be defined based on specific objectives and learning questions. From "Understanding by Design", Wiggins and McTighe

28. How to build an Analytic Rubric? (Step 4) The following website is a free tool that helps create rubrics: http://rubistar.4teachers.org/index.php

29. Validity and Reliability (Step 5) Two properties to check: validity and reliability.

30. Validity and Reliability (Step 5) Target analogy: the target is the desired understandings / objectives; the shots are the assessment outcomes. (Image: http://ccnmtl.columbia.edu/projects/qmss/images/target.gif)

31. Checking for Validity (Step 5) Self-assess the assessment tasks by asking yourself the following questions: • Is it possible for a student to do well on the assessment task, yet not really demonstrate the understandings you are after? • Is it possible for a student to do poorly, yet still have a significant understanding of the ideas? Would this student be able to show their understanding in other ways? If the answer to either question is yes, the assessment is not valid: it does not provide good evidence from which to draw inferences. (Note: for both questions, consider both the task characteristics and the rubrics used for evaluation.) Adapted from "Understanding by Design", Wiggins and McTighe

32. Checking for Validity (Step 5) The previous questions can be broken down into more detailed ones. How likely is it that a student could do well on the assessment by: • Making clever guesses based on limited understanding? • Plugging in what was learned, with accurate recall but limited understanding? • Making a good effort, with a lot of hard work, but with limited understanding? • Producing lovely products and performances, but with limited understanding? • Applying a natural ability to be articulate and intelligent, but with limited understanding? From "Understanding by Design", Wiggins and McTighe

33. Checking for Validity (Step 5) How likely is it that a student could do poorly on the assessment by: • Failing to meet performance goals despite having a deep understanding of the Big Ideas? • Failing to meet the grading criteria despite having a deep understanding of the Big Ideas? Make sure all the answers are "very unlikely"! From "Understanding by Design", Wiggins and McTighe

34. Checking for Reliability (Step 5) Assess rubric reliability by asking: • Would different professors grade the same exam similarly? • Would the same professor give the same grade if they graded the test twice, at different moments? Assess task reliability by asking: • If a student did well (or poorly) on one exam, would they do well (or poorly) on a similar exam? Task reliability can be achieved by applying continuous assessment. From "Understanding by Design", Wiggins and McTighe
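(Aside: when grades are numeric, the inter-rater question can be checked empirically. Below is a minimal Python sketch using percent agreement within a tolerance; the grades and the 1.0-point tolerance are hypothetical, and this is only one of several possible reliability measures.)

    # Minimal sketch: how often do two professors' grades for the same
    # exams agree within a tolerance? Grades and tolerance are hypothetical.
    prof_a = [8.5, 6.0, 9.0, 4.5, 7.0, 5.5]
    prof_b = [8.0, 6.5, 9.0, 6.0, 7.5, 5.0]

    TOLERANCE = 1.0  # maximum acceptable difference on a 0-10 scale

    agreements = sum(1 for a, b in zip(prof_a, prof_b) if abs(a - b) <= TOLERANCE)
    print(f"Agreement rate: {agreements / len(prof_a):.0%}")  # 5/6 -> 83%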

35. Summary Learning objectives lead to evidence of learning, which must be observable and demonstrable.

36. Summary Evidence of learning is collected over time, through formative assessment tasks followed by a summative assessment task. • Task complexity depends on the desired level of understanding • Evaluation criteria (rubrics) must be clear • Task and criteria must provide accurate and consistent judgments

37. Learning Objectives • What is the purpose of doing an assessment? • How do we determine what kind of evidence to look for? • What kinds of methods can be used, and when? • How do we create assessment tasks and evaluation criteria? • How do we make sure the assessment is valid and reliable?

38. References • The main source of information used in this module is: Wiggins, Grant and McTighe, Jay. Understanding by Design. 2nd Edition. ASCD, Virginia, 2005. • Rubrics: http://rubistar.4teachers.org/index.php
