
Module 7 Assessing learning.



Presentation Transcript


  1. Module 7 Assessing learning. TED 367 Methods in Sec. Ed.

  2. Module 7 Explain how teachers might evaluate learning using traditional and alternative methods of assessment.

  3. Reading • Read the following in the Duplass textbook: • Unit 7 (topics 37-40) Assessing Student Learning

  4. Purpose of Assessment • To determine the knowledge needs of students. • To provide learners with information on their progress. • To improve teaching, learning, and remediation. • To provide a basis for assigning grades and for making decisions about promoting students to the next grade level.

  5. Types of Assessment • Summative: Evaluate progress made at the end of a defined period of instruction. • Formative: According to PDE, classroom-based assessment that allows teachers to monitor and adjust their instructional practice to meet individual student needs. Can be formal or informal, depending on use.

  6. Types of Assessment • Diagnostic: Evaluate student strengths, weaknesses, knowledge, and skills prior to instruction. Allows the instructor to remediate students and adjust the curriculum to meet student needs. • Benchmark: Provide feedback to teachers and students about progress toward demonstrating proficiency on grade-level standards.

  7. PDE Standards Aligned System • Based on research, PDE has identified six elements that great schools and school systems have in common: • Clear Standards • Fair Assessments • Curriculum Framework • Instruction • Materials and Resources • Interventions

  8. PDE Standards Aligned System (SAS)

  9. Vary Methods • Just as excellent teachers accommodate diverse learners by using a variety of teaching strategies, so they need to provide a variety of vehicles for students to demonstrate their knowledge.

  10. Issues in Assessment • Subjectivity vs. objectivity. • Increasing concern for teachers to be more objective, even when using subjective tests. Parents want to see “evidence” of how their child is performing. • “Defensive” testing. • Too great a reliance on one form of assessment. • Halo effect: Teachers unknowingly grade some students more favorably than is warranted. • Alternative vs. traditional assessment.

  11. Feedback in Assessment • Feedback is the centerpiece of assessment because it leads to intervention and reduces learning deficiencies. • Just telling students their performance level (grade) does not necessarily lead to change and improvement.

  12. Feedback in Assessment • Rubrics • Checklists • Pretests or Pre-assessment • Practice Tests • Assessment Progress Reports • Informal Feedback • Written Assessments • Interviews • Peer Assessment • Self-Assessment • Class Review

  13. Rubrics • The use of rubrics is one of the best ways to help students succeed at tasks and master content. • Powerful: Rubrics transfer responsibility for learning to students by creating a process that focuses on students’ motivation and self-evaluation. • A rubric clearly identifies what a student needs to do: • Defines the required components of a task. • Defines the performance standards students should meet. • Lists the point values that will be used to evaluate performance.

  14. Types of Rubrics • Analytic rubrics: Articulate a level of performance for each criterion. • Holistic rubrics: Articulate a total level of performance. • Level of performance is defined by descriptors. • A checklist is often less detailed than a rubric and thus requires more imagination and self-initiative on the part of the student. • Rubrics and checklists should always be given to students before they begin the task or assignment.
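As a concrete illustration, scoring with an analytic rubric amounts to rating each criterion on its own scale and summing the points, whereas a holistic rubric assigns one overall level. A minimal sketch in Python — the criteria, point values, and student ratings below are hypothetical, not from the Duplass text:

```python
# Analytic rubric scoring: rate each criterion separately, then sum.
# A holistic rubric would instead assign one overall performance level.
# Criteria, maximum points, and ratings here are hypothetical examples.
rubric_max = {
    "content accuracy": 4,
    "organization": 4,
    "use of sources": 4,
    "mechanics": 4,
}

# One student's rating on each criterion.
ratings = {
    "content accuracy": 4,
    "organization": 3,
    "use of sources": 3,
    "mechanics": 2,
}

score = sum(ratings.values())
max_score = sum(rubric_max.values())
print(f"{score}/{max_score}")  # prints "12/16"
```

Because each criterion is reported separately, the student also sees which components (here, "mechanics") pulled the total down — the feedback function the slides emphasize.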

  15. Rubistar • Teachers can make rubrics online for free at Rubistar: http://rubistar.4teachers.org/index.php • If you create a username and password, you can save and edit rubrics. • You can search through existing rubrics made by other teachers.

  16. Alternative Assessment • Authentic Assessment • Product Assessment • Performance Assessment

  17. Authentic Assessments • Product simulates a real-world activity. • Assessment is preceded by relevant instruction on content and form. • Assessment is both an evaluation and a learning experience. • Performance/product allows for individual creativity, interests, and strengths. • Students are expected to develop the information necessary for the assessment activity with a minimum of teacher supervision. • Students progress at different rates to produce high-quality results. • Scoring of the product/performance is consistent, but it allows for individual differences and creativity. • Feedback addresses strengths and weaknesses.

  18. Product Assessment • Portfolios • Poster boards and collages • Fact-findings • Response journals • Letters • Word webs • Idea lists • Headlines • Newspaper articles • Chalkboard journals • Proposals • Ideas of the week

  19. Performance Assessment • The two most common methods used for performance assessment are: • Questioning. • Observation.

  20. Traditional Assessment

  21. Traditional Assessment • Traditional assessment refers to quizzes and tests that are often objective. • The popularity of traditional tests has a great deal to do with the ease with which they can be scored and the ease of comparing the quantitative, relatively objective scores across a class.

  22. Essay and Short Answer Exams • Advantages: They are usually easy to construct; eliminate guessing; require recall of knowledge; and support a teacher’s language arts goals by letting students organize Information Knowledge and express it in a unique form. • Disadvantages: They take more time to score; are more subjective; and usually take more time to administer than true-false tests. • Lead Phrase Approach • Mixed Vocabulary Approach

  23. Essay and Short Answer Exams • Create a rubric and a list of expected correct answers or components of answers to reduce subjectivity. • Score one question at a time for all the students, rather than all questions for one student. • Take off points for errors in spelling, handwriting, and grammar; give positive points for correct content. • Have some questions that are required of all students and some from which students can select, but do not have optional bonus questions.

  24. Fill-in-the-Blank and Completion Test Items • These typically require simple recall. • Advantages: They are usually easy to construct; eliminate guessing; require recall of knowledge; support spelling skills; and are relatively easy to grade. • Disadvantages: They take more time to administer and to score than true-false tests.

  25. Fill-in-the-Blank and Completion Test Items • Create an answer sheet of correct answers. • When testing for definitions, put the term in the question, and require students to supply the definition. • There should be no more than two blanks per question. • Take points off for spelling and handwriting errors, and give positive points for correct content. • Do not take sentences directly from the textbook.

  26. True-False Tests • May be the most widely used tests because they are so easy to construct and grade. • Advantages: They can cover a lot of material in a short period of time; are easy to score; provide quantitative comparative scores for students; and are relatively uncontestable. • Disadvantages: They encourage guessing; can be poorly phrased and confusing; tend to focus on facts, although care can be taken to create higher-level true-false questions; and because key terms must be used in the item, there is little recall.

  27. True-False Tests • Have no more than one concept in a question. • For some questions, require additional written elaboration (this is a modified true-false item). • Since most students guess “true,” make more than half of the questions false. • Avoid clues such as “all,” “never,” and “only.” • Avoid double negatives.

  28. True-False Tests • Avoid complex and compound sentences. • Avoid giving clues in the choice of grammar. • Avoid the trivial. • Do not take wording directly from the text. • Make the test an application by providing a reading, problem, diagram, graph, or image and having students answer true or false based on their Procedural Knowledge.

  29. Multiple-Choice Tests • Have a distinct advantage over true-false tests in that they require students to choose among multiple answers. • Advantages: Same as true-false; work well for concepts with closely related potentially correct answers; reduce guessing possibilities; and can be constructed to require the choice of the “best” answer as well as the only correct answer. • Disadvantages: Require little recall because key terms are included as options; time and skill are needed to construct plausible wrong answers; and they tend to focus on facts, although higher-order test items can be created.

  30. Multiple-Choice Tests • Stems (the phrases before the possible answers) should be either questions or incomplete statements. • Make sure there is only one correct answer. • All options should be plausible. • Avoid use of “all of the above” or “none of the above.” • An approximately equal number of correct answers should appear in each position (same number of A, B, C, D answers). • Four possible answers per question is a good amount. They should be the same type: all nouns, all verb phrases, etc. • Make the test an application by providing a reading, graph, diagram, or image, and have students make choices based on their Procedural Knowledge.
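The guideline above — an approximately equal number of correct answers in each position — can be checked mechanically before a test goes out. A small sketch, assuming a hypothetical 16-question answer key:

```python
# Check that correct answers are spread roughly evenly across the
# positions A-D, per the guideline above. The answer key is hypothetical.
from collections import Counter

answer_key = list("BADCABDCBADCABDC")  # 16 questions

counts = Counter(answer_key)
expected = len(answer_key) / 4  # ideal count per position

# Flag any position that is more than one answer away from the ideal.
imbalanced = {pos: n for pos, n in counts.items() if abs(n - expected) > 1}

print(dict(counts))                    # here, each letter appears 4 times
print("imbalanced:", imbalanced or "none")
```

Running a check like this on each version of a test is a quick way to catch an unintentional run of "C" answers that test-wise students would exploit.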

  31. Matching Tests • Have a problem column and a response column. They can consist of terms and definitions, causes and effects, dates or people and events, and problems and solutions. • Advantages: Same as true-false; focus students on key ideas; reduce guessing possibilities; can be constructed to require choice of the “best” answer as well as the only correct answer; and require less paper. • Disadvantages: Require no recall; and developing homogeneous columns (columns with all like items, such as names on the left and what the people did on the right) takes time and skill.

  32. Matching Tests • Place like items in one column and their matching definitions or events in the other column. • Number the left column and use letters for the right column. • Make sure there is only one correct answer. • Provide more items in the response column than in the problem column. • Keep the test to one page, with fifteen or fewer items.

  33. Matching Tests • Disperse responses throughout the list. • Arrange responses in alphabetical or some other logical order. • Keep responses short. • Make the test an application by providing a reading, graph, problem, diagram, or image, and have students answer based on their Procedural Knowledge.

  34. Alternatives to Traditional Tests • Oral Exams • Classroom Jeopardy • Pupil-Produced Tests • Take-Home Tests • Crossword Puzzles • Word Scrambles • Student Response System (Activote)

  35. Grading

  36. Establishing a Policy • At the beginning of the year, have a candid talk about ethics, cheating, and plagiarism. • Review the repercussions of cheating. • Establish a written policy and send it to students’ homes at the beginning of the year.

  37. Problems in Assessment There is substantial evidence that: • Students focus on points and grades to the detriment of learning from their mistakes. • Parents and students view the points and grades through a competitive lens. • Because teachers fear conflict with students and parents over their assessments, there is point and grade inflation.

  38. Best Practices for Grading • Define plagiarism and cheating and give concrete examples during your discussion of your grading practice and ethics. • Never sit down during a test, but circulate around the room. • Give two versions of a test. • Arrange desks so that students are lined in rows, not grouped. • Have students clear all materials from around their area during a test.

  39. Best Practices for Grading • If you observe a student cheating with materials, collect the materials during the test, but deal with the student after class. • If you observe a student cheating by looking at another student’s paper (“wandering eyes”), record the time and event, but deal with the student after class. • On assignments, make it clear whether students can collaborate or whether it is to be an exclusive work product. • When talking with a student about your suspicions, explain your observations without using the word cheating, listen carefully to the student’s response, keep the student’s welfare in mind, be prepared for excuses, explain your conclusion after hearing the explanation, and inform the student of the action you will take.

  40. Best Practices for Grading • Establish a grading scale, such as A 90–100, B 80–89, C 70–79, D 60–69, F below 60. Typically this will be prescribed by your school or school district. • Establish categories and their weights during a grading period: for example, 100 points for participation; 100 points for correct answers from questioning; 200 points for four tests; 200 points for the authentic assessment project. • Announce the scale, categories, and weights of the assessments in writing to parents and students at the beginning of the marking period.
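The weighted-category scheme described above reduces to simple point arithmetic. A sketch in Python, using the example category point values from the slide and hypothetical scores for one student:

```python
# Weighted-category grading, following the example scheme above:
# participation 100, questioning 100, four tests 200, project 200
# (600 points total). The student's earned points are hypothetical.
category_max = {
    "participation": 100,
    "questioning": 100,
    "tests": 200,
    "project": 200,
}

earned = {"participation": 92, "questioning": 81, "tests": 164, "project": 178}

total_max = sum(category_max.values())    # 600
total_earned = sum(earned.values())       # 515
percent = 100 * total_earned / total_max  # about 85.8

def letter(p):
    """Map a percentage to a letter on a standard 10-point scale."""
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if p >= cutoff:
            return grade
    return "F"

print(f"{total_earned}/{total_max} = {percent:.1f}% -> {letter(percent)}")
```

Keeping the categories and their maximum point values in one place like this makes it easy to announce the weights in writing at the start of the marking period and to apply them identically to every student.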

  41. Best Practices for Grading • Drop the lowest grade or two as an alternative to requiring makeup tests for absences. This way you do not have to create another version of the test, and you do not have to do additional scoring. • Vary the value of questions based on Bloom’s taxonomy, giving the greatest weight to higher-level abilities. • Grade all papers within two days of receipt, and return them on the third day. • Shuffle the papers before scoring to avoid bias, rather than always grading students in a certain order.
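Dropping the lowest grade or two before averaging, as suggested above, can be sketched in a few lines (the quiz scores below are hypothetical):

```python
# Drop the lowest n scores before averaging, per the makeup-test
# alternative described above. The quiz scores are hypothetical; the
# zero stands in for a quiz missed through absence.
def average_dropping_lowest(scores, n_drop=1):
    """Average the scores after discarding the n_drop lowest values."""
    if n_drop >= len(scores):
        raise ValueError("cannot drop every score")
    kept = sorted(scores)[n_drop:]
    return sum(kept) / len(kept)

quiz_scores = [88, 0, 92, 75, 84]

print(average_dropping_lowest(quiz_scores))            # drops the 0 -> 84.75
print(average_dropping_lowest(quiz_scores, n_drop=2))  # drops 0 and 75 -> 88.0
```

Note how the missed quiz simply disappears from the average, which is the point of the policy: no makeup version to write and no extra scoring to do.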

  42. Best Practices for Grading • Record the grades on the back of the paper in ink so students do not know what other students’ grades are, unless they deliberately share them. • When possible, post correct answers on the board for students to review and record on their papers. • After returning papers, go over items on which all students generally did less well than expected, and answer students’ questions about other items. • As a self-assessment, have students write reasons why questions were not answered correctly on their papers. • Require parents to sign all assessments below a C grade.

  43. Best Practices for Grading • Encourage comments from parents. • Have students maintain a file folder with all their assessments. • If you make a mistake in scoring, admit it and change the grade. Otherwise, do not change grades. • Talk with a student who has done poorly in two consecutive assessments. • If the student does poorly on the next test, call the parent: Do not wait until a conference. • Use a numeric grading scheme so that you can record grades in a spreadsheet or database that automatically compiles totals, percentages, and averages.
