
Introduction to Classroom Assessment


Presentation Transcript


  1. Introduction to Classroom Assessment:

  2. Assessment The process of collecting information about students and classrooms for the purpose of making instructional decisions.

  3. Classroom Assessment Methods • Observations – rating forms, narrative descriptions, checklists, logs and anecdotal notes • Performance samples – work products, artifacts • Tests – informal reading inventories, end-of-unit tasks, teachers’ quizzes

  4. Assessment Decision Cycle for Student Learning (a repeating cycle: Assessment → Reflection → Instructional Design → Instruction → back to Assessment)

  5. ASSESSMENT FUNCTIONS • Determine students’ needs, interests, and current knowledge/skills • Make instructional decisions • Monitor instruction to provide teacher and students with feedback and progress information • Evaluate student outcomes and performance • Accumulate a body of evidence of student achievement • Evaluate unit outcomes and overall programs

  6. Levels of assessment integrated into unit planning • Pre-Assessment: Prior to building the unit – • What do students already know and do, and what are their attitudes/dispositions? • What do students need to know, do, and be like? • Determine students’ prior knowledge and experiences • Determine students’ needs and interests • Embedded Assessments: During the unit – • Linked to each lesson plan • Determine students’ progress • Confirm or modify instructional decisions • Post-Assessment: At the end of the unit – • What do students now know and do, and what are their attitudes/dispositions? • Did students reach their targeted learner outcomes? • Did your unit work? • Where do we go from here?

  7. Formal and Informal Evaluation • Informal – students are evaluated on a daily, informal basis using observations, anecdotal notes, and checklists • Formal – students are evaluated through precise and thorough quizzes, written tests, or alternative assessments

  8. Quality of Assessments • Valid – measures what it claims to measure • Reliable – produces dependable, consistent scores for persons who take it more than once over a time period • Objective – eliminates biases, prejudgments and personal feelings

  9. Purposes of Classroom Assessment • Diagnostic – used at the outset of a unit, semester or year to identify problems and assess prior knowledge • Feedback – used during instruction to provide corrective feedback to students • Reporting – used at the end of a unit or semester to determine progress or grades and make judgments about student achievement

  10. General Principles of Teacher-Made Tests • Assess all instructional objectives • Cover all cognitive domains • Use appropriate test items • Make tests valid and reliable • Use tests to improve learning

  11. Constructing Classroom Tests • Begin with the least difficult questions • Make test items reflect instructional objectives and content taught • Watch the vocabulary of the test itself • Make it possible for everyone to demonstrate what they have learned • Make test directions clear • Place all items of the same type together • Include all the information and materials students will need for the test • Include several test items for each objective • Make more items than you will need and use the best ones • Design questions that use both high and low cognitive levels • Use tests to improve learning

  12. Types of assessment tools • Informal and student-centered measures • Selected response measures • Constructed response measures • Performance and portfolio measures

  13. Informal and student-centered measures • KWL charts • Show-what-you-know charts • Self-assessment measures such as graphs of progress toward acquisition of standards • Paper-and-pencil measures such as tickets to leave, journal entries, log entries, and the question of the day

  14. Teacher observation data recorded anecdotally (in the form of notes) or through more discrete measures (frequency counts, absences, duration, fluency) • Subject-specific assessment guided by teaching and learning (use of the problem-solving process, use of the scientific method)

  15. Selected response measures • Used to ascertain students’ mastery of larger domains of content • Measure only lower-order kinds of cognitive capabilities

  16. Types of Selected Response Items • True & False • Matching • Multiple-choice • Completion/sentence stems

  17. When To Use Types of Written Questions • Fill-in-the-Blank – when measuring students’ ability to recall factual information • Multiple-Choice – when objectively measuring either factual information or higher-level analytic skills • Matching – when measuring student recall of a fairly large amount of factual information • True/False – when the content calls for students to compare alternatives • Short answer – when measuring higher-level analytic skills • Essay – when measuring higher-level thought processes and creativity

  18. True and False • 50% chance of correct guess • Keep a balance between true and false • Avoid broad generalizations (never, always) • Use clear language and avoid using terms denoting degree (large, long time, regularly) • Avoid using negative statements • Underline the word that makes it true or false • Encourage revision of statements that are false • Keep true and false items the same length

  19. Multiple Choice • Never use “all of the above” or “none of the above” • Avoid negatively stated stems • Distribute the order of correct answers randomly • Make the wording simple and clear • Use appropriate distracters • Make sure there is only one correct answer • Make sure all distracters are plausible • Use either sentence stems or questions • Separate the stem from the possible answers • Use three to four possible responses

  20. Matching • Include no more than 10 items to be matched • Make the phrases in the descriptors list longer than the phrases in the options list • Put definitions on the left and words on the right • Make directions clear on how to match • Underline the key word (person, place, etc) • Make sure all options are plausible distracters • Specify in the directions whether options can be used more than once • Put it all on a single page

  21. Completion • Call for a single-word answer or a brief, definite statement • Supply enough context to give the item meaning • Omit insignificant words • Avoid textbook language • Provide clues, if necessary • Provide enough blanks for each word • Put the blank toward the end • Allow students to use a word or sentence bank • Provide first-letter clues

  22. Constructed response items • Elicit responses more closely approximating the kinds of behavior students must display in real life • Require students to perform

  23. Types of constructed response items • Short answer • Essay

  24. Short answer • Call for students to supply a word, phrase, or sentence in response to either a direct question or an incomplete statement • Suitable for assessing relatively simple kinds of learning outcomes such as those focused on students’ acquisition of knowledge • Students have to produce a correct answer, not just recognize it • More difficult to score

  25. Tips for short answer • Employ direct questions rather than incomplete sentences • Encourage concise responses with short blanks • Limit items to one or two blanks • Provide first-letter clues when necessary

  26. Essay • Gauges a student’s ability to synthesize, evaluate and compose • Difficult to score • A restricted-response item limits the form and content of the response • An extended-response item provides students with more latitude in responding

  27. Tips for essay construction • Make the wording of a question as clear as possible • Provide guidance on how students should use their time • Write a sample answer ahead of time and assign points to various parts of the answer • Have students justify their answers • Allow students more time, if needed • Provide sentence stems or word banks • Use holistic scoring

  28. Tips for essay scoring • Score responses holistically and/or analytically • Holistic scoring uses general criteria • Analytic scoring specifies degrees of acceptability for each criterion • Prepare a tentative scoring key in advance of judging students’ responses • Score all responses to one item before scoring responses to the next item • Evaluate items anonymously • Decide on the importance of mechanics

  29. Performance Assessments • The backbone for post-assessments of units • Designed to promote enduring understanding • Tied to real-life, authentic, functional activities • Experientially based • Age appropriate • Differentiated by content, product, and process • Clear criteria for performance

  30. Types of Performance Assessments • PERFORMANCE ASSESSMENT – Requires students to demonstrate that they can perform tasks. • AUTHENTIC ASSESSMENTS – Require students to apply and extend what they know or can do in relation to a significant and engaging problem or question about real life. • PORTFOLIOS – A purposeful collection of student work that exhibits a student’s effort and achievement over a period of time.

  31. Performance Assessment/Tasks • Written work like lab reports, book reports, research papers, journals, etc. • Oral work like class discussions, panels, debates, simulations, games, etc. • Performances like speeches, role playing, presentations of visual materials, etc.

  32. Authentic Assessments • Challenge students to: • Tackle project work regularly and frequently • Judge their own work • Collaborate and converse with others • Distinguish a real audience for their work beyond the classroom teacher • Continue their learning and development over time • Understand what it means to do better

  33. Rubrics • Specify varying levels of quality for a specific assignment • Usually used with complex, long-term, performance-based assignments or assessments • Have two features: • They specify what counts – the criteria • They illustrate gradations in the quality of work, using descriptors for strong, middling, and problematic student work.

  34. Why Rubrics? • Easy to explain • Support learning of metacognition through self-assessment, monitoring, and self-management (Goodrich, 1996) • Provide students with feedback about strengths and areas for improvement • Support development of specific skills (e.g., writing; Andrade, 1999)

  35. How do you develop a rubric? • Deconstruct the complex, final performance into subsets of skills • With students, look at models of good/poor work, work with students to determine what differentiates one from another • List the criteria (what counts), considering level of content understanding, process skills, standards, technology, format, etc. • Pack and unpack the criteria until you can formulate and create the categories to be judged (format, organization, items, etc.) • Generic form • Kid-friendly language • Articulate levels of quality (Yes, Yes but, No but, No)

  36. Sample Criterion: Briefly summarize the plot of the story • Yes, I briefly summarized the plot using significant details. • Yes, I summarized the plot, but I included some unnecessary details or left out key information. • No, I didn’t summarize the plot, but I did include some details from the story. • No, I didn’t summarize the plot.

  37. Internet-based resources for ready-made or adaptable rubrics: • http://www.teach-nology.com/web_tools/rubrics/ • http://www.idecorp.com/assessrubric.pdf • http://landmarks4schools.org/classweb/tools/rubric_builder.php3 • http://school.discovery.com/schrockguide/assess.html • http://pblchecklist.4teachers.org/ • http://rubistar.4teachers.org/index.php • http://ncsu.edu/midlink/ho.html • http://www.odyssey.on.ca/%7Eelaine.coxon/rubrics.htm • http://www.rubrics4teachers.com/

  38. Packaging the test • Group all items of similar format together • Arrange items from easy to hard • Space the items for easy reading • Keep items and options on the same page • Position illustrations near descriptions • Decide whether to use a separate answer sheet • Check test directions • Provide space for name and date
