
Presentation Transcript


  1. Please check, just in case…

  2. Announcements • We will work on the Terminology Treasure Hunt in class in two weeks. I need a few volunteers to help bring materials over to class next week. • Don’t delay in starting your standardized test review/critique. Please contact Brigid (brigidov@unm.edu) if you or your partner cannot come in on a Thursday between 2:00 and 5:00, when she is scheduled in the Ed Diag office.

  3. Quick questions, quandaries, comments or concerns?

  4. APA Tip of the Day: Quoting online sources • Include the author; if no person is listed, use the name of the organization (e.g., Council for Exceptional Children). • Include the year; look carefully, as the year is often listed at the bottom of the page in small print. • Include the page number, paragraph number, or, if neither of these is available, the section heading for the quoted text. See APA manual, 2010, pp. 171-172.

  5. APA Examples From APA manual, 2010, p. 172: • In their study, Verbunt, Pernot, and Smeets (2008) found that “the level of perceived disability in patients with fibromyalgia seemed best explained by their mental health condition and less by their physical condition” (Discussion section, para. 1). Note: Discussion is the complete section heading. • “Empirical studies have found mixed results on the efficacy of labels in educating consumers and changing consumption behavior” (Golan, Kuchler, & Krissof, 2007, “Mandatory Labeling Has Targeted,” para. 4). Note: “Mandatory Labeling Has Targeted” is NOT the complete section heading; therefore, it appears in quotation marks.

  6. Topic: Norm-referenced vs. criterion-referenced tests October 8, 2013

  7. data collection • measurement • evaluation • testing • assessment

  8. Common Purposes of Assessment • Identification of atypical learning needs (i.e., disability and/or gifted/talented). • Determination of language proficiency. • Evaluation of current academic performance. • Accountability.

  9. Prereferral Intervention This is a GENERAL EDUCATION process that should not necessarily lead to referral of the student for special education evaluation.

  10. Special Education Assessment Processes • Screening • Evaluation (initial) • Re-evaluation • Ongoing data collection (i.e., classroom-based assessment)

  11. Screening • Quick evaluation(s) in area(s) of concern. • May be administered by a specialist or classroom teacher. • Typically used to rule out the need for further assessment.

  12. Special Education Evaluation • Team Process • Parent Consent • Parent participation • Non-discriminatory: • cultural & linguistic bias • appropriate instruments • multifaceted assessment

  13. More useful purposes of assessment • What helps this student learn best? (Needed supports, scaffolding, cueing & prompting strategies.) • Patterns of language use by context. • Available supports for learning. • Patterns of progress toward specific learning goals. • Interaction of the learning environment with learning, performance, and behavior.

  14. Important! Our assessment methods MUST match our purpose.

  15. Effective assessment means using the right tool for the job.

  16. Quick Write: Why is it important for special educators to understand the language of testing if they probably won’t be administering any diagnostic assessments?

  17. Effective assessment means using the right tool for the job.

  18. Common Assessment Techniques • Standardized Assessments: • norm-referenced and/or • criterion-referenced • other (neither NR nor CR) • Non-standardized (informal) assessments: • norm-referenced • criterion-referenced • other (neither NR nor CR)

  19. Other kinds of assessments: • Developmental scales • Dynamic assessments • Language samples Additional instruments that might be used in assessment: • Family/child history • Interviews • Other documents • Observations

  20. Standardized vs. informal measures Standardized Tests: Tests that are “designed by test specialists and administered, scored, and interpreted under standard conditions.” (Linn & Gronlund, 2000, p. 44)

  21. Quick Question: • Are classroom-based assessments usually standardized? • Why or why not?

  22. Norm-referenced versus criterion-referenced

  23. Norm-referenced Tests Describe “performance in terms of the relative position held in some known group (e.g., typed better than 90 percent of the class members).” (Linn & Gronlund, 2000, p. 42) NR assessments compare individual performance against others’ performance.
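To make that comparison concrete, here is a minimal sketch of how a norm-referenced result could be computed; it is illustrative only, not part of the lecture, and all scores are invented:

```python
# Minimal sketch of a norm-referenced score: the result reports the
# student's standing RELATIVE to a comparison (norm) group.
# All scores below are hypothetical, for illustration only.

def percentile_rank(score, norm_scores):
    """Percentage of the norm group scoring at or below `score`."""
    at_or_below = sum(1 for s in norm_scores if s <= score)
    return 100 * at_or_below / len(norm_scores)

# Hypothetical typing speeds (words per minute) for a class of ten:
class_wpm = [28, 31, 35, 40, 42, 45, 47, 50, 55, 62]

print(percentile_rank(50, class_wpm))
# -> 80.0, i.e., "typed as well as or better than 80% of class members"
```

Note that the same raw score yields a different percentile with a different norm group; the result has no meaning apart from the group it is referenced to.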

  24. Quick Question: • Would norm-referenced classroom-based assessments be appropriate for students identified with special education needs? • Why or why not?

  25. Criterion-referenced Assessments Describe “the specific performance that was demonstrated.” (Linn & Gronlund, 2000, p. 42) They are used to compare individual performance against a preset standard (criterion).
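By contrast, a criterion-referenced result never looks at other students; it only asks whether a preset standard was met. A minimal sketch, with a hypothetical criterion:

```python
# Minimal sketch of a criterion-referenced result: performance is
# judged against a preset standard, not against other students.
# The criterion and scores below are hypothetical.

CRITERION_WPM = 40  # preset standard: "types at least 40 words per minute"

def meets_criterion(wpm, criterion=CRITERION_WPM):
    """Report the specific performance relative to the standard."""
    return "meets standard" if wpm >= criterion else "below standard"

for wpm in (35, 40, 52):
    print(f"{wpm} wpm: {meets_criterion(wpm)}")

# Note: 52 wpm could sit at the 99th or the 10th percentile of some
# norm group; the criterion-referenced result does not change.
```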

  26. Strengths and limitations

  27. Strengths and limitations: limitations, NOT weaknesses

  28. “There is no such thing as a ‘good’ or ‘bad’ test in the abstract, and… there is no such thing as the one ‘best’ test, even for a specific situation.” (Bachman & Palmer, 1996, p. 6)

  29. Six aspects of test quality: • Reliability • Construct validity • Authenticity • Interactiveness • Impact • Practicality

  30. Individual Activity: Individually, come up with at least three sentences using the word “reliable” and at least three sentences using the word “valid.” These sentences do not have to have anything to do with assessment or evaluation -- they can be the kinds of things you would say in ‘real life.’

  31. Definitions Reliability: 1) “Reliability refers to the results obtained with an assessment instrument and not to the instrument itself.” 2) “An estimate of reliability always refers to a particular type of consistency” (e.g., consistency over time, across raters, or across tasks).

  32. Definitions, cont. Reliability: 3) “Reliability is a necessary but not sufficient condition for validity.” 4) “Reliability is primarily statistical.” (Linn & Gronlund, 2000, pp. 108-109)
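To illustrate point 4 above: one common reliability estimate, test-retest reliability, is simply the correlation between two administrations of the same test. A minimal sketch with invented scores:

```python
# Minimal sketch of why "reliability is primarily statistical":
# test-retest reliability is estimated as the correlation between
# two administrations of the same test. All scores are invented.

from statistics import correlation  # Pearson's r; Python 3.10+

time1 = [85, 92, 78, 88, 95, 70, 82, 90]  # first administration
time2 = [83, 94, 80, 85, 93, 72, 84, 88]  # same students, weeks later

r = correlation(time1, time2)
print(f"test-retest reliability estimate: r = {r:.2f}")

# Per point 2 above, this r estimates ONE type of consistency (over
# time); it says nothing about inter-rater or cross-task consistency.
# And per point 3, a high r alone does not make an interpretation valid.
```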

  33. Definitions, cont. Validity: 1) “Validity refers to the appropriateness of the interpretation of the results of an assessment procedure for a given group of individuals, not to the procedure itself.” 2) “Validity is a matter of degree; it does not exist on an all-or-none basis.”

  34. Definitions, cont. Validity: 3) “Validity is always specific to some particular use or interpretation. No assessment is valid for all purposes.” 4) “Validity is viewed as a unitary concept based on various kinds of evidence.”

  35. Definitions, cont. Validity: 5) “Validity involves an overall evaluative judgment. It requires an evaluation of the degree to which interpretations and use of assessment results are justified by supporting evidence and in terms of the consequences of those interpretations and uses.” (Linn & Gronlund, 2000, pp. 75-76)

  36. Definitions, cont. “Validity is an evaluation of the adequacy and appropriateness of the interpretations and use of assessment results.” (Linn & Gronlund, 2000, p. 73)

  37. Cautions related to use of “validity” "Validity refers to the appropriateness of the interpretation of the results of an assessment procedure for a given group of individuals, not to the procedure itself." "Validity is a matter of degree; it does not exist on an all-or-none basis."

  38. Cautions, cont. "Validity is always specific to some particular use or interpretation. No assessment is valid for all purposes." "Validity is a unitary concept." "Validity involves an overall evaluative judgment." (Linn & Gronlund, 2000, pp. 75-76)

  39. Main Points: 1) A test must be reliable for the interpretation to be valid. 2) Reliability, in and of itself, is not enough -- the interpretation must also be valid for the individual and specified purpose of assessment. 3) Validity refers to the interpretation of the test results, not to the test.

  40. Main Points, cont.: 4) As a special educator, the most important thing you need to learn about standardized tests is how to interpret assessment results -- are they valid FOR THIS CHILD AT THIS TIME?

  41. Please take some time for the mid-semester course evaluation.
