
Self-Assessing Locally-Designed Assessments



Presentation Transcript


  1. Self-Assessing Locally-Designed Assessments Jennifer Borgioli Learner-Centered Initiatives, Ltd.

  2. Handouts qualityrubrics.pbworks.com/DATAG

  3. Organizational Focus Assessment to produce learning… and not just measure learning.

  4. “Less than 20% of teacher preparation programs contain higher level or advanced courses in psychometrics (assessment design) or instructional data analysis.” Inside Higher Education, April 2009

  5. To be assessment savvy….

  6. 1999 APA Testing Standards

  7. “The higher the stakes of an assessment’s results, the higher the expectation for the documentation supporting the assessment design and the decisions made based on the assessment results.”

  8. Performance-Based Assessments (PBAs) A performance task is an assessment that requires students to demonstrate achievement by producing an extended written or spoken answer, by engaging in group or individual activities, or by creating a specific product. (Nitko, 2001)

  9. Three Types of Measurement Error Subject effect Test effect Environmental effects

  10. Subject Effects

  11. Testing Fatigue, Test Familiarity, Bias

  12. Test Effects

  13. Final Eyes isn’t about editing; rather, it asks: “Is this what you want the students to see/read?”

  14. Test from Period 1 Test from Period 2

  15. Compare with . . .

  16. Environmental Effects

  17. Reliability = Consistency

  18. Reliability Indication of how consistently an assessment measures its intended target and the extent to which scores are relatively free of error. Low reliability means that scores cannot be trusted for decision making. Necessary but not sufficient condition to ensure validity.

  19. Three general ways to collect evidence of reliability: Stability: How consistent are the results of an assessment when given on two time-separated occasions? Alternate Form: How consistent are the results of an assessment when given in two different forms? Internal Consistency: How consistently do the test’s items function?

  20. Cronbach’s Alpha “In statistics, Cronbach’s α (alpha) is a coefficient of reliability. It is commonly used as a measure of the internal consistency or reliability of a psychometric test score for a sample of examinees. Alpha is not robust against missing data.”

  21. Item Analysis “This isn’t familiar to me”

  22. Percent of Students Selecting Choice “E”
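The kind of item analysis behind this slide is a simple tally: what percent of students chose each option? A minimal sketch with invented responses for one multiple-choice item:

```python
# Distractor analysis for one item: percent of students per answer choice.
from collections import Counter

responses = list("CEECACEECEBEEADE")  # one letter per student (invented data)
counts = Counter(responses)
n = len(responses)
for choice in "ABCDE":
    pct = 100 * counts[choice] / n
    print(f"Choice {choice}: {pct:.0f}%")
```

If a large share of students select the same wrong choice (here, 50% pick “E”), that flags a problem with the item: an ambiguous stem, a miskeyed answer, or a distractor that is arguably correct.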

  23. Validity = Accuracy

  24. How do we ensure alignment and validity in assessment? Degrees of Alignment

  25. If you want to assess your students’ ability to perform, design, apply, interpret . . . then assess them with a performance or product task that requires them to perform, design, apply, or interpret.

  26. How many? 3–5 standards in a PBA (reflected in rows in the rubric); 3–5 items per standard on a traditional test.

  27. Minimum

  28. Basic

  29. Articulated

  30. One assessment does not an assessment system make.

  31. Fairness and Bias Fair tests are accessible and enable all students to show what they know. Bias emerges when features of the assessment itself impede students’ ability to demonstrate their knowledge or skills.

  32. In 1876, General George Custer and his troops fought Lakota and Cheyenne warriors at the Battle of the Little Big Horn. If there had been a scoreboard on hand at the end of that battle, which of the following scoreboard representations would have been most accurate? Soldiers > Indians; Soldiers = Indians; Soldiers < Indians; All of the above scoreboards are equally accurate.

  33. What are other attributes of quality assessments?

  34. Standard Error of Measurement An estimate of the consistency of a student’s score if the student had retaken the test innumerable times
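A common way to estimate the SEM is SEM = SD × √(1 − reliability), which then gives a band around any observed score. A minimal sketch with invented values (the SD and reliability below are not from the presentation):

```python
# Standard error of measurement (SEM) and the score band it implies.
import math

sd = 10.0            # standard deviation of test scores (invented)
reliability = 0.91   # e.g. a Cronbach's alpha estimate (invented)
sem = sd * math.sqrt(1 - reliability)  # SEM = SD * sqrt(1 - reliability)

observed = 75
low, high = observed - sem, observed + sem  # ~68% confidence band
print(f"SEM = {sem:.1f}; score {observed} likely falls in {low:.1f}-{high:.1f}")
```

The band is the practical payoff: a student scoring 75 on this hypothetical test is best thought of as scoring somewhere around 72–78, which matters when scores sit near a cut point.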

  35. WHEN DESIGNING A PRE/POST PERFORMANCE TASK • the standards and thinking demands must stay the same. • the modality that students express their thinking through must also stay the same. • the content of the baseline and post must be different. • the rubrics for the pre/post will be the same in terms of thinking and modality, but the content dimension will be different.

  36. Jennifer Borgioli | jenniferb@lciltd.org | @datadiva
