Iowa’s Application of Rubrics to Evaluate Screening and Progress Tools John L. Hosp, PhD University of Iowa
Overview of this Webinar • Share rubrics for evaluating screening and progress tools • Describe process Iowa Department of Education used to apply rubrics
Purpose of the Review • Survey the universal screening and progress monitoring tools currently used by LEAs (local education agencies) in Iowa • Review these tools for technical adequacy • Incorporate one tool into the new state data system • Provide access to the tools for all LEAs in the state
Collaborative Effort • The National Center on Response to Intervention
Overview of the Review Process • The work group was divided into three groups • Within each group, members worked in pairs
Overview of the Review Process • Each pair: • had a copy of the materials needed to conduct the review • reviewed and scored their parts together, then swapped with the other pair in their group • Pairs within each group met only if there were discrepancies in scoring • A lead person from one of the other groups participated to mediate reconciliation • This allowed each tool to be reviewed by every work group member
Overview of the Review Process • All reviews will be completed and brought to a full work group meeting • Results will be compiled and shared • Final determinations across groups for each tool will be shared with the vetting group two weeks later • The vetting group will have one month to review the information and provide feedback to the work group
Structure and Rationale of Rubrics • Separate rubrics for universal screening and progress monitoring • Many tools reviewed for both • Different considerations • Common header and descriptive information • Different criteria for each group (a, b, c)
Universal Screening Rubric • Header on cover page
Judging the Criterion Measure • Additional Sheet for Judging the External Criterion Measure (revised 10/24/11) • An appropriate criterion measure is: • External to the screening or progress monitoring tool • A broad skill rather than a specific skill • Technically adequate for reliability • Technically adequate for validity • Validated on a broad sample that would also represent Iowa's population
Sensitivity and Specificity: Considerations and Explanations
Key: + = proficiency/mastery; − = nonproficiency/at-risk; 0 = unknown
Explanations: "True" means "in agreement between screening and outcome," so a true result can be negative to negative in terms of student performance (i.e., negative meaning at-risk or nonproficient). This could be considered either positive or negative prediction, depending on which outcome the developer intends the tool to predict. For example, a tool whose primary purpose is identifying students at risk for future failure would probably use "true positives" to mean "those students who were accurately predicted to fail the outcome test."
Sensitivity = true positives / (true positives + false negatives)
Specificity = true negatives / (true negatives + false positives)
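To make the formulas concrete, here is a minimal sketch in Python (with hypothetical data, not drawn from the Iowa review) that computes sensitivity and specificity from paired screening and outcome labels, coding 1 as the "positive" state the tool predicts (here, at-risk/nonproficient):

```python
# Minimal sketch, assuming hypothetical labels: 1 = the "positive" state the
# tool predicts (here, at-risk/nonproficient), 0 = proficient.

def sensitivity_specificity(screen, outcome):
    """Compute (sensitivity, specificity) from paired screening/outcome labels."""
    pairs = list(zip(screen, outcome))
    tp = sum(1 for s, o in pairs if s == 1 and o == 1)  # true positives
    fn = sum(1 for s, o in pairs if s == 0 and o == 1)  # false negatives
    tn = sum(1 for s, o in pairs if s == 0 and o == 0)  # true negatives
    fp = sum(1 for s, o in pairs if s == 1 and o == 0)  # false positives
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical data: 10 students screened in fall, outcome measured in spring.
screen  = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]
outcome = [1, 1, 0, 0, 0, 0, 1, 1, 0, 0]
sens, spec = sensitivity_specificity(screen, outcome)
print(f"Sensitivity = {sens:.2f}, Specificity = {spec:.2f}")
# Sensitivity = 0.75, Specificity = 0.83
```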
Consideration 1: Determine whether the developer is predicting a positive outcome (i.e., proficiency, success, mastery, at or above a criterion or cut score) from positive performance on the screening tool (i.e., at or above a benchmark, criterion, or cut score), or a negative outcome (i.e., failure, nonproficiency, below a criterion or cut score) from negative performance on the screening tool (i.e., below a benchmark, criterion, or cut score). Prediction is almost always positive to positive or negative to negative; however, in rare cases it might be positive to negative or negative to positive.
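Continuing the hypothetical sketch above (reusing its function and data), the snippet below shows why this direction of prediction matters: recoding which performance state counts as "positive" swaps which cells are true positives versus true negatives, so the sensitivity and specificity values trade places.

```python
# Continuing the hypothetical sketch above: flip the coding so that
# 1 = proficient (positive-to-positive prediction) instead of 1 = at-risk.
flipped_screen  = [1 - s for s in screen]
flipped_outcome = [1 - o for o in outcome]
sens2, spec2 = sensitivity_specificity(flipped_screen, flipped_outcome)
print(f"Sensitivity = {sens2:.2f}, Specificity = {spec2:.2f}")
# Sensitivity = 0.83, Specificity = 0.75 -- the two values have swapped,
# so a reviewer must know which direction the developer intended before
# interpreting reported sensitivity and specificity.
```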
Progress Monitoring Rubric • Header on cover page • Descriptive info on each work group's section
Findings • Many of the tools reported are not sufficient (or appropriate) for universal screening or progress monitoring • Some tools are appropriate for both • No tool (so far) is “perfect” • There are alternatives from which to choose
Live Chat • Thursday, April 26, 2012 • 2:00-3:00 EDT • Go to rti4success.org for more details