
Response to Instruction and Intervention: A Standards-aligned System for Student Success


Presentation Transcript


  1. Response to Instruction and Intervention: A Standards-aligned System for Student Success Universal Screening

  2. Training Outcomes • List the characteristics of rigorous universal screening tools. • Discuss the logistics needed for efficient universal screening. • Develop an action plan for implementing universal screening leading to the analysis of screening data.

  3. Connection to Pennsylvania’s Standards Aligned System and School Improvement Process

  4. Pennsylvania’s Commitment to Least Restrictive Environment (LRE) Recognizing that the placement decision is an Individualized Education Program (IEP) team decision, our goal for each child is to ensure IEP teams begin with the general education setting with the use of Supplementary Aids and Services before considering a more restrictive environment.

  5. Pennsylvania’s Standards-Aligned System

  6. • Multiple data sources: a balance of local and state assessments; summative assessments, formative assessments, and perceptual and demographic data; data from PSSA, 4Sight, PVAAS, and locally relevant assessments • Current student data: prioritize areas of strength and concern • Underlying causes of the current state of student achievement: guiding questions for “root cause” analysis • Potential improvement strategies: the vital few research-based or promising strategies • Student achievement improvement targets: NCLB AYP target • Action Sequence in 1-2-3-4-5 steps: Step 1: Data • Step 2: Design • Step 3: Delivery • Step 4: Development of People • Step 5: Documentation

  7. Level of Implementation Scales 1. Review the Universal Screening section of the Readiness Self-assessment. 2. Determine exactly where your school falls in implementation of universal screening.

  8. Tier 1 Foundation: Standards-Aligned Instruction for All Students • High quality, effective instruction in the general education curriculum • Data Analysis Teaming • Universal Screening • Progress Monitoring • 4Sight Benchmark Assessments • Clear and high expectations for student learning and behavior • Support to enhance student engagement and to promote school completion

  9. Assessments • Screening • Diagnostic • Progress Monitoring • Outcome Measures

  10. Universal Screening • Assists in identifying grade-wide deficits in curriculum and instruction. • Provides a baseline for grade-wide goal setting. • Identifies students at risk of academic or behavioral difficulties. • Can generate local norms and benchmarks.
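
One practical note on the last bullet: a local norm is just the distribution of your own students’ screening scores, so it can be computed directly from the scored data file. The short Python sketch below is only an illustration of the idea; the student IDs and raw scores are invented, and a screening database or vendor report would normally do this for you.

```python
# Hypothetical sketch: derive local norms (percentile ranks) from one grade's
# fall screening scores. Student IDs and raw scores are invented.
scores = {"S01": 12, "S02": 35, "S03": 48, "S04": 22, "S05": 55,
          "S06": 18, "S07": 40, "S08": 29, "S09": 61, "S10": 33}

ranked = sorted(scores.values())

def percentile_rank(score):
    # Share of screened students scoring at or below this score.
    at_or_below = sum(1 for s in ranked if s <= score)
    return round(100 * at_or_below / len(ranked))

local_norms = {sid: percentile_rank(s) for sid, s in scores.items()}
for sid, pr in sorted(local_norms.items(), key=lambda kv: kv[1]):
    print(f"{sid}: raw score {scores[sid]:>3}, local percentile {pr:>3}")
```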

  11. Universal Screening Expectations • Conduct universal screening of critical academic skills in reading and math at all grade levels • Use screening practices that are predictive of performance on standards, efficiently administered, and sensitive to growth • Conduct universal screening of behavior at all grade levels

  12. Universal Screening Expectations • Screenings conducted on all students three times per year • School maintains results of screening in a database • School produces user-friendly summaries of screening data: • A graph is completed to display data for analysis and decision-making and to indicate percentage of students at-risk, at some risk, and at low risk
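
To make the “user-friendly summary” concrete, the sketch below bins each student’s score against two cut points and charts the percentage of students at risk, at some risk, and at low risk. The cut scores, sample data, and use of matplotlib are assumptions for illustration; actual cut points come from the screening tool’s published benchmarks.

```python
# Hypothetical sketch: summarize one screening window as the percentage of
# students at risk, at some risk, and at low risk. Scores and cut points
# are invented; real cut points come from the screening tool's benchmarks.
import matplotlib.pyplot as plt

scores = [12, 35, 48, 22, 55, 18, 40, 29, 61, 33]   # e.g., words correct per minute
AT_RISK_MAX, SOME_RISK_MAX = 25, 44                  # assumed cut scores

def risk_category(score):
    if score <= AT_RISK_MAX:
        return "At risk"
    if score <= SOME_RISK_MAX:
        return "Some risk"
    return "Low risk"

categories = ["At risk", "Some risk", "Low risk"]
counts = {c: 0 for c in categories}
for s in scores:
    counts[risk_category(s)] += 1
percents = [100 * counts[c] / len(scores) for c in categories]

plt.bar(categories, percents)
plt.ylabel("Percent of students screened")
plt.title("Fall screening: risk-level summary")
plt.savefig("screening_summary.png")
```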

  13. Universal?? • Universal = all students should be screened for academic and behavioral difficulties. • “All” for academics = all students who are targeted for PSSA assessment. Includes K-3 and 9-11 in reading and math. Students eligible for PASA need not be screened. • “All” for behavior = all students.

  14. Screening?? • Screening: • must be valid, reliable, and based on scientifically based research. • is a brief procedure designed as a first step in identifying children at high risk for academic failure. • Reading First Statute

  15. Why Screen?? First graders in the bottom quartile in reading have an 88% likelihood of placing in the bottom quartile in 4th grade and a 78% likelihood of remaining there through 8th grade. Juel 1988

  16. Choices • Early Intervening: actively seek out students at risk of difficulty and intervene immediately, before long-term failure and the need for intensive supports. OR • Wait for long-term failure, greatly increasing the need for intensive interventions, including special education.

  17. Characteristics of a Quality Screening Instrument • Must be brief and easily administered. • Must be research-based • Must be highly correlated to skills assessed • Must have benchmarks or be predictive of future performance • Must have high reliability and validity. • Must be sensitive to small increments of change • National Center for Progress Monitoring

  18. Characteristics (2) • Alternate forms available • Screening • Progress Monitoring • Rates of improvement specified • Data analysis and reporting available • Leads to teacher or student change

  19. Outcome Measures: Characteristics • They are simple, accurate, and reasonably inexpensive in terms of time and materials. • They are so important, they become routine. • They are collected on an ongoing and frequent basis. • They shape/inform a variety of important decisions. • edformation

  20. Outcome Measures (2) • Standardized, reliable, and valid • Index growth in the general curriculum over time and across a wide range of skills • May or may not directly measure the curriculum of instruction • Do indicate when instructional modifications are needed • Do not specify which instructional modifications to make

  21. Screening Selection Worksheet • Areas screened • Reading • Math • Behavior • Frequency • Minimum of 3 times per year

  22. Reliability/Validity • Best practice: third-party analysis of the screening tool for reliability, validity, and research base (www.studentprogress.org) • Vendor research reports should be viewed carefully, especially those supporting new products or new test types.

  23. Training • Training in administration • On-line, videotape, live trainers? • Practice? • Scoring and uploading? • Fidelity checks? • Use of database? • Interpretation of reports • Does the staff need additional training in the scope and sequence or in the instructional base being tested?

  24. Time Administration time: • Individually administered tests • Number of subtests required per grade (Grade 4 may have 1 but K may have 4) • Length of subtest (1 minute reading or 3 one minute readings?) • Group administered tests • Reading Maze, math CBMs (3-45 minutes per class per subject)

  25. Time Additional considerations • Scoring time • Open-ended questions=Rubric Scoring • Number of items • Teacher vs. para-educator scoring • Entry of scores into database • Hand entry • Scanning • Other electronic options

  26. Data Report Concerns • Reports by student, school, district, class, subject, eligible content, sub-group, and standard • Reports across grades, across time, following cohorts, and by assessed skill • Ability to generate novel queries • Turnaround time • User friendliness
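
As a hedged illustration of what “ability to generate novel queries” can look like in practice, the pandas sketch below slices one small, invented screening table by grade and season and follows a cohort’s growth from fall to winter. Column names and values are assumptions, not the layout of any particular vendor’s export.

```python
# Hypothetical sketch: ad hoc queries over a flat screening-results table.
# Column names and values are invented; a vendor export will differ.
import pandas as pd

df = pd.DataFrame({
    "student": ["S01", "S01", "S02", "S02", "S03", "S03"],
    "grade":   [1, 1, 1, 1, 2, 2],
    "season":  ["fall", "winter", "fall", "winter", "fall", "winter"],
    "skill":   ["ORF"] * 6,
    "score":   [18, 31, 42, 55, 60, 74],
})

# Median score by grade and season (across grades, across time).
print(df.groupby(["grade", "season"])["score"].median())

# Fall-to-winter growth per student (following a cohort).
wide = df.pivot_table(index="student", columns="season", values="score")
print((wide["winter"] - wide["fall"]).rename("fall_to_winter_growth"))
```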

  27. Alternate Forms • Benchmark • Minimum of three • Progress Monitoring • 20-30 for frequent monitoring of students at risk of deficit in tested skill area.

  28. Benchmarks • Benchmarks reflect proficiency at one point in time and are predictive of a student’s performance on the next benchmark. • A student reading 40 words correct per minute in the spring of 1st grade is likely to hit the 2nd grade spring benchmark of 90 wcpm. • Look for benchmarks in major skill areas
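
The benchmark comparison in the slide can be expressed as a simple lookup against a table of cut scores. In the sketch below, only the grade 1 (40 wcpm) and grade 2 (90 wcpm) spring values come from the slide; any other entries would come from the published benchmarks of the screening tool you select.

```python
# Minimal sketch: check a student's oral reading fluency against a spring
# benchmark. Only the 40 and 90 wcpm values come from the presentation;
# a real table would use the chosen screening tool's published benchmarks.
SPRING_ORF_BENCHMARKS = {1: 40, 2: 90}   # grade -> words correct per minute

def met_benchmark(grade, wcpm):
    target = SPRING_ORF_BENCHMARKS[grade]
    return wcpm >= target, target

met, target = met_benchmark(1, 42)
print(f"Grade 1 spring: 42 wcpm vs. benchmark {target} -> "
      f"{'on track' if met else 'at risk'}")
```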

  29. Cost • Materials • Training (Time and materials) • Administration • Substitutes, etc. • Scoring • Data entry • Hardware/software costs (Scanner?) • Fee for use of database • Preparation of data packets • Recurring costs

  30. Frequently Used in PA • Dynamic Indicators of Basic Early Literacy Skills • DIBELS www.dibels.uoregon.edu • AIMSweb • www.AIMSweb.com • 4Sight Benchmark Assessments • www.successforall.net • Monitoring Basic Skills Progress • MBSP-www.proedinc.com • School-wide Information Systems • SWIS-www.swis.org

  31. Let’s Look! National Center on Student Progress Monitoring www.studentprogress.org ‘Tools’

  32. Screening Options for Behavior: Elementary – Middle School Informal • Observations • Teacher Nomination List • Based on a standardized checklist • Discipline referrals/actions • Attendance records

  33. Screening Options for Behavior: Elementary – Middle School Formal • www.casel.org [Resource] • Promoting Alternative Thinking Strategies (Greenberg et al.) • I Can Problem Solve (Shure) • Project ACHIEVE (Knoff) • Stop/Think Social Skills Curriculum

  34. Selection to Implementation: Plan, Plan, Plan! • Planning begins with the thoughtful selection of the screening tools. • Time spent in assessment must be offset by the instructional value of the data obtained.

  35. Screening Planning Worksheet Selecting and Purchasing • Define selection process? • Who is involved? • By when will the decision be made and materials available? • Remember recurring costs!

  36. Staff Training • Administration of measures • When, where, by whom? Teachers, Paras? • Interpretation of measures? • When, where, by whom? • Training in data analysis teaming? • When, where, by whom? • Remember the possible need for unanticipated training due to instructional or curricular issues identified during screening.

  37. Material Preparation Who? When? How? • Download • Copy • Collate • Color Code • Demographics (Name, etc) • Distribute • Technology Preparation

  38. Screening • Determine and notify staff of all timelines for the three administrations of the measures. • Administering • Scoring • Data entry • Follow-up data analysis team meetings • Follow-ups to the follow-up team meetings

  39. Screening-WHO? All approaches may be utilized depending on number of students, classes, grades, and number of subtests required. • Classroom or grade approach • ‘SWAT’ Team Approach • Modified ‘SWAT’ Team Approach

  40. Classroom or Grade Approach • The teacher or teachers and their support staff screen all students in a classroom or grade.

  41. Classroom/Grade Approach • Advantages: Teachers test their own students. Students are tested within the classroom routine. Excellent for group administered tests (MBSP, Maze, 4Sight, etc.). • Disadvantages: Time consuming for individually administered subtests (ORF, PSF, etc.). Requires materials for all teachers. Logistically difficult for school-wide scoring and data entry.

  42. ‘SWAT’ Team Approach • A large team of administrators, unassigned teachers, support staff, and/or trained volunteers moves through the building screening all students.

  43. ‘SWAT’ Team Approach • Advantages: Data collected quickly and efficiently. Minimal disruption in classroom routine. Fewer materials needed. May facilitate scoring and data entry. • Disadvantages: Teachers do not test their own students, which may lead to a lack of ownership of data and data analysis. Support services may be disrupted (SLP, Title, etc.).

  44. Modified ‘SWAT’ Team Approach • A substitute or substitutes are added to the ‘SWAT’ team to run the classroom while the students are screened. The classroom teacher then participates in screening his/her students.

  45. Modified ‘SWAT’ Team Approach • Advantages: Teachers screen some of their own students. Increases teachers’ understanding and ownership of screening data. Less disruptive to other classes. • Disadvantages: Pre-planning is required by the classroom teacher. Disruptive of the classroom routine. May require more time than other options.

  46. Fidelity Checks When? By whom? Paramount to ensure reliability and validity of screening results. Screenings must be administered exactly as specified. Fidelity checks are often provided in commercially available products.

  47. Post Screening-Scoring Who? By When? • Hand scoring or rubric scoring of open-ended questions requires considerable additional time.

  48. Database Entry • Who? • Teacher, support staff, etc. • How? • Hand entry • SCAN • On-line • When?
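
For the “other electronic options” bullet, the sketch below shows one way scored results could be loaded into a small local database. The table layout and sample rows are assumptions for illustration; most schools will instead use the database supplied with their chosen screening tool.

```python
# Hypothetical sketch: load scored screening results into a local SQLite file.
# Table layout and sample rows are invented; a real import would read a vendor
# or scanner export rather than a hand-typed list.
import sqlite3

rows = [
    ("S01", 1, "fall", "ORF", 18),
    ("S02", 1, "fall", "ORF", 42),
    ("S03", 2, "fall", "ORF", 60),
]

conn = sqlite3.connect("screening.db")
conn.execute("""CREATE TABLE IF NOT EXISTS screening
                (student_id TEXT, grade INTEGER, season TEXT,
                 skill TEXT, score INTEGER)""")
conn.executemany("INSERT INTO screening VALUES (?, ?, ?, ?, ?)", rows)
conn.commit()
conn.close()
```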
