
Response to Intervention…. More Than Data Points The Diagnostician’s Role in the Process

Presentation Transcript


  1. Response to Intervention… More Than Data Points: The Diagnostician’s Role in the Process. Andrea Ogonosky, Ph.D., LSSP, NCSP, Licensed Psychologist. ESC 4 Summer Assessment Institute. aogonosky@msn.com, (832) 656-0398

  2. Agenda • Technical Adequacy of Process • Team Membership/Leadership • Multiple Sources of Data • Staff Knowledge

  3. Technical Adequacy: The District Guidance Document

  4. RtI: Problem Solving (tiered interventions and assessment) • Tier 1 (80%): Grade-level instruction/support; universal screening • Tier 2 (15%): Supplemental interventions at the student’s instructional level; 90 additional minutes per week; progress monitoring; diagnostics • Tier 3 (5%): Supplemental interventions at the student’s instructional level; 120 additional minutes per week; progress monitoring; diagnostics

  5. Pair and Share • Have you ever felt like this? Why? • What has been your greatest challenge with your district RtI process? • On a scale of 1–5, where is your district in implementing a true problem-solving process centered on Tier 1? • Do you continue to hear staff refer to RtI as a referral process or a documentation journey on the road to special education? • When are you called in to consult? • Are you a valued member of a campus or district team?

  6. RtI Foundations for Success • Multiple Tiers of Instruction and Assessment • Using Data: Balanced Assessments • Technology • Highly Qualified Staff

  7. The strongest processes that show sustained student growth are those that go beyond technical adequacy…. They are the ones that promote cultural responsiveness to the learning needs of all students (think Tier 1: 80%) and are not dependent on a rote “decision rule” of six points on a graph.

  8. Let’s start at the beginning…. RtI is not simply implementing a different type of problem solving. It also involves giving up certain beliefs in favor of others. Systems will need to change….

  9. Team Membership • Actively seek to serve as an “ad hoc” member of the team. • Enthusiastically volunteer information to aid Tier 1 differentiation of instruction. • Emphasize the value of having you consult well before a referral is initiated. • Offer to aid in the development of progress monitoring tools. • Provide mini skill lessons on understanding various aspects of assessment.

  10. Leadership The road to student success begins here….

  11. Critical Leadership in RtI

  12. Strong Administrators • Ensure fidelity by having meaningful conversations with staff about data. • Create a culture of common values and work together to achieve common goals. • Provide clear expectations for staff. • Creatively allocate limited resources to ensure personnel have access to necessary supports.

  13. Essential Tasks for Both Gen Ed and SPED Team

  14. Campus Culture

  15. Variables Affecting Culture • Resiliency: Over 40% of teachers do not make it to their 5th year of teaching; many leave by year 3. • Encouragement of innovation: PD to support advances in technology; teachers are reinforced and encouraged for “thinking outside the box.” • Quality of student-teacher relationships

  16. The most important aspect of a strong RtI process is the richness of the conversations that occur because of the layers of multiple, concurrent data sources.

  17. Question Are you an active participant in PLC meetings?

  18. It is essential to implement both Professional Learning Communities (PLCs) and Response to Intervention (RtI) because these complementary processes are considered research-based best practices to improve student learning.

  19. Connections • What exactly do we expect all students to learn? • How will we know if they’ve learned it? • How will we respond when some students don’t learn it? • How will we respond when some students have already learned? • Core program • Standards • Alignment Documents

  20. Connections • What exactly do we expect all students to learn? • How will we know if they’ve learned it? • How will we respond when some students don’t learn it? • How will we respond when some students have already learned? • Progress monitoring • Universal screener • Diagnostic assessments • Formative Assessments

  21. Connections • What exactly do we expect all students to learn? • How will we know if they’ve learned it? • How will we respond when some students don’t learn it? • How will we respond when some students have already learned? • Differentiated Strategies • Interventions • Decision rules • Protocol

  22. Connections • What exactly do we expect all students to learn? • How will we know if they’ve learned it? • How will we respond when some students don’t learn it? • How will we respond when some students have already learned? • District Expectations • Decision rules • Protocol

  23. Underscoring a Problem “Most teachers just do not possess the skills to collect data, draw meaningful conclusions, focus instruction, and appropriately follow up to assess results. That is not the set of skills we were hired to do.” How can you help to ensure fidelity of data?

  24. Balancing Assessments • Assessment systems • Multiple measures • Varied types • Varied purposes • Varied data sets • Balanced with needs

  25. Align Data Sources • Does the data tell a clear and concise story of the student’s learning? • If there is inconsistency, the team must investigate why: review integrity of instruction, align to student needs, consider student variables. • Data sources: universal screening, progress monitoring, diagnostic assessments, outcome assessments

  26. You must have multiple sources of data to have effective data-driven instruction. With that said, assessing students while they are learning yields real-time data to steer teachers toward differentiated practices.

  27. Assessment and Instruction are inseparable. “Assessment is today’s means of understanding how to modify tomorrow’s instruction.” Carol Tomlinson

  28. Components Addressed When Using Multiple Data Sources • The interrelationship between classroom achievement and cognitive processing criteria • Classroom achievement • Academic Deficit (RtI) • Cognitive Processing • Behavior

  29. Data to Consider

  30. Problem Identification Is the Tier 1 Core curriculum effective? (District Data) • The percentage of students (aggregated or sub-groups) meeting proficiency on the state standards as measured by the statewide assessment. • Universal Screening Trends

  31. Characteristics of a Strong Data Team • Process of collecting meaningful data • Culture of collaboration • There is a process to measure where students are in the curriculum. • There is an RtI plan in the school district to help students who are not achieving or who are excelling.

  32. Problem Identification • Review existing information • Determine student’s functional level • Identify initial concerns • Analyze multiple data sources • Operationally define the problem

  33. Problem Identification • School level: The percentage of students who are at benchmark on the fall, winter and spring screening assessment is not increasing. • Who are the students? • Do the data suggest a sub-group? • Has their risk level increased (benchmark to strategic or strategic to intensive)? • Is a clear pattern of skill deficits evident?
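The school-level check above (is the percentage of students at benchmark increasing from fall to winter to spring?) can be sketched as a simple calculation. This is a minimal illustration, not part of any screening tool: the scores, cut scores, and helper name `percent_at_benchmark` are invented for the example.

```python
# Hypothetical sketch of the school-level check: is the percentage of
# students at benchmark rising across fall, winter, and spring windows?
# All scores and benchmark cut scores below are invented for illustration.

def percent_at_benchmark(scores, cut_score):
    """Share (%) of screened students at or above the benchmark cut score."""
    return 100.0 * sum(s >= cut_score for s in scores) / len(scores)

# Each window: (screening scores for the same cohort, benchmark cut score).
windows = {
    "fall":   ([38, 52, 61, 45, 70, 33, 58, 49], 40),
    "winter": ([41, 55, 60, 44, 72, 35, 57, 50], 48),
    "spring": ([45, 58, 63, 47, 75, 38, 59, 52], 55),
}

rates = {w: percent_at_benchmark(scores, cut) for w, (scores, cut) in windows.items()}
for window, rate in rates.items():
    print(f"{window}: {rate:.0f}% at benchmark")

# The slide's warning sign: the percentage is NOT increasing window to window.
increasing = rates["fall"] < rates["winter"] < rates["spring"]
print("Percent at benchmark is increasing:", increasing)
```

When the trend is flat or falling, the slide's follow-up questions apply: who are the students, is a sub-group involved, and has their risk level moved from benchmark to strategic or strategic to intensive?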

  34. Problem Identification • Grade level: Students in certain grades are not making adequate progress. • Has the staff been provided adequate professional development and training on the curriculum? • Has fidelity of implementation been addressed? • Can root causes be identified? • Class level: Instructional groups are not making growth at the expected rate. • Are the interventions matched to student needs?

  35. Existing Data Review • Determine the student’s current classroom status: academic progress and work samples • Teacher describes and quantifies concerns • Review of records • Parent contact(s) • Medical information • Classroom observations (ICEL)

  36. Problem Identification • Student level: The student is not making the same amount of progress as other students in the instructional group. • What skills has the student not mastered? • Has a diagnostic assessment been administered?

  37. The two most common reasons for a lower-than-expected rate of student progress are: • A mismatch between instruction and learner needs • Poor fidelity of implementation

  38. Determine Student Functional Levels • Identify assets and weaknesses • Identify Critical Life Events, Milestones, Circumstances (Positive and Negative) • Identify medical and/or physiological sources of concern • Identify academic variables such as “speed of acquisition” or retention of information • Identify issues of attendance, transitions, motivation, access to instruction

  39. Professional Judgment: Interpretation Issues

  40. Suspected Disability? ID: what to look for in the data: • Screening: below the cut score across the board • Diagnostics: focused skill deficits and patterns across many areas (mostly a pattern of weaknesses) • Progress Monitoring: ROI would be slow and possibly show a downward trend; not variable; a slope is evident (not flat-line) • Outcome: pervasive STAAR failure; unit and district assessments in the bottom percentile

  41. Suspected ID/Slower Cognitive Processing: • I: Student instructional level significantly below grade level; manipulatives and graphic organizers often needed; slow (not variable) progress, well below grade-level expectations. • C: Curricular mismatch is evident across academic areas. • E: Student performs best in an environment that is highly structured and highly organized, with rules posted; a high degree of task analysis is needed. • L: Student demonstrates adaptive skill weaknesses, difficulty using learning strategies independently, and social skill weaknesses.

  42. Reminder (ID) • Children with ID will not likely display a flat cognitive profile on comprehensive assessments of cognitive abilities • ID is usually evident when data indicate one (or more) impaired cognitive abilities with high centrality that lower the functioning of the whole system • As a group, students identified with ID have lower scores on all CHC factors

  43. Suspected Disability: SLD. What to look for in the data: • Data that show appropriate instruction and data-based documentation of progress in some academic areas • Does not achieve adequately for age or meet state-approved grade-level standards • Does not make sufficient progress …response to scientific, research-based intervention… • Screening: district cut score on the universal screener (should be above it in some areas) • Diagnostics: reading, math, writing • Progress Monitoring: grades, formative assessments, unit tests, district common assessments, RtI CBMs (ROI); variable data results, yet at grade-level expectations in some areas • Outcome: summative assessments, report card grades, STAAR; review objectives met/not met

  44. Suspected SLD: • I: Grade level in some areas, below grade level in others • C: Differentiated strategies based upon learning style will vary depending on the academic area • E: Student displays differing degrees of AE based upon content and delivery; performs better in a small group with instruction aligned to learning preferences • L: Most often demonstrates increased off-task behavior in areas of weakness; family history may include learning problems; medical history positive for certain “red flags”; developmental history positive for specific deficits in skill acquisition.

  45. Professional Judgment: Test Selection Based Upon Multiple Sources. “Pick the battery that best fit the student and the referral concern” (Misak, 2013). Focus the selection of narrow-ability measures on data from tiered instruction on specific skill deficits. Do you have enough fidelity to do this? Does the RtI team give you enough data? What is sufficient for ROI data points? Norms? Comparison to peers?
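One common way to quantify ROI from progress-monitoring probes is a least-squares slope over weekly CBM scores, compared against an expected rate of growth. The sketch below is a hypothetical illustration only: the probe scores and the expected-growth value are invented, and `roi_slope` is a made-up helper, not part of any named assessment tool.

```python
# Hypothetical sketch: estimating a student's rate of improvement (ROI)
# from weekly CBM progress-monitoring scores via a least-squares slope.
# Probe scores and the expected-growth norm below are invented.

def roi_slope(scores):
    """Least-squares slope of scores over equally spaced probes (growth per week)."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Eight weekly oral-reading-fluency probes (words correct per minute).
probes = [42, 44, 43, 46, 45, 48, 47, 50]

student_roi = roi_slope(probes)   # observed weekly growth
expected_roi = 1.5                # assumed norm/aimline growth for illustration

print(f"Observed ROI: {student_roi:.2f} wcpm/week")
print("Adequate progress:", student_roi >= expected_roi)
```

A slope like this is only one data point in the professional-judgment questions on the slide: how many probes are sufficient, which norms apply, and how the student compares to peers all still require team-level interpretation.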

  46. Reminder • All G’s are involved in all learning; what is required for learning determines the involvement of each and will differ. • Some G’s (Gc, Gf) affect learning across all academic areas. • Within each G, specific narrow abilities are more directly related to specific academic skills; these narrow abilities need to be measured for LD patterns.

  47. FIE Test Selection • Review RIOT/ICEL and all RtI data; determine the reason for referral • Carefully select measures; watch for variance • Do not use too many measures • Measure the appropriate narrow abilities • May also need to measure constructs such as executive function, orthographic processing, etc. • Select a core battery and the relevant tests to give, then supplement appropriately

  48. Converge Data • Professional Judgment • Recommendations

  49. FIE Language Reason for Referral Student was referred for a comprehensive Full and Individual Evaluation by the campus RTI committee. Student has participated in Tiers 1, 2 and 3 intensive instruction and intervention in the area of basic reading skills and comprehension and continues to evidence poor progress within grade level and instructional level curriculum.

  50. Achievement Data • In addition to reporting your review of assessment data, include such data as: • Universal screening: “Student participated in district-wide screening on Aimsweb; BOY scores indicate….” Or scan and import the data. • Scan and/or report progress monitoring data.
