
Academic Assessment & Instruction/Interventions


Presentation Transcript


  1. Academic Assessment & Instruction/Interventions Utah Coaching Network

  2. Reconnect • Brief Review – Yesterday’s Outcomes • Partner share – One “Big Idea” • Group Responses • Today’s Objectives

  3. Review – Who We Are • Expertise – 538 years of experience • Bright Spots – successful efforts worth emulating • Positive Deviants – people who constantly expand their vision and are always looking for a better way • Change because you see the light, not because you feel the heat.

  4. Review – Coaching • Coaching is . . . • Prompts • Corrections • FACTS • F – Feedback (I, We, You) • A – Adjusted • C – Context • T – Time (ongoing) • S – Student focused

  5. Review – Observation Tools • Basic Five • Ratio of Interactions (see CHAMPS, pp. 212–214, 253) • Opportunities to Respond (OTRs) • Error Correction • Disruptions • Academic Engagement • Instructional Routines • Data Summary • Self-Evaluation

  6. CHAMPS Review • What does STOIC stand for? • What does CHAMPS stand for? • Focus on 1)______________ • Focus on 2) ______________ • Each chapter starts with a list of T_________ • Each chapter ends with a S_____ __________

  7. Objectives • Consider integration of behavior & academics • Learn levels & purposes of assessment • Increase knowledge and skills in formative assessment & intervention practices • CBM administration • Diagnostic assessment • Intervention selection & implementation • Progress monitoring • Data management tools • Consider coaching practices

  8. Good Teaching is Good Teaching • Good teaching is good teaching, and there are NO boundaries on when, where, or for whom it will occur. • Teaching academics without attention to behavior IS NOT evidence-based practice. • Teaching behavior without attention to academics is unsound practice. • In efforts to improve achievement, the two cannot be separated. (Algozzine, 2008)

  9. THE BOTTOM LINE Are we matching instruction to student need?

  10. Significance – The Need for Academic & Behavioral Integration [Chart: student outcomes from baseline (BL) under combined reading and behavior instruction, reading instruction alone, and behavior instruction alone. Source: Shepard Kellam, Ph.D., Senior Research Fellow, American Institutes for Research (AIR).]

  11. We Know What to Do! • What teachers and kids need is support! • They need “personal trainers” to use data and implement academic and behavior interventions. • They need a coach – they need you!

  12. It’s As Easy As . . . A B C

  13. Coaching = Improved Student Outcomes • It’s as Easy as 1, 2, 3 . . . • Anita’s Insights • Fidelity • Delivery Skills • Utilization of Data

  14. Coaches’ Impact On Students

  15. Basic Goal of Assessment • "The ultimate goal of assessment is to identify problems with instruction and to lead to instructional modifications. A good share of present-day assessment activities consist of little more than meddling… We must use assessment data to improve instruction… The only way to determine the effectiveness of instruction is to collect data." – Ysseldyke and Algozzine (1995) • Curriculum-Based Measurement: Introduction

  16. CBM – What is It? • Formative assessment • A measure of student performance over time • An analysis of specific skills for an individual student • A tool for: • Identifying struggling students • Setting goals • Aligning instruction with desired outcomes • Providing diagnostic information • Progress monitoring • IEP development

  17. Formative vs. Summative “When the cook tastes the soup, that’s formative. When the guests taste the soup, that’s summative.” - Robert Stake

  18. Why CBM?

  19. Why Should I Do It? • Measures class-wide performance • An alternative to other assessment procedures – often replaces costly, time-consuming, disruptive practices • Quick & easy • Established reliability & validity • Direct, low-inference measures • Can be easily summarized & presented • To parents, students, colleagues

  20. What is the Goal of CBM? • Goal is two-fold: • Monitor student progress • Inform instruction / teacher practice

  21. Applications of CBM • Benchmarking • Diagnostic • Can’t do/won’t do • Survey Level Assessment • Error analysis • Intervention development • Progress Monitoring • Instructional/criterion-referenced

  22. Applications of CBM • When? • How Often? • Why (purpose)? • Who? • How?

  23. How Does It Fit Together? • Step 1 – Universal Screening • Step 2 – Additional Diagnostic Assessment • Step 3 – Instruction/Intervention • Step 4 – Progress Monitoring • Universal (80%) – Screening: all students at a grade level (Benchmarks, Grades, Classroom Assessments, Utah CRT); Diagnostic: none; Instruction: continue with core instruction; Monitoring: Fall, Winter, Spring • Targeted (15%) – Diagnostic: group diagnostic; Instruction: small group, differentiated by skill; Monitoring: 2x month • Intensive (5%) – Diagnostic: individual diagnostic; Instruction: individual instruction; Monitoring: weekly

  24. What is Reading CBM? • One-minute probe (e.g., DIBELS, 6-min. Solution) • Administered individually • Provide intervention and progress monitor at instructional level • Different measures • Oral Reading Fluency (ORF) • Maze (Comprehension) • Early-reading (Initial Sound, Phoneme Segmentation, Nonsense Word, Letter Naming Fluency) (See Chapters 3 & 4 in ABCs of CBM)

  25. Reading CBM - How? • Select appropriate material for probe • Place probe in front of and facing the student • Keep copy for the examiner (on clipboard) • Provide directions • Start timer • Have student perform task for allotted time (1 minute for reading tasks) • Score probe • Display data on graph/chart • Video Clips . . . . Examples

  26. We Do It – Guided Practice • Triads work together • Administer reading fluency probe • Score probe – count number correct and number of errors • Record the score • Switch roles & repeat • Questions & answers – feedback

  27. Scoring Reading Probe • Oral Reading Fluency: • Mark as correct: the number of words read correctly in one minute • Mark as incorrect: • Misread words • Omissions • Hesitations – words supplied by the assessor after 3 seconds • Reversals – two or more words not read in order (see page 146 in The ABCs of CBM; a scoring sketch follows below)
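Not from the deck itself: a minimal Python sketch of the scoring arithmetic above, assuming a standard one-minute probe. The function name and the example numbers are illustrative only.

```python
def score_orf_probe(words_attempted: int, errors: int) -> dict:
    """Score a one-minute oral reading fluency probe.

    Errors follow the rules above: misread words, omissions,
    hesitations (words supplied by the assessor after 3 seconds),
    and reversals.
    """
    words_correct = words_attempted - errors
    return {
        "wcpm": words_correct,                        # words correct per minute
        "errors": errors,
        "accuracy": words_correct / words_attempted,  # proportion read correctly
    }

# Example: 70 words attempted with 5 errors -> 65 WCPM, ~93% accuracy.
print(score_orf_probe(70, 5))
```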

  28. Diagnostic – Can't Do/Won't Do • Purpose • Determine motivation vs. skill deficit • Technique • Administer the same probe – add an incentive • Timing – soon after the benchmark/screener • Decision Rules • An increase of 15% or more = motivation (Witt & Beck, 1999) • Less than 15% = skill deficit • Consider both (a decision-rule sketch follows below)
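A sketch of the 15% decision rule cited above (Witt & Beck, 1999); the function and the example scores are hypothetical, not from the deck.

```python
def cant_do_wont_do(baseline_wcpm: float, incentive_wcpm: float) -> str:
    """Compare the incentive re-test with the benchmark score.

    A gain of 15% or more suggests a motivation ("won't do") issue;
    a smaller gain suggests a skill deficit ("can't do").
    """
    gain = (incentive_wcpm - baseline_wcpm) / baseline_wcpm
    return "motivation (won't do)" if gain >= 0.15 else "skill deficit (can't do)"

# Hypothetical example: 65 WCPM at benchmark, 70 WCPM with an incentive
# -> a ~7.7% gain, pointing to a skill deficit.
print(cant_do_wont_do(65, 70))
```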

  29. Can't/Won't Do • Triad practice • Score (p. 48 Benchmarks) • Can't do or won't do? • Decision? • Trial 1 (reading): • Annie – 4th grade, 65 CWPM (Fall)

  30. Diagnostic Survey-Level Assessment – Purposes • To determine the appropriate instructional placement level for the student • i.e., the highest level of material at which the student can be expected to benefit from instruction • To provide baseline data, or a starting point, for progress monitoring • In order to monitor progress toward a future goal, you need to know how the student is currently performing.

  31. Survey Level Assessment – Reading 1. Start with grade-level passages/worksheets (probes). 2. Administer 3 separate probes (at the same level of difficulty) using standard CBM procedures. 3. Calculate the median score (i.e., the middle score). 4. Is the student's median within the instructional range? • Yes – this is the student's instructional level. • No – if above level (too easy), administer 3 probes at the next level of difficulty. • No – if below level (too hard), administer 3 probes at the previous level of difficulty. (A sketch of this procedure follows below.)
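A sketch of this walk-up/walk-down in Python. The instructional-range cut points below are placeholders, not the published values; substitute the figures from the tables in The ABCs of CBM.

```python
from statistics import median

# Placeholder instructional ranges in WCPM by grade -- NOT the published
# cut points; substitute the values from Tables 3.4/3.5.
INSTRUCTIONAL_RANGE = {2: (40, 60), 3: (60, 80), 4: (80, 100)}

def survey_level_assessment(probes_by_grade, start_grade):
    """Return the grade at which the median of three probes falls
    within that grade's instructional range, or None."""
    grade, visited = start_grade, set()
    while grade in probes_by_grade and grade not in visited:
        visited.add(grade)
        low, high = INSTRUCTIONAL_RANGE[grade]
        med = median(probes_by_grade[grade])   # middle of the 3 probe scores
        if low <= med <= high:
            return grade                       # instructional level found
        grade += 1 if med > high else -1       # too easy -> up; too hard -> down
    return None                                # ran out of probe data

# Example: below range on grade-4 probes, within range on grade-3 probes.
print(survey_level_assessment({4: [55, 62, 58], 3: [66, 71, 68]}, start_grade=4))  # 3
```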

  32. Survey Level Assessment – Let’s Try It! • Refer to Case Studies Provided . . . • Completed Forms • B. Blue • Jack Horner • Sample One – Junie B. – Whole Class • Sample Two – Tom – Partner • Consider instructional levels for sample cases

  33. Reading CBM – Norms & Growth (see pages 47 & 49) • Norms • Compare a student's score to the performance of others in her grade or at her instructional level • Based on data collected on thousands of students – the numbers are very similar across sources • Growth Rates • Indicate the average number of words per week we would expect students to improve • Not necessarily new words – students read the same words at a faster rate each week

  34. Expected Growth Rates • Benchmarks – Table 3.4 (p. 48) • Norms – Table 3.5 (p. 49) • Growth Rates – Table 3.2 (p. 47) • Greater progress is possible • If a student doesn't make adequate progress, it doesn't mean she lacks the ability to learn reading – it means instruction needs to be changed!

  35. What if Student Data Doesn't Reflect Adequate Growth? • It is our obligation to fix the problem! • Build up prerequisite skills • Increase the length of the daily lesson • Alter the way we respond when an error is made • We do NOT lower expectations! "Learning is a result of instruction, so when the rate of learning is inadequate, it doesn't always mean there is something wrong with the student. It does mean the instruction needs to be changed to better meet the student's needs." (p. 47)

  36. How to Set & Graph Goals • 1. End-of-year benchmarks • 2. Norms – levels of performance • 3. Rate of progress – goal setting • (# of weeks × growth rate) + median baseline = goal (worked example below) • Students with the greatest deficits need the steepest slopes – more intense & effective interventions
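A worked version of the goal formula above; the 1.5 words-per-week growth rate is an assumed stand-in for the value you would read from Table 3.2.

```python
def reading_goal(weeks: int, growth_rate: float, median_baseline: float) -> float:
    """Goal = (# of weeks x expected growth rate) + median baseline WCPM."""
    return weeks * growth_rate + median_baseline

# 10-week goal for a median baseline of 65 WCPM at an assumed growth
# rate of 1.5 words per week: 10 * 1.5 + 65 = 80 WCPM.
print(reading_goal(weeks=10, growth_rate=1.5, median_baseline=65))  # 80.0
```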

  37. Goal Setting – Let's Practice! • Case Study #1 • Jack – 4th grader – reading data • 3rd grade level 78/2, 4th grade level 65/3 • Compute a 10-week goal and an annual goal • Case Study #2 • Suzie – 5th grader – reading data • 3rd grade level 71/3, 4th grade level 62/6 • Compute a 10-week goal and an annual goal

  38. How Often Should Data Be Collected? • Three considerations: • 1. Purpose – screening vs. progress monitoring • 2. Importance of the task – learning to read vs. learning Roman numerals • 3. Significance of the problem – as a student's difficulty increases, the need for effective instruction and for more frequent monitoring increases

  39. Assessment and MTSS [Figure: MTSS tier pyramid – measurement precision, measurement frequency, and problem analysis all increase from Tier 1 to Tier 3. Adapted from Burns & Riley-Tillman (2010).]

  40. Assessment and MTSS • Tier III – Identify the discrepancy for an individual. Identify the causal variable. Implement an individual intervention. • Tier II – Identify the discrepancy for an individual. Identify the category of the problem. Assign a small-group solution. • Tier I – Identify the discrepancy between expectation and performance for a class or an individual. (Adapted from Burns & Riley-Tillman, 2010)

  41. Next Steps . . . • Goal Setting • Intervention Selection • Intervention implementation • Progress monitoring • Improved student outcomes • Engage in Problem Solving Process!

  42. Oral Reading Fluency [Diagram: problem-solving cycle – diagnostic (FBA), implement intervention, progress monitoring.]

  43. Academic Interventions • Consider the work of a "Bright Spot" • Jerry Sternin – Vietnam • Your School/District Team • Focus on another "Bright Spot" • Mt. View Elementary

  44. Procedure – Mt. View Elementary • 1st–6th grade students were evaluated using three grade-level reading probes. Each probe was conducted for one minute. • Words read per minute and errors were tracked. • The median (middle) correct words per minute was recorded from the 3 samples. • The median errors per minute was recorded from the 3 samples. • All data were entered into an Excel data system (CBM Focus), and each teacher was given an individual class graph.

  45. Outcomes • The 2008 AIMSweb norms for Oral Reading Fluency were used (ABCs of CBM, p. 49). • The 25th-percentile norms for winter were used to identify whether a student achieved "fluidity." • A 95% accuracy criterion was used to classify reading as accurate or inaccurate (http://reading.uoregon.edu/flu/flu_programs.php). • Students were grouped into 4 quadrants based on these outcomes.

  46. 4 Quadrants • Fluid, Accurate • Correct words per minute (CWPM) and accuracy were at or above expected levels. (See the individual data sheet for the CWPM cut-off for your grade level.) • Fluid, Inaccurate • CWPM at or above the expected level, but accuracy below 95%. • Slow, Accurate • CWPM below the expected level, but accuracy at or above 95%. • Slow, Inaccurate • CWPM below the expected level, and accuracy below 95%. (A sketch of this classification follows below.)
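A sketch of the quadrant rule described above; the 89-WCPM expected level in the example is an assumed cut score, not an actual AIMSweb norm.

```python
def fluency_quadrant(wcpm: float, expected_wcpm: float, accuracy: float) -> str:
    """Assign one of the four quadrants.

    expected_wcpm is the grade-level cut score (Mt. View used the
    25th-percentile AIMSweb winter norm); accuracy is the proportion
    of words read correctly, with 0.95 as the cut.
    """
    speed = "Fluid" if wcpm >= expected_wcpm else "Slow"
    acc = "Accurate" if accuracy >= 0.95 else "Inaccurate"
    return f"{speed}, {acc}"

# Example: 92 WCPM against an assumed 89-WCPM cut, 97% accuracy.
print(fluency_quadrant(92, 89, 0.97))  # Fluid, Accurate
```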

  47. School-Wide Data Based on the 25th Percentile
