This presentation surveys current assessments of student performance, emphasizing the significance of Curriculum-Based Measurement (CBM). It reviews the data available on student performance, how those data are used in referral and assessment processes, and which data needs remain unmet. It also traces the evolution of assessment practices, explains CBM as a dynamic indicator of basic skills, and outlines the advantages of CBM over traditional assessments. Finally, it addresses progress monitoring, validity, reliability, and CBM's impact on instructional decisions in special education.
CURRENT ASSESSMENTS
• What data on student performance is available?
• How is it used in the referral/assessment process?
• What data needs are being met?
• What data needs are not being met?
• How do you know whether or not a program is benefiting a student?
CHRISTO/CSUS/246/2003/CBM
CURRENT TRENDS
• Use of IQ/achievement discrepancy
• Response to intervention
• Need for monitoring of progress on a short-term basis
• Office of Special Education Programs
• National Joint Committee on Learning Disabilities
• President's Commission on Excellence in Special Education
ROOTS OF CBM
• Deno at the University of Minnesota Institute for Research on Learning Disabilities (Deno, 1986)
• Effort to develop and validate simple methods for use in IEPs
• CASP presentation (Herdman, Leaman, and Chartard, 1990)
ASSESSMENT NEEDS
• Use curriculum
• Short in duration
• Multiple forms
• Inexpensive
• User friendly
• Show improvement over time
• Research based
MEASURES IDENTIFIED
• Considered simple measures in reading, written language, and math
• Data-based assessment
• CBA
CHARACTERISTICS OF ALL CBA MODELS
• Test stimuli drawn from the curriculum
• Repeated testing occurs over time
• Useful in instructional planning
CBA MODELS DIFFER IN TERMS OF:
• Long- vs. short-term goals
• Short-term objectives
• Task analysis
• Emphasis on fluency
• Use in time-series analysis
ADVANTAGES OVER OTHER TYPES OF CBA
• Focus on end-of-year goal, not sequential skill analysis
• Can evaluate a variety of instructional methods
• Automatically assesses retention and generalization
• No need to alter testing strategies
• Avoids issues with measurement shift
• Can be normed locally
DESCRIPTION OF CBM
• Normed assessment from which you can develop local criteria
• Dynamic (sensitive) indicator (correlates) of basic skills (not content areas)
• Uses local curriculum
• Formative evaluation
• Used in a problem-solving model
• Used at individual, class, and school levels
CBM AS DYNAMIC INDICATORS OF BASIC SKILLS
• DYNAMIC = sensitive to short-term effects in assessing growth
• INDICATORS = correlates of key behaviors indicative of overall academic performance
• BASIC SKILLS = assesses basic skills, not content areas
(Shinn, 1998)
CBM IS A MEASURE OF PROFICIENCY
• Fluency = accuracy + speed
• [Slide diagram: instructional hierarchy running from teaching and acquisition, through accuracy, to mastery and proficiency]
READING FLUENCY
• Indicator of automaticity of important skills
• Strong predictor of reading comprehension
• Critical element of competent reading
• Oral reading is a better indicator than silent reading
DIFFERENCES FROM TRADITIONAL MEASURES
• Does not try to determine why a child is having trouble
• But how different the child is from the norm
• And whether he is getting better
SHIFT TO PROBLEM-SOLVING FOCUS
• Disability vs. handicap
• Educational problems as handicaps
• Difference between performance and expectation
CBM DOES NOT
• Give national normative data
• Provide broad-band information
• Diagnose
  • Although error analysis (or qualitative evaluation of reading) can be used to provide further information
VALIDITY AND RELIABILITY
• Construct validity
  • Theory and research support
• Concurrent, criterion-related validity
  • Highest for reading
• Test/retest reliability
• Predictive/concurrent validity
(Christo & Southwell, 2001; Good, Simmons, & Kame’enui, 2001; Marston, 1989; Shinn, Good, Knutson, Tilly, & Collins, 1992)
LEGAL DEFENSIBILITY
• Drawn directly from the curriculum, so social and cultural bias is reduced
• Reliability and validity are high
• Answers the need for instructional utility
ACCOUNTABILITY
• CBM can document effectiveness by showing change over time
• Provides a baseline of performance to determine whether related services are leading to change over time
• Achievement and accountability decisions are made on the basis of classroom performance
STAFF ACCEPTANCE
• Eliminated jargon and ambiguity
• Procedures allowed staff to follow the intent of the law
• Testing more relevant
• Confidence in test results
• Can compare to peers
• Improved communication with parents
• Motivating for students to see growth
RESEARCH BASE
• Use in informing instructional decisions
  • Math (Fuchs & Fuchs, 1989)
  • Reading (Good & Kaminski, 1996)
• Use in identifying students at risk of academic difficulties
• Use in reintegration decisions (Shinn, 1998)
• Language minority students
DEVELOPING PROBES
• Developed from the student's actual curriculum
• Allow for quick administration and scoring
• Reading probes
• Math probes
• Spelling
• Written language
READING PROBE
• One-on-one administration
• Three one-minute tests
• Score is the number of words read correctly
• Errors noted
• Median score
• Grade-level reading rates
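The scoring rule above (three one-minute readings, report the median) can be sketched in a few lines; the function name and sample scores are illustrative, not part of any published CBM tool:

```python
from statistics import median

def score_reading_probe(words_correct: list[int]) -> int:
    """Score a CBM reading probe: the student reads three one-minute
    passages, and the reported score is the median words correct per
    minute across the three trials."""
    assert len(words_correct) == 3, "a probe uses three one-minute readings"
    return median(words_correct)

# e.g. trials of 42, 38, and 47 correct words per minute
print(score_reading_probe([42, 38, 47]))  # → 42
```

The median is used rather than the mean so that one unusually easy or hard passage does not distort the score.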
MATH PROBE
• Variety of types of problems the student will encounter
• Group administration
• Three- to five-minute test
• "Correct digits" is the number of digits in the correct place on each problem
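A minimal sketch of the correct-digits rule, assuming the student's written answer is compared to the answer key digit by digit, right-aligned so that place values match; the function name and example are illustrative:

```python
def correct_digits(answer: str, key: str) -> int:
    """Count the digits of the student's answer that match the answer
    key in the same place value (compare right-aligned, ones column
    first, as the slide's "digits in the correct place" rule implies)."""
    return sum(a == k for a, k in zip(reversed(answer), reversed(key)))

# If the key is 1352, a student writing 1342 earns credit for 3 of 4 digits.
print(correct_digits("1342", "1352"))  # → 3
```

Scoring digits rather than whole answers gives partial credit, which makes the measure more sensitive to small gains.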
OTHER SUBJECT AREAS
• Spelling – correct letter sequences
• Writing – total words written, words spelled correctly
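Correct letter sequences can also be counted mechanically. The sketch below is a simplified greedy approximation of the hand-scoring rules, assuming implicit boundary markers at each end of the word; names and examples are illustrative:

```python
def correct_letter_sequences(spelling: str, target: str) -> int:
    """Count correct letter sequences: adjacent letter pairs of the
    target word (with boundary markers added at each end) that appear
    intact and in order in the student's spelling."""
    s = f"^{spelling}$"  # boundary markers around the student's spelling
    t = f"^{target}$"    # and around the target word
    pairs = [t[i:i + 2] for i in range(len(t) - 1)]
    count, pos = 0, 0
    for pair in pairs:          # scan left to right, never backtracking
        found = s.find(pair, pos)
        if found >= 0:
            count += 1
            pos = found + 1
    return count

# A perfect spelling of "butter" earns 7 sequences: ^b bu ut tt te er r$
print(correct_letter_sequences("butter", "butter"))  # → 7
```

As with correct digits, sequence counting awards partial credit, so a near-miss spelling still registers growth.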
POCKET CBM DATA
• Cover Sheet
• Quartile Distribution (the main graph)
• Frequency of Scores (curriculum planning)
• Percentile Rank
• Rank Order
• Teacher List
• School-wide Progress
DETERMINING NORMS
• By hand
• By spreadsheet
• With Pocket CBM
• Online programs
• Other software
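Whichever tool is used, local norms come down to quartile cut points and percentile ranks over a pool of scores. A minimal spreadsheet-style sketch, with illustrative function names and sample scores:

```python
from statistics import quantiles

def local_norms(scores: list[int]) -> dict:
    """Build simple local norms from a pool of classroom CBM scores:
    quartile cut points plus a percentile-rank lookup function."""
    q1, q2, q3 = quantiles(scores, n=4)  # quartile distribution
    def percentile_rank(x: int) -> float:
        below = sum(s < x for s in scores)
        return 100 * below / len(scores)
    return {"quartiles": (q1, q2, q3), "percentile_rank": percentile_rank}

scores = [30, 45, 52, 60, 68, 75, 80, 88, 90, 95]
norms = local_norms(scores)
print(norms["quartiles"])
print(norms["percentile_rank"](60))  # 3 of 10 scores fall below 60 → 30.0
```

With a whole grade level's scores in place of this ten-score sample, the same arithmetic yields the school-wide norms referred to elsewhere in the deck.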
STEP 1: INITIAL REFERRAL
• Difference between student performance and expectation
• Peer or norm referenced
• Look for discrepancy ratio or cutoff
• Make decision regarding further assessment
DETERMINING DISCREPANCY
• Discrepancy ratio is greater than 2
  • Peer median / student median
  • 100/40 = 2.5
• Criterion scores
  • Well below instructional range
  • Will vary with grade
• Percentile rank
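The discrepancy-ratio rule above, sketched with the slide's own 100/40 example (the function name is illustrative):

```python
def discrepancy_ratio(peer_median: float, student_median: float) -> float:
    """Discrepancy ratio used at referral: how many times lower the
    student's median score is than the peer median."""
    return peer_median / student_median

# Peers read 100 wcpm, the student 40: ratio 2.5, above the cutoff of 2,
# so further assessment is warranted.
ratio = discrepancy_ratio(100, 40)
print(ratio)       # → 2.5
print(ratio > 2)   # → True
```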
STEP 2: INVESTIGATE PROBLEM
• How severe is the problem?
• What general education services can be used?
• Survey-level assessment
STEP 3: SETTING EXPECTATIONS/GOALS
• Response to intervention model
• SST
• IEP
• Long-term vs. short-term measurement
• Determining goal (instructional range of classroom)
  • Peer referenced
  • Minimum competence
  • Expert judgment
  • Reasonable growth
STEP 4: MONITORING PROGRESS
• Establish baseline
• Plotting growth
• Aimlines and trendlines
  • By hand
  • Using Excel
  • Using commercial software
STEP 5: DECISION POINT
• Is the student within the instructional range of the classroom? (LRE)
• Response to intervention model:
  • More in-depth assessment
  • More intensive services
• In special education process:
  • Exit special education
  • Reconsider services being provided
Referral
• Is the student performing significantly differently from his/her peers?
• 4th grader, Malcolm
  • Reading 30 cwpm
  • Class median is 90
  • Places him at the 20th percentile
Investigate Problem
• How severe is the problem?
• Survey-level assessment
  • In 3rd grade text, at the 25th percentile
  • In 2nd grade text, at the 35th percentile
Setting Expectations/Goals
• Where does he need to be?
  • End of year
  • To show progress
• Expected rate of progress for effective intervention
  • What do we know about response rates for effective interventions?
• Set goal for review
Monitoring Progress
• Is Malcolm making acceptable progress?
• Meeting trendline?
• Change goal?
CLASS LEVEL
• Provide teachers with class profile
• Parent reports
• Program evaluations
• STAR Alternate assessment
SCHOOL LEVEL
• Screening: acts as a safety net
• Establish school-wide norms
• Information for new parents
• Retention/summer school decisions
WAYS TO IMPLEMENT CBM
• One class
• One school
• A few teachers
• Variety of ways to develop norms
TWO TYPES OF CBM FOR PRIMARY STUDENTS
• DIBELS for prereaders or delayed readers
  • Onset recognition
  • Phonemic segmentation
• Oral reading probes for beginning readers
DIBELS
• Dynamic Indicators of Basic Early Literacy Skills
• Skills important to the development of literacy
• Marker skills that can identify students needing early intervention
ARE STUDENTS ACHIEVING FOUNDATIONAL SKILLS?
• Good, Simmons, & Kame’enui (2001)
• Establish benchmarks
• Use benchmarks to determine students at risk of not achieving the next benchmark
• Importance of fluency as opposed to accuracy
• Other studies
CONTINUUM OF SKILLS (Good, Simmons, & Kame’enui)
• Kindergarten
  • Phonological awareness (onset rhyme fluency, phonemic segmentation fluency)
• First Grade
  • Alphabetic principle (nonsense word fluency)
  • Accuracy and fluency with connected text (oral reading fluency)
• Second Grade
  • Accuracy and fluency with connected text (oral reading fluency)
IMPLEMENTATION
• Decide on model (individual, class, school, district)
• Support from stakeholders
• Develop timeline
• Identify personnel resources
• Staff training
• Computer needs
• Assessment
• Distribution of results