
An Introduction to Curriculum-Based Evaluation in Reading


Presentation Transcript


  1. An Introduction to Curriculum-Based Evaluation in Reading Jason Harlacher, PhD, NCSP Colorado Department of Education, Denver, CO Tami Sakelaris, PhD, NCSP Kelly Humphreys, PhD, NCSP Washoe County School District, Reno, NV February 22, 2011

  2. Objective • For you to leave here today with beginning knowledge of how to use CBE within a three-tiered problem-solving model • Initial tools to conduct CBE in reading • A list of resources to deepen your knowledge of CBE

  3. Agenda • What is Curriculum-Based Evaluation (CBE)? • Context of CBE • CBE in Reading Overview • Decoding • Case Examples • Group practice • Questions?

  4. One Perspective on History: separate silos of Special Education, Title I, Regular Education, Parents, Tutors, Talented & Gifted, After-School, and ELL/ESL

  5. Background of Education • What beliefs went along with how school was structured during this time? • Homogeneous groups (based on labels) • Teaching to the middle • Unique learning styles

  6. A Different Time • U.S. population: 152 million in 1950 vs. 300 million in 2006 • Sources: http://www.tvhistory.tv/Annual_TV_Households_50-78.JPG; U.S. Census Bureau, http://www2.census.gov/prod2/statcomp/documents/1951-02.pdf

  7. State of the Field • Mathematics: • 39% of fourth-graders and 34% of eighth-graders scored at the Proficient level (NAEP, 2009) • No change in 4th grade scores from 2007 to 2009 (NAEP, 2009) • Reading: • 32 million adults (14%) are identified as illiterate, and 50 million can't read above a 4th grade level (education-portal.com) • 32% of fourth-graders and 33% of eighth-graders scored at the Proficient level (NAEP, 2007) • No change in 4th grade scores from 2007 to 2009 (NAEP, 2009)

  8. A Modern View? • Moving away from working in silos to a more collaborative, problem-solving focus • Context of Curriculum-Based Evaluation is a problem-solving model

  9. What is Curriculum-Based Evaluation? • “CBE is a systematic problem-solving process for making education decisions.” (Howell et al., 2008, p. 353) • It answers what to teach and how to teach it. • CBE is not Curriculum-Based Measurement or Curriculum-Based Assessment. • CBM is a measurement method. • CBA is an umbrella term for determining a student's status within a curriculum.

  10. Assumptions behind CBE • THINK-PAIR-SHARE: What is your belief about how children learn? 1. Learning is an interaction between the learner, curriculum, and environment (instruction). (Howell & Nolet, 2000)

  11. Assumptions behind CBE 2. Students with learning difficulties lack specific skills and have apparent difficulty learning new information. These problems result from missing or erroneous PRIOR KNOWLEDGE. • Tasks are difficult when we don't have adequate prior knowledge and skill to do them. • Tasks are difficult if we have ambiguous cues, missing information, and a lack of predictability.

  12. http://viscog.beckman.illinois.edu/grafs/demos/15.html • http://www.youtube.com/watch?v=vJG698U2Mvo

  13. Information Processing Model Environment → Receptors (Receive) → Sensory Memory (Codes) → Short-Term Memory (Works) → Long-Term Memory (Stores) • Executive Control: • Selective Attention • Memory Recall • Motivation • Interpretation • Self-monitoring • Problem-solving • Automation

  14. Assumptions behind CBE 3. Focus on alterable variables as a means to improve student performance. • Alterable variables are those that a teacher can reasonably change through the process of instruction • Examples: • Prior knowledge (executive control strategies) • Sequence & structure of curriculum • Classroom management • Non-examples: • Gender • Race • Family size & background “…those who use CBE think about learning problems differently.” (Howell & Nolet, 2000, p. 361)

  15. Assumptions behind CBE 4. A problem is defined as the gap between the expected level of performance and the observed level of performance. Problem = Expected – Observed (P = E – O)
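
A minimal sketch of this arithmetic in Python; the function is ours for illustration, and the numbers come from the Maggie case example later in this presentation:

```python
# A minimal sketch of the CBE problem definition: P = E - O.
def discrepancy(expected: float, observed: float) -> float:
    """Gap between expected and observed performance."""
    return expected - observed

# Maggie's fall screening: district criterion 72 WRC, observed 43 WRC.
print(discrepancy(expected=72, observed=43))  # 29 WRC below criterion
```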

  16. Context of CBE • Housed within a tiered, problem-solving model • Reflective of beliefs that the right instruction (matched to student need) can lead to beneficial outcomes for any student

  17. Agenda • What is Curriculum-Based Evaluation (CBE)? • Context of CBE • CBE in Reading Overview • Decoding • Case Examples • Group practice • Questions?

  18. Terms to Know Two main assessments: • Survey-Level Assessment: • Assessments used to test a student on a wide range of skills • Specific-Level Assessment: • Assessments used to focus on a narrow range of variables that are suspected to contribute to the problem

  19. The CBE Process (Step / Action / Question) • Step 1: Fact-Finding. Survey-Level Assessment; summarize the discrepancy between what the student is doing now and what is expected. Question: What is the problem? • Step 2: Develop Assumed Causes. Ask what prior knowledge is missing; consider ICEL and the assumptions of CBE. Question: Why is it occurring? • Step 3: Validate. Specific-Level Assessment(s). Question: Are the explanations in Step 2 correct? • Step 4: Summative Decision-Making. Set goals and objectives. Question: What can be done? • Step 5: Formative Decision-Making. Monitor progress. Question: Did it work? Is instruction effective/appropriate?

  20. 1. Fact-Finding (Survey-Level Assessment) • Select and conduct a survey-level assessment. • Summarize the results: compare the expected level to the observed level. Is there a discrepancy?

  21. 2. Develop Assumed Causes • Examine data and results collected thus far to determine hypotheses. Identify reasons why the student is discrepant. • “If/then” considerations. • If the student cannot ____, then it is because they are missing the background knowledge to do that skill.

  22. 2. Develop Assumed Causes • Remember the assumptions behind CBE and the focus on alterable variables (Howell & Nolet, 2000)

  23. 3. Validate (Specific-Level Assessment) • Administer assessments to validate or invalidate hypotheses. • Assessments should parcel out specific skills or instructional factors so that you can make decisions about the assumed cause. • These can be a variety of assessments, but the most common will be CBA and CBM.

  24. 4. Summative Decision Making • Summarize the information collected so far, select goals and objectives, and develop an instructional plan. • When (condition), (student) will (behavior) with (criterion), by (time). • When given a 3rd grade reading passage, Bart will read 66 WRC with at least 95% accuracy by Jan 15, 2011.
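
For illustration only, a small Python sketch that fills in the goal template; the write_goal helper and its structure are ours, not part of CBE:

```python
# Hypothetical helper that fills in the CBE goal template:
# "When (condition), (student) will (behavior) with (criterion), by (time)."
def write_goal(condition: str, student: str, behavior: str,
               criterion: str, time: str) -> str:
    return f"When {condition}, {student} will {behavior} with {criterion}, by {time}."

print(write_goal(
    condition="given a 3rd grade reading passage",
    student="Bart",
    behavior="read 66 WRC",
    criterion="at least 95% accuracy",
    time="Jan 15, 2011",
))
# -> When given a 3rd grade reading passage, Bart will read 66 WRC
#    with at least 95% accuracy, by Jan 15, 2011.
```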

  25. 5. Formative Decision Making • Monitor progress to confirm that the instructional plan is effective for the student.

  26. Ready for a Headache? Handouts 1 and 2 (Howell & Nolet, 2000)

  27. Step 1: Fact-Finding (Survey-Level) → Step 2: Develop Assumed Causes → Step 3: Validate (Specific-Level)

  28. Specific-Level Assessment: Self-Monitoring • When a student is inaccurate, we should first ask whether the inaccuracy stems from a monitoring issue or from missing specific decoding skills. • Three methods: • Give them a “pep talk.” • Have them read previous errors in isolation. • Conduct a “pencil tap” task to see whether errors are corrected when cued. • Handouts 3, 3a, and 3b • Pair up. The person with newer shoes is the student.

  29. Specific-Level Assessment: Fluency Reread • When a student is accurate but slow, determine whether fluency is a skill to target during instruction. • Handouts 4, 4a, and 4b • Pair up with a new neighbor. The person with more work experience goes first.

  30. Agenda • What is Curriculum-Based Evaluation (CBE)? • Context of CBE • CBE in Reading Overview • Decoding • Case Examples • Group practice • Questions?

  31. Case Example Maggie: • 3rd grader with a history of reading difficulties • Conducted CBE as part of her re-evaluation • Fall screening data on R-CBM was 43 words read correct with 84% accuracy (3rd grade probes) • District risk criterion is 72 words read correct • Problem: 72 WRC – 43 WRC = 29 WRC

  32. Survey-Level Assessment in Reading • Using R-CBM (Oral Reading Fluency) procedures, administer 3 passages from the student's grade level. Take the median score and look for a reading rate at or above the 25th percentile with at least 95% accuracy. • If those criteria are not met, administer 3 passages from successively lower levels until they are.
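
A sketch of this step-down logic in Python, under the assumption that per-level 25th-percentile norms are available; the function, the norms table (except the grade-3 value of 54 WRC, which appears in the case example), and the grade-2 scores are all illustrative:

```python
from statistics import median

# Hypothetical 25th-percentile WRC norms by grade level; the grade-3 value
# (54 WRC) comes from the case example, the others are invented.
NORMS_25TH_WRC = {3: 54, 2: 40, 1: 25}

def survey_level(scores_by_level, start_level):
    """Step down from the student's grade level until both criteria are met.

    scores_by_level maps grade level -> three (WRC, accuracy) passage scores.
    Returns the highest level whose median passage meets the 25th-percentile
    rate with at least 95% accuracy, or None if no tested level qualifies.
    """
    for level in range(start_level, 0, -1):
        passages = scores_by_level.get(level)
        if not passages:
            return None  # ran out of tested levels
        med_wrc = median(wrc for wrc, _ in passages)
        med_acc = median(acc for _, acc in passages)
        if med_wrc >= NORMS_25TH_WRC[level] and med_acc >= 0.95:
            return level
    return None

# Maggie's grade-3 median was 47 WRC / 85%; the grade-2 scores are invented.
maggie = {3: [(43, 0.84), (47, 0.85), (49, 0.86)],
          2: [(55, 0.95), (58, 0.96), (60, 0.97)]}
print(survey_level(maggie, start_level=3))  # -> 2
```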

  33. Step 1. Fact-Finding: Survey-Level Assessment [chart: Maggie's survey-level results plotted against the 25th percentile]

  34. Step 1. Fact-Finding: Survey-Level Assessment • Summarize the discrepancy based on the grade-level expectation, not the standard from the Survey-Level Template. • Maggie reads at a rate of 47 WRC with 3rd grade text. She should read at least 72 WRC (the 25th percentile is 54 WRC). • Maggie reads with 85% accuracy with 3rd grade text. She should be reading with at least 95% accuracy with grade-level text. • 72 WRC – 47 WRC = 25 WRC • 95% – 85% = 10%

  35. Step 2. Develop Assumed Causes • Examine data and results collected thus far to determine hypotheses. • “If/then” considerations. • If the student cannot ____, then it is because they are missing the background knowledge to do that skill. • Maggie has a gap of 25 WRC because she has not mastered basic phonetic skills.

  36. Step 3. Validate: Specific-Level Assessment • Administer assessments to validate or invalidate hypotheses. • These can be a variety of assessments, but the most common will be CBA and CBM. • The flowchart can guide you, although as you become fluent you may jump around on it.

  37. Maggie read 47 WRC with 85% accuracy.

  38. Elicit Error Sample • Handout #7: Errors can be gathered from previous sources and assignments, during the Survey-Level Assessment (if you record them), or by eliciting additional errors.

  39. During the Survey-Level Assessment, Maggie self-corrected only 2 of 87 errors (2%)

  41. Assist Self-Monitoring: Pencil Tap *Handout 3

  42. 16-percentage-point change in self-corrects (7% vs. 23%) (3% improvement with pencil tap)

  43. Categorizing Errors • Evaluate errors that violate meaning. (Handout 8) • Evaluate errors related to general categories (decoding, self-corrects, repetitions, punctuation, etc.). (Handout 8) • Evaluate decoding errors (word types, suffixes, prefixes, etc.). (Handout 9)
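
As a small illustrative sketch, a tally of coded errors by general category; the error sample and its coding below are hypothetical, echoing words from the practice slide that follows:

```python
from collections import Counter

# Hypothetical error sample: (word as read, category assigned by the evaluator).
# The words echo the practice worksheet; the coding itself is invented.
error_sample = [
    ("nights", "decoding"),
    ("famly", "decoding"),
    ("things", "meaning violation"),
    ("(skipped period)", "punctuation"),
    ("(reread phrase)", "repetition"),
]

# Tally errors by general category to decide which skills to probe further.
counts = Counter(category for _, category in error_sample)
for category, n in counts.most_common():
    print(f"{category}: {n}")
```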

  44. Error Coding

  45. Practice with Error Coding [worksheet residue: a passage marked with error codes; sample errors include “nights,” “famly,” and “things”] Categorize the errors using Handout 8a.

  46. Practice with Error Coding: Answers [answer-key residue from the coded worksheet]

  47. Error Sample
