
Assessment 101 – Everything you need to know to efficiently collect academic data in your system


Presentation Transcript


  1. Assessment 101 – Everything you need to know to efficiently collect academic data in your system. CBOCES Professional Development Series 2011/12, Session #2. November 4, 2011, Loveland, CO. Corey D. Pierce, Ph.D.

  2. Problem Solving If we can really understand the problem, the answer will come out of it, because the answer is not separate from the problem. -Krishnamurti

  3. Corey D. Pierce, Ph.D.

  4. When You Think About Assessment • What are the questions that need to be answered? • What information do you need to obtain from your evaluation to answer the questions you have? • What will you do to get the information you need? • How will you use the information you gathered for instructional or curricular purposes?

  5. Continuous Improvement Cycle

  6. Historical System vs. Effective Educational Systems [flowchart contrasting a referral → eligibility testing → SPED eligibility evaluation model with a universal screening → scientifically validated general education → supplemental Tier 2–3 treatment model, tracking responders and non-responders through monitoring, eligibility evaluation, SPED and non-SPED intensive treatment, and recycling through intervention] Adapted from Fletcher, '05, used with permission

  7. Assessments in Effective Educational Systems • Screening and Benchmark (at grade level): universal measures that give a quick read on whether students have mastered critical skills. • Progress Monitoring (typically at instructional level): determines whether adequate progress is being made based on individual goals regarding critical skills. • Diagnostic: individually administered to gain more in-depth information and guide appropriate instruction or intervention plans in the area of concern. • Outcome/Summative: provides an evaluation of the effectiveness of instruction and indicates student year-end achievement when compared to grade-level performance standards.

  8. Effective systems focus on a process for achieving higher levels of academic and behavioral success for all students through: • High-quality instructional practice • Continuous review of student progress (multiple measures) • Collaboration Corey D. Pierce, Ph.D.

  9. Continuous Review of Student Progress A Systemic Approach for Constant Inquiry • To assess: • How all students are performing (screening) • How they are responding to differentiated core instruction (ongoing assessment) • How they are responding to intervention/additional supports (monitoring progress) Corey D. Pierce, Ph.D.

  10. Balanced Assessment System Key Components: • Continuum of assessments • Multiple users • Multiple information sources, used to create a complete picture of student progress • Each assessment type has a primary purpose, as well as strengths and limitations Corey D. Pierce, Ph.D.

  11. Balanced Assessment System • Formative: daily, ongoing evaluation strategies; student-centered; immediate feedback • Benchmark: periodic diagnostic/progress assessments; classroom/school-centered; multiple data points across time • Summative: large-scale standardized assessments; school/district/state-centered; annual snapshot Corey D. Pierce, Ph.D.

  12. Corey D. Pierce, Ph.D.

  13. Summative/Large-Scale Purpose: • To determine how students in schools, districts, and states are progressing • To inform curriculum and instruction • To determine Adequate Yearly Progress (AYP) Corey D. Pierce, Ph.D.

  14. Benchmark Assessment Purpose: • To determine to what extent all students are progressing (screening) • To determine how well additional supports or services are working before too much time passes (monitoring progress) Corey D. Pierce, Ph.D.

  15. Formative Assessment Purpose: • To consider what learning comes next for students • To improve learning while there is still time to act – before the graded event Corey D. Pierce, Ph.D.

  16. Benchmark Assessment: Screening • Definitions • Purposes/Rationale • Strengths and Limitations • Common features • Research • Resources for getting started: Academics & Behavior Corey D. Pierce, Ph.D.

  17. Screening: Definition Screening is characterized by fast, inexpensive, repeatable data collection about critical skills, beliefs, or behaviors. Screening usually identifies students who need further assessment or provides information for future planning activities. Corey D. Pierce, Ph.D.

  18. Screening: Purposes/Rationale The purpose of screening is to identify students who are “at-risk” of a poor outcome. Rationale: use a screener with strong statistical properties, along with other data, to identify students you want to learn more about. Don't wait until it's too late; the CSAP is a poor screener for this reason. Corey D. Pierce, Ph.D.

  19. Screening: Strengths & Limitations • Strengths: by definition easy, quick, and repeatable; immediate results; guide programming; predictive validity • Limitations / how misused: used diagnostically or to guide instruction (by administrators and teachers); relied on absent good PM and formative assessment; statistical limitations Corey D. Pierce, Ph.D.

  20. Selected Research on Screening Jenkins, J. R., Hudson, R. F., & Johnson, E. S. (2007). Screening for service delivery in an RTI framework: Candidate measures. School Psychology Review, 36, 560–582. Messick, S. (1989). Validity. In R. L. Linn (Ed.), Educational measurement (3rd ed., pp. 13–103). New York: Macmillan. Riedel, B. W. (2007). The relationship between DIBELS, reading comprehension, and vocabulary in urban first-grade students. Reading Research Quarterly, 42, 546–567. Ritchie, K. D., & Speece, D. L. (2004). Early identification of reading disabilities: Current status and new directions. Assessment for Effective Intervention, 29(4), 13–24. Snellen Eye Chart (1862). Corey D. Pierce, Ph.D.

  21. Resources for Screening • Behavioral screening: BASC, CBCL, office referrals, teacher nomination • Academic screening: go to the National Center on RtI (http://www.rti4success.org/screeningTools); also see The ABCs of CBM by Hosp et al. Corey D. Pierce, Ph.D.

  22. Formative (Ongoing) Assessment • Definitions • Purposes/Rationale • Strengths and Limitations • Common features • Research • Resources for getting started: Academics & Behavior Corey D. Pierce, Ph.D.

  23. Formative (Ongoing) Assessment Definition (CCSSO, 2007): “Formative assessment is an intentional and systematic process used by teachers and students during instruction that provides feedback to adjust on-going teaching and learning to improve students’ achievement of the intended instructional outcomes.” Corey D. Pierce, Ph.D.

  24. Formative (Ongoing) Assessment Key terms: intentional • systematic process • feedback • adjust • on-going • intended instructional outcomes Corey D. Pierce, Ph.D.

  25. Formative (Ongoing) Assessment • Purpose: • To consider what learning comes next for the student • To improve learning while there is still time to act – before the graded event Corey D. Pierce, Ph.D.

  26. Formative (Ongoing) Assessment • Examples: • Teacher observations • Teacher questioning & class discussions • Analysis of student work (graded & non-graded) • Exit questions • Teacher feedback • Student self-assessment • KWLs • Student Journals Corey D. Pierce, Ph.D.

  27. Formative (Ongoing) Assessment • Strengths: • Informs day-to-day instruction • Informs intervention • Instant information • Student self-assessment • Provides information about on-going student progress • Designed & evaluated by those who know the students best • Provides a huge volume of qualitative, descriptive data Corey D. Pierce, Ph.D.

  28. Formative (Ongoing) Assessment Limitations: • Time • Informal/not standardized • Overabundance of information • May be challenging to ‘grade’ • When used to the exclusion of other types of assessment Corey D. Pierce, Ph.D.

  29. Formative (Ongoing) Assessment Essential components of effective formative assessment: • Learning Progressions: clearly articulate the sub-goals of the ultimate learning goal • Learning Goals and Criteria for Success: clearly identified and communicated to students • Descriptive Feedback: students receive evidence-based feedback that is linked to the intended instructional outcomes and criteria for success. CCSSO, 2008 Corey D. Pierce, Ph.D.

  30. Formative (Ongoing) Assessment • Research • Inside the Black Box: Raising Standards Through Classroom Assessment • By Paul Black and Dylan Wiliam (1998) • New assessment beliefs for a new school mission • By Rick Stiggins (2004) • Implementing Formative Assessment at the District Level: An Annotated Bibliography (New England Comprehensive Center) Corey D. Pierce, Ph.D.

  31. Formative (Ongoing) Assessment • Getting started: Academics & Behavior • Set learning goals and criteria for success • Select assessment techniques (teacher and students) • Determine how feedback is provided • Organize information from formative assessment (teacher and students) Corey D. Pierce, Ph.D.

  32. Formative (Ongoing) Assessment “Assessment FOR learning turns the classroom assessment process and its results into an instructional intervention designed to increase, not merely monitor, student learning.” Richard Stiggins Corey D. Pierce, Ph.D.

  33. Benchmarks: Progress Monitoring • Definitions • Purposes/Rationale • Strengths and Limitations • Common features • Research • Resources for getting started: Academics & Behavior Corey D. Pierce, Ph.D.

  34. Progress Monitoring: Definition Progress monitoring (PM) is a scientifically-based practice used to assess student performance and evaluate the effectiveness of instruction. Corey D. Pierce, Ph.D.

  35. PM: Purposes/Rationale • PM has two purposes: • Determine whether students are progressing appropriately with additional supports and intervention • Build more effective supports and interventions • Rationale: Use PM to closely monitor whether what we’re doing is effective! Corey D. Pierce, Ph.D.

  36. [Progress monitoring graph] Aimline = 1.50 words/week; Trendline = 0.95 words/week
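The aimline and trendline on this slide are simple slopes: the aimline is the growth rate implied by the year-end goal, and the trendline is the observed growth rate through the weekly probe scores. Below is a minimal sketch of that calculation in Python; the weekly scores, baseline, goal, and timeline are hypothetical illustration values, not data from the presentation.

    # Minimal sketch: estimate a CBM trendline and compare it to the aimline.
    # All scores and goal values below are hypothetical.

    def trendline_slope(scores):
        """Ordinary least-squares slope (growth per week) through weekly scores."""
        n = len(scores)
        weeks = range(1, n + 1)
        mean_x = sum(weeks) / n
        mean_y = sum(scores) / n
        num = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, scores))
        den = sum((x - mean_x) ** 2 for x in weeks)
        return num / den

    weekly_wcpm = [40, 41, 43, 42, 44, 45, 45, 47]   # hypothetical weekly probe scores
    baseline, year_end_goal, weeks_to_goal = 40, 94, 36

    aimline = (year_end_goal - baseline) / weeks_to_goal   # 54 / 36 = 1.50 words/week
    trendline = trendline_slope(weekly_wcpm)               # observed growth per week

    print(f"Aimline:   {aimline:.2f} words/week")
    print(f"Trendline: {trendline:.2f} words/week")
    print("On track" if trendline >= aimline else "Growth is below the aimline; consider a change")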

  37. Gap Analysis • A critical factor in determining whether a student is making sufficient progress is conducting a gap analysis. Example: Benchmark ÷ Current Level of Performance = Gap; 90 wpm / 40 wpm = 2.25. A gap of 2 or more is significant and signifies a need for intervention to close the gap between the student and peers.
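The gap analysis above is a single division followed by a cut-off check. A minimal sketch of the same arithmetic, using the slide's numbers (90 wpm benchmark, 40 wpm current performance, and the 2.0 rule of thumb):

    def gap_ratio(benchmark, current_level):
        """Gap analysis: benchmark divided by current level of performance."""
        return benchmark / current_level

    gap = gap_ratio(90, 40)   # 90 wpm / 40 wpm = 2.25
    print(f"Gap = {gap:.2f}")
    if gap >= 2.0:
        print("Significant gap: intervention is needed to close the gap with peers.")
    else:
        print("Gap is below the 2.0 threshold.")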

  38. PM: Strengths & Limitations • Strengths: high frequency; sensitive to change; guides programming more than screening does • Limitations / how misused: may have to make your own PM tools; improper tools give invalid, unreliable results; used in isolation Corey D. Pierce, Ph.D.

  39. Research on Progress Monitoring A substantial research literature supports a wide range of educational decisions, beginning in 1977 with Data-Based Program Modification (Deno & Mirkin, CEC). See “Developments in Curriculum-Based Measurement” by S. L. Deno, 2003, The Journal of Special Education, 37(3), 184–192. Corey D. Pierce, Ph.D.

  40. Resources for PM • Behavioral PM: frequency of difficulties in school; self-rating; parent/teacher rating • Academic PM: go to the National Center on RtI (http://www.rti4success.org/progressMonitoringTools); also see The ABCs of CBM by Hosp Corey D. Pierce, Ph.D.

  41. Available Screening and Progress Monitoring Tools

  42. How to make a CBM/Progress Monitoring Assessment • Examine the curriculum to select the slice of material to be covered • Plan the test • Sequence the content, behaviors, and conditions and arrange them into a table • Examine the table grid to determine which cells will be used for objectives • You may weight a column, row, or square

  43. Making a CBM cont. • Decide on the format you will use, then select or write items for the squares you identified on the table. • Establish a CAP (Criteria for Acceptable Performance).
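One way to think about the table-of-specifications step in slides 42–43 is as a weighted grid whose cells determine how many items to write, plus a CAP for scoring. The sketch below is only illustrative: the content areas, behaviors, weights, item total, and CAP value are all hypothetical, not taken from the presentation.

    # Hypothetical test blueprint: weighted content-by-behavior cells determine
    # how many of the probe's items to write for each cell.
    blueprint = {
        ("multi-digit addition",    "compute"): 2,   # weighted cell: emphasized skill
        ("multi-digit addition",    "apply"):   1,
        ("multi-digit subtraction", "compute"): 2,
        ("multi-digit subtraction", "apply"):   1,
    }
    total_items = 25
    cap = 0.80   # Criteria for Acceptable Performance: 80% of items correct

    total_weight = sum(blueprint.values())
    items_per_cell = {cell: round(total_items * weight / total_weight)
                      for cell, weight in blueprint.items()}

    for (content, behavior), n_items in items_per_cell.items():
        print(f"{content} / {behavior}: write {n_items} items")
    print(f"CAP: at least {round(cap * total_items)} of {total_items} items correct")
    # Rounding may leave an item or two unallocated; assign those to the weighted cells.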

  44. Example CBM Development Table

  45. Guidelines for identifying a good CBM • Assessment and curriculum must be aligned • Assessment instrument must be easy to use • Assessment must have a clearly defined purpose • Assessments should be standardized • Assessments should sample clearly defined domains • Assessments should sample relevant types of information • Collect raw data • Collect an adequate sample of student performance • Test should use appropriate scoring rules • Assessments should be as complex and interactive as possible

  46. Steps to Initial Implementation of Progress Monitoring • Identify classrooms or grades to measure • Determine the skill/curriculum area to measure • Develop/acquire measures • Screen in the fall • Rank students by grade/class and develop norms • Identify at-risk students as determined by capacity and cut-scores • Set year-end goals for each student • Monitor targeted students' progress weekly and graph the results • Evaluate progress continuously and systematically (same time, same way each time) • Make changes to instructional programming/intervention as needed • Repeat these steps to create winter and spring benchmarks
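Steps 4 through 6 above (screen in the fall, rank students, apply cut-scores) reduce to a small calculation. The sketch below assumes hypothetical screening scores and a hypothetical 20th-percentile local cut-score; a real implementation would use the district's chosen screener and its published or local norms.

    # Hypothetical fall screening scores for one class (higher = better).
    fall_scores = {"A": 62, "B": 35, "C": 48, "D": 71, "E": 29,
                   "F": 55, "G": 44, "H": 67, "I": 38, "J": 59}
    cut_percentile = 20   # hypothetical local cut-score

    ranked = sorted(fall_scores.items(), key=lambda item: item[1])   # lowest score first
    n_flagged = max(1, round(len(ranked) * cut_percentile / 100))    # students below the cut

    at_risk = [student for student, score in ranked[:n_flagged]]
    print("Ranking (low to high):", [student for student, score in ranked])
    print(f"Flagged as at-risk (bottom {cut_percentile}%):", at_risk)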

  47. Progress Monitoring Self-Assessment • Review the PM Self-Assessment materials from DWW Corey D. Pierce, Ph.D.

  48. Decision Rules Based on Progress Monitoring Data • 3 to 4 data points below the goal line: make an intervention change • 6 consecutive data points above the goal line: the goal is too low; revise it upward • If neither of these applies, continue doing what you are doing • NOTE: The CDE recommends having a minimum of 8 data points before making educational decisions.
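These decision rules can be checked mechanically against the graphed data. The sketch below applies them under two assumptions: scores and the corresponding goal-line (aimline) values are listed oldest to newest, and the 8-point minimum reflects the CDE note on the slide; the example data are hypothetical.

    def pm_decision(scores, goal_line, min_points=8):
        """Apply the slide's decision rules to weekly progress monitoring data."""
        if len(scores) < min_points:
            return "Keep collecting data (CDE recommends at least 8 data points)."

        recent = list(zip(reversed(scores), reversed(goal_line)))

        below = 0   # consecutive most-recent points below the goal line
        for observed, expected in recent:
            if observed < expected:
                below += 1
            else:
                break

        above = 0   # consecutive most-recent points above the goal line
        for observed, expected in recent:
            if observed > expected:
                above += 1
            else:
                break

        if below >= 3:
            return "3-4 consecutive points below the goal line: make an intervention change."
        if above >= 6:
            return "6 consecutive points above the goal line: the goal is too low; revise it upward."
        return "Neither rule applies: continue the current intervention."

    # Hypothetical example: eight weeks of data against a 1.5 words/week aimline.
    scores = [40, 42, 41, 43, 42, 42, 43, 43]
    goal_line = [40 + 1.5 * week for week in range(8)]
    print(pm_decision(scores, goal_line))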

  49. Ongoing Progress Monitoring Example: • For a third-grade student with a learning disability and an IEP math goal, curriculum-based measurement (CBM) is collected each week (e.g., 25 problems sampling the 3rd-grade mathematics concepts and applications curriculum).
