
Evaluating SES Providers: A Comprehensive Model for Assessing Effectiveness

This article presents a comprehensive model for evaluating Supplemental Educational Services (SES) providers, focusing on measures of effectiveness, customer satisfaction, and service delivery compliance. The model includes various evaluation designs and measures to assess student achievement, customer perceptions, and service delivery.



Presentation Transcript


  1. Evaluating SES Providers Steven M. Ross Allison Potter Center for Research in Educational Policy The University of Memphis http://www.memphis.edu/crep

  2. Supplemental Educational Services (SES) • Required under No Child Left Behind (NCLB) for Title I schools that have not made Adequate Yearly Progress (AYP) for three consecutive years. • Low-income students from identified Title I schools are eligible to receive free tutoring services. • Students are prioritized by greatest academic need if district funds are limited.

  3. Service Providers • Potential service providers apply to serve students and may be approved by the State Department of Education. • Providers contract with Local Educational Agencies (LEAs) to provide tutoring services to students. • Providers are paid for their services - an amount not to exceed the Title I per pupil allotment.

  4. Determining Evaluation Measures Effectiveness: Increased student achievement in reading/language arts or mathematics. Customer satisfaction: Positive perceptions by parents of SES students. Service delivery and compliance: Positive perceptions by principals, teachers, LEA staff, etc.

  5. Overall Provider Assessment Figure 1. Components of a Comprehensive SES Evaluation Model • Service delivery and compliance: District Coordinator Survey, Principal/Liaison Survey, Provider Survey, Teacher Survey • Customer satisfaction: Parent Survey • Effectiveness (student achievement): State Tests, Additional Tests

  6. Effectiveness Measures 1. Student-level test scores from state-mandated assessments • Considerations: • Availability limited to certain grades (e.g., grades 3 and higher) • Lack of pretest scores prevents gains from being determined

  7. Effectiveness Measures 2. Supplementary individualized assessments in reading/language arts or math • Considerations: • Without pretest scores and comparison students, SES gain cannot be determined • Validity may be suspect if assessments not administered by trained independent testers

  8. Effectiveness Measures 3. Provider-developed assessments in reading/language arts or math • Considerations: • Test results may not be valid or suitable for the state’s evaluation purpose • Tests may favor the provider’s strategies

  9. Customer Satisfaction Measures 1. Parent and family perceptions • Considerations: • Parent respondents may not be representative of the population served by the provider • Sample sizes will vary due to provider size • Comparisons limited due to parent familiarity with only one provider

  10. Customer Satisfaction Measures 2. Student perceptions • Considerations: • Young students may have difficulty judging quality of services and communicating impressions • Time consuming and may require parent permission to obtain

  11. Service Delivery and Compliance Measures 1. Records of services provided, student attendance rates, and costs • Considerations: • States may obtain data from a variety of sources, including providers, teachers, principals, and district staff • Corroborating data from multiple sources can increase accuracy of evaluation conclusions

  12. Service Delivery and Compliance Measures 2. Feedback from SES customers • Considerations: • First-hand impressions or observations may be lacking • Translation may be needed to reach parents who do not speak English • Obtaining representative samples may be difficult

  13. Service Delivery and Compliance Measures 3. Feedback from district staff • Considerations: • Districts may lack firsthand impressions or observations of tutoring services • Some districts may also be SES providers

  14. Service Delivery and Compliance Measures 4. Feedback from school staff • Considerations: • Teachers may also be SES instructors or lack first-hand impressions of providers • Teachers may need to provide information on multiple providers, which may be confusing and time consuming • Identifying teachers to solicit responses may be difficult

  15. Evaluation Designs: Student Achievement A. Benchmark Comparison Rating = ++ (Low to Moderate rigor) Percentage of SES students by provider attaining “proficiency” on state assessment

  16. Evaluation Designs: Student Achievement A. Benchmark Comparison Upgrades • Percentage of SES students in all performance categories (“Below Basic”, “Basic”, etc.) • Comparison of performance relative to prior year and to state norms • Comparison to a “control” sample

  17. Evaluation Designs: Student Achievement Benchmark Comparison • Advantages • Inexpensive and less demanding • Easily understood by practitioners and public • Linked directly to NCLB accountability • Disadvantages • Doesn’t control for student characteristics • Doesn’t control for schools • Uses broad achievement indices
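The benchmark comparison above reduces to counting, by provider, the share of SES students reaching proficiency on the state assessment. A minimal sketch, using invented provider names and performance categories modeled on the slide's examples:

```python
from collections import Counter

# Hypothetical state-test records: (provider, performance category).
# Category labels follow the slide's examples ("Below Basic", "Basic", etc.).
records = [
    ("Provider A", "Proficient"), ("Provider A", "Basic"),
    ("Provider A", "Proficient"), ("Provider A", "Below Basic"),
    ("Provider B", "Proficient"), ("Provider B", "Proficient"),
    ("Provider B", "Advanced"),  ("Provider B", "Basic"),
]

def proficiency_rates(records):
    """Percent of each provider's SES students at or above 'Proficient'."""
    totals, proficient = Counter(), Counter()
    for provider, category in records:
        totals[provider] += 1
        if category in ("Proficient", "Advanced"):
            proficient[provider] += 1
    return {p: 100.0 * proficient[p] / totals[p] for p in totals}

print(proficiency_rates(records))
# {'Provider A': 50.0, 'Provider B': 75.0}
```

The "upgrades" on the previous slide amount to tabulating all categories (not just proficient-or-not) and repeating the same count for a prior year, state norms, or a control sample.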

  18. Evaluation Designs: Student Achievement B. Multiple Linear Regression Design Rating = +++ (Moderate rigor) Compares actual gains to predicted gains for students enrolled in SES, using district data to control for student variables (e.g., income, ethnicity, gender, ELL, special education status, etc.).

  19. Evaluation Designs: Student Achievement Multiple Linear Regression Design • Advantages • More costly than Benchmark, but relatively economical • Student characteristics are statistically controlled • Disadvantages • Doesn’t control for school effects • Less understandable to practitioners and public • Effect sizes may be less stable than for Model C.

  20. Evaluation Designs: Student Achievement C. Matched Samples Design Rating = ++++ (High Moderate to Strong rigor) Match and compare SES students to similar students attending the same school (or, if not feasible, a similar school) Use multiple matches if possible

  21. Evaluation Designs: Student Achievement C. Matched Samples Design • Advantages • Some control over school effects • Easily understood by practitioners and public • Highest potential rigor of all designs • Disadvantages • More costly and time consuming • Within-school matches may be difficult to achieve
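The within-school matching the design describes can be sketched as a greedy nearest-pretest match, here with invented student IDs and scores (real evaluations typically match on several characteristics, not pretest alone):

```python
# Hypothetical records: (id, school, pretest, posttest).
ses = [("s1", "School X", 410, 440), ("s2", "School X", 470, 495),
       ("s3", "School Y", 520, 530)]
pool = [("c1", "School X", 415, 430), ("c2", "School X", 460, 470),
        ("c3", "School Y", 525, 528), ("c4", "School Y", 580, 590)]

def match_within_school(ses, pool):
    """Match each SES student to the closest-pretest non-SES student
    in the same school, without replacement."""
    available = list(pool)
    pairs = []
    for sid, school, pre, post in ses:
        candidates = [c for c in available if c[1] == school]
        if not candidates:
            continue  # a fuller design would fall back to a similar school
        best = min(candidates, key=lambda c: abs(c[2] - pre))
        available.remove(best)
        pairs.append(((sid, pre, post), (best[0], best[2], best[3])))
    return pairs

pairs = match_within_school(ses, pool)
# Gain of each SES student minus gain of their matched comparison student.
diffs = [(spost - spre) - (cpost - cpre)
         for (_, spre, spost), (_, cpre, cpost) in pairs]
print(sum(diffs) / len(diffs))  # mean gain advantage of SES students
```

The difficulty flagged above shows up directly: when `candidates` is empty, no within-school match exists and the evaluator must relax the match criteria.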

  22. Evaluation Designs: Student Achievement D. Combination (Hybrid) Design • Uses a mixture of the three main designs to meet special data situations within the state • State-level analysis may be benchmark for most districts and matched samples for the largest district(s) • Accommodates different student-level data and statistical staff resources

  23. Data Collection Tools • Surveys for LEAs, principals/site coordinators, teachers, parents, and providers. • Common core set of questions from all groups to permit triangulation. • Open-ended question, “Additional comments”
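The common-core questions exist so that responses from different groups can be triangulated. A minimal sketch of that cross-tabulation, with invented 1–5 ratings on a shared survey item:

```python
from statistics import mean

# Hypothetical ratings (1-5) on a common-core item, keyed by
# (provider, respondent group).
responses = {
    ("Provider A", "parents"):  [4, 5, 4],
    ("Provider A", "teachers"): [3, 4],
    ("Provider A", "district"): [4],
    ("Provider B", "parents"):  [2, 3],
    ("Provider B", "teachers"): [2, 2, 3],
    ("Provider B", "district"): [3],
}

def triangulate(responses):
    """Mean rating per provider per respondent group. Agreement across
    groups strengthens a conclusion; divergence flags items to probe."""
    out = {}
    for (provider, group), scores in responses.items():
        out.setdefault(provider, {})[group] = mean(scores)
    return out

for provider, groups in triangulate(responses).items():
    print(provider, groups)
```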

  24. Rubric of Overall Evaluation of Provider Effectiveness

  25. Decision Tree for SES Providers [Flowchart: branches on compliance (serious violations lead to removal, minor violations to probation), achievement results (positive, negative, or indeterminable), and implementation quality, yielding outcomes of Full Standing, Satisfactory Standing, Probation I, Probation II, or Removal; providers in their final probation year that have not improved are removed.]

  26. CONCLUSION • Each state should begin its SES evaluation planning process by identifying • a) the specific questions that its SES evaluation needs to answer, and • b) the resources that can reasonably be allocated to support further evaluation planning, data collection, analysis, reporting, and dissemination.

  27. CONCLUSION • Work through the hierarchy of evaluation designs presented here and select the design that allows the highest level of rigor. • States may wish to engage third-party evaluation experts in helping to plan and conduct these evaluations.
