
OSEP 2009 Program Evaluation


Presentation Transcript


  1. OSEP 2009 Program Evaluation 2009 OSEP Project Director’s Meeting July 20th, 2009 The Process we Used and What we Learned…. Renee Bradley, Judy Shanley – OSEP Herb Baum, Danielle Schwarzmann – ICF Macro

  2. Our Process this Year… Thank you to the projects that participated. It’s never too early to think about Performance Measurement… Performance Measures: Under the Government Performance and Results Act of 1993 (GPRA), the Department has established a set of performance measures, including long-term measures, that are designed to yield information on various aspects of the effectiveness and quality of the Technical Assistance and Dissemination to Improve Services and Results for Children with Disabilities program. These measures focus on the extent to which projects provide high quality products and services, the relevance of project products and services to educational and early intervention policy and practice, and the use of products and services to improve educational and early intervention policy and practice. The grantee will be required to provide information related to these measures. The grantee also will be required to report information on the project’s performance in annual reports to the Department (34 CFR 75.590). • Applications, work scope, reporting requirements

  3. Why Program Evaluation… • Provides an aggregate picture of our performance • Potentially affects future program funding • Supports the profession of TA&D • Enhances the legitimacy of the field of TA&D

  4. Measures • Annual • Quality, usefulness, relevance, cost (efficiency) • Long-term • Implementation of evidence-based practices • Implementation of model demonstrations

  5. How is Quality Rated by the Panel? Annual Measure Percentage of Special Education Technical Assistance and Dissemination (TA&D) products and services deemed to be of high quality by an independent (science) review panel (annual). • Substance: • Does the product or service reflect an evidence-based approach or one grounded in current legislation or policy? • Communication: • Is the presentation of the product or service description clear, well formatted, and organized?

  6. How is Usefulness Rated by the Panel? Annual Measure Percentage of Special Education Technical Assistance and Dissemination (TA&D) products and services deemed to be useful by an independent (science) review panel (annual). • Ease: • Is the product or service description easily understood, providing guidance and direction? • Replicability: • Will the product or service eventually be used by the target group to achieve the benefit intended? • Sustainability: • Will the product or service eventually be used in more than one setting and over time?

  7. How is Relevance Rated by the Panel? Annual Measure Percentage of Special Education Technical Assistance and Dissemination (TA&D) products and services deemed to be relevant by an independent (science) review panel (annual). • Need: • Does the product or service solve an important problem or address a critical issue? • Pertinence: • Does the product or service relate directly to the problem or issue facing the targeted group? • Reach: • Does the product or service apply to diverse populations within the target group?
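The three annual measures above reduce to the same calculation: the share of submitted products and services that the independent panel rates highly on a given dimension. Below is a minimal sketch of that arithmetic; the 1-5 scale, the rating threshold, and the sample ratings are illustrative assumptions, not OSEP's actual rubric.

```python
# Sketch: an annual measure as the percentage of products/services the panel
# rated highly on one dimension (quality, usefulness, or relevance).
# The 1-5 scale, the >= 4 cutoff, and the ratings below are assumptions.

HIGH_THRESHOLD = 4  # assumed cutoff for a "high" rating on a 1-5 scale

def percent_rated_high(ratings, threshold=HIGH_THRESHOLD):
    """Return the percentage of panel ratings at or above the threshold."""
    if not ratings:
        return 0.0
    high = sum(1 for r in ratings if r >= threshold)
    return 100.0 * high / len(ratings)

# Hypothetical panel ratings for one dimension (e.g., quality)
quality_ratings = [5, 4, 3, 4, 2, 5, 4]
print(f"Percent deemed high quality: {percent_rated_high(quality_ratings):.1f}%")
```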

  8. How is Cost Defined? • OMB requests that each Federal program provide information on cost. • Cost is defined as the $ per unit of output. • Cost includes labor and other direct costs. • Cost includes only those expenses incurred by the Federal project. • For TA&D this is the cost of producing the ‘best’ practice/service.
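Since cost is defined as dollars per unit of output, counting only labor and other direct costs incurred by the Federal project, the measure is a simple ratio. The sketch below shows that arithmetic with hypothetical figures; the cost categories and numbers are assumptions, not actual project data.

```python
# Sketch: cost per unit of output for a 'best' product/service.
# Only labor and other direct costs incurred by the Federal project are
# counted, per the definition above. All figures are hypothetical.

labor_cost = 42_000.00         # assumed direct labor cost ($)
other_direct_costs = 8_500.00  # assumed other direct costs, e.g., printing ($)
units_of_output = 1_200        # assumed units delivered (copies, trainings, etc.)

cost_per_unit = (labor_cost + other_direct_costs) / units_of_output
print(f"Cost per unit of output: ${cost_per_unit:.2f}")
```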

  9. Long Term Measures • Implemented this year for the TA&D program for the first time • Twenty-two TA&D projects and ten State Deaf-Blind projects were randomly selected • Two Long-term Measures: • Implementation of Evidence-Based Practices • Implementation of Model-Demonstration Projects

  10. TA&D Program’s Performance Measures – Long-term Measure • Percentage of school districts and service agencies receiving Special Education TA&D services regarding scientifically based or evidence-based practices for infants, toddlers, children, and youth with disabilities that implement those practices (long-term).

  11. TA & D Program’s Performance Measures - Long-term Measure • The percentage of TA&D projects responsible for developing models that identify, implement, and evaluate effective models (long-term).

  12. The Method… Dr. Herbert Baum, Danielle Schwarzmann ICF Macro

  13. Selection and Data Collection • Selection • Random selection • Using the placemat • Additional sites suggested by OSEP to ensure evidence-based practice areas were represented • Data Collection • Initial e-mail • Follow-up/reminder e-mails • Phone calls when necessary
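The selection step pairs a simple random draw from the roster of projects with a few OSEP-suggested additions so that evidence-based practice areas are represented. A minimal sketch of that logic follows; the roster, sample size, suggested sites, and seed are all hypothetical.

```python
import random

# Sketch: random selection of projects, then OSEP-suggested sites added so
# evidence-based practice areas are covered. Roster, sample size, suggested
# sites, and seed are hypothetical.

all_projects = [f"Project {i:02d}" for i in range(1, 31)]  # placeholder roster
osep_suggested = ["Project 03", "Project 17"]              # assumed additions

random.seed(2009)  # fixed seed only so the sketch is reproducible
randomly_drawn = random.sample(all_projects, k=10)

# Add any OSEP-suggested site the random draw missed
selected = randomly_drawn + [p for p in osep_suggested if p not in randomly_drawn]
print(selected)
```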

  14. Process for Quality/Relevance/Usefulness Evaluation • E-mail requesting the ‘best’ product/service (including costs) and a list of services/products • Randomly selected a product/service as a ‘typical’ submission • Requested the program complete the ‘typical’ description

  15. Process for Quality/Relevance/Usefulness Evaluation • Products/services are evaluated against three criteria • Quality • Substance • Communication • Relevance • Need • Pertinence • Reach • Usefulness • Ease • Likelihood of Use • Replicability

  16. Process for Cost Evaluation • OMB requests each federal program to provide information on cost. • Cost is defined as the $ per unit of output. • For TA&D this is the cost of producing the ‘best’ practice/service.

  17. Process for Measure 1.1 – Long-term Measure: Implementation of Evidence-Based Practices Evaluation • TA&D centers submitted forms indicating their practices/programs and where they were being implemented • ICF Macro chose a practice and a place of implementation to complete a form

  18. Response Rates

  19. Lessons Learned & Implications for Grantees - Next Year • Consider using electronic submissions - you may not need to mail hard copies • You save money on shipping • Decreases response time • Trees are saved • Submit accessible products - diminish the use of copyrighted products as exemplars • Products, not links to websites, must be submitted • Talk with your Project Officer and TA recipients about the Program Evaluation

  20. Timeline for 2010 Annual Measures • April • Final review of methodology by OSEP • Preparation of protocol materials • May • Select sample of projects • Send requests to projects

  21. Timeline for 2010 Annual Measures • June • Obtain descriptions of ‘best’ and ‘typical’ products and services • Obtain lists of products and services • Obtain products • Obtain cost data • July – Review by expert panels • August – Generate measures • September – Report findings and recommendations to OSEP

  22. Lessons Learned & Implications for OSEP - Next Year • Collect feedback from 2009 samples and use to improve process • Share results • Discuss with grantees in kick-off meetings and during the year • Explore ways to integrate content from annual continuation reports into the annual program evaluation • Learn from other Federal agencies • Support and engage OSEP Project Officers

  23. Feedback • Comments, Questions, Concerns? • What could we do to improve our process? • What could we do to enhance definitions and instructions? • What can we do to make life easier for you?
