Presentation Transcript

  1. The Value of Data: The Vital Importance of Accountability. American Institutes for Research, February 2005

  2. Federal Uses of NRS Data • Develop report to Congress • Determine national progress for Program Assessment Rating Tool (PART) and Government Performance and Results Act (GPRA) performance • Assess national and state trends • Monitor program outcomes and data quality • Negotiate performance targets with states • Determine whether states met prior performance targets

  3. State Uses of NRS Data • Evaluate local program performance • Promote and evaluate local program improvement efforts • Report to legislatures • Negotiate state performance targets with feds

  4. Importance of Data • Critical to Federal accountability • Supports funding • Maintains unique program identity • Performance standards: GPRA, Office of Management and Budget (OMB)

  5. Government Performance and Results Act (GPRA) • Requires annual performance targets tied to program goals for all Federal programs • Adult education’s targets part of ED’s Strategic Plan • Targets—percentage who: • Acquire basic skills to complete level (ABE, ESL) • Transition to postsecondary education • Obtain GED • Enter employment

  6.–10. GPRA Performance, 2000–2004 [charts]

  11. Program Assessment Rating Tool (PART) • OMB review process to enforce GPRA • Every program reviewed & scored annually • PART scores must be submitted to Congress with budget request • Secretaries use PART scores to increase/decrease program $ request • Congress uses PART scores to make appropriations (e.g., de-fund programs)

  12. PART (FY05) Findings for Adult Ed. • Adult education’s scores (out of 100%): • Program purpose & design (100%) • Strategic planning (29%) • Program management (67%) • Program results (0%) • Summary: “Results Not Demonstrated”

  13. PART “Results” Findings • Progress toward long-term goals: No • Meet annual performance goals: No • Demonstrate efficiency and cost effectiveness: No • Compare favorably to other programs: No • Independent evaluation of program effectiveness: No

  14. PART

  15. PART • Example from appropriations committee report: “The Committee recommends no funding for [this program]. [ED] has not developed performance indicators consistent with the requirements of GPRA…. the Committee has chosen to focus its resources on higher priority programs.” p. 197

  16. From the President’s 2006 Budget: • [Reduced funding for this program] “… is consistent with the Administration's goal of decreasing funding for programs with limited impact or for which there is little or no evidence of effectiveness. A PART analysis of the program … produced a Results Not Demonstrated rating. The program was found to have a modest impact on adult literacy, skill attainment and job placement, but data quality problems … made it difficult to assess the program's effectiveness.”

  17. Independent Program Evaluation of Effectiveness • One element of PART where we failed; an evaluation study may be coming • To show impact we need: • Good assessment data • Assessments correctly administered • Pre- and post-test scores • Program models—program goals, approach, student participation • Meaningful instructional approach—with standards or a framework

  18. Setting State Performance Standards • Key to improving national performance • Promote continuous improvements • Problems resulting from: • High intra-state variance across years • Wide variation among states • Vastly exceed negotiated level

  19. Intra-state variance across years

  20. Wide variation among states

  21. Vastly exceed negotiated level

  22. Setting State Performance Standards • Compare with past years’ performance • Compare to national and median range • Equal or exceed actual performance • Show continuous improvement • State factors • Initiatives, policies, politics • Attendance, student variables

  23. Continued Vital Importance of Accountability Data • Situation similar to 1995 • Reauthorization—program will be evaluated • Funding—in a time of huge deficits • Renewed interest by the administration in block grants—workforce focus • Need to demonstrate value and identity of adult education program

  24. Discussion • Need valid and reliable data to counter PART findings • High-quality data • Improve GPRA measures for the program • Continuous program improvement is essential • Demonstrate program effectiveness • What improvements are needed?