
Challenges of Measuring Employment Program Performance



1. Challenges of Measuring Employment Program Performance
William S. Borden, November 2009

2. Topics
• Effective performance management
• Goals and definitions of measurement and measures
• Impact of performance systems on behavior
• Methods for obtaining reliable data
• Stakeholder input
• Fear and burden
• Accountability and complexity
• WIA performance measures

3. Operational Challenges of Performance Management
• Designing and implementing national performance systems involves a different set of tools than research or policy work
• Effective government performance management is based on software development methods
• High-value data require precise and objective definitions, detailed documentation, and sound software development and testing practices
• Highly fragmented national management information systems, imprecise definitions, and weak motivation to improve performance outcomes all pose risks to data quality

4. Comprehensive View of Employment Programs
• There is legitimate debate about the value of specialized service delivery programs for special populations:
  • Elderly poor
  • Disadvantaged youth
  • People with disabilities
  • Veterans
• Overlapping programs present a comparability challenge when assessing their effectiveness relative to mainline programs
• Service delivery fragmentation reduces management and data capacity and breeds resistance to increased reporting burden
• Small, fragmented programs forgo economies of scale, which further reduces management capacity

5. Effective Performance Management
• Performance data can provide essential management information at all program levels
• A good performance management process is a necessary foundation for research evaluations (otherwise the data will be unreliable)
• It is a very involved technical process
• Information is not useful without:
  • Precisely defined and objective measures and data elements
  • Extensive technical documentation
  • Standardized automated edits and calculations (see the sketch below)
  • Extensive software testing
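A minimal sketch of what a standardized automated edit might look like: a function that applies the same logical checks to every participant record before any measure is calculated. The record layout and field names are illustrative, not the actual WIA reporting layout.

```python
from datetime import date

def edit_record(rec: dict) -> list:
    """Return the list of edit failures for one participant record."""
    errors = []
    # Required fields must be present before any measure can be calculated.
    for field in ("participant_id", "enrollment_date"):
        if rec.get(field) is None:
            errors.append(f"missing {field}")
    # Logical edit: a customer cannot exit before enrolling.
    enroll, exit_ = rec.get("enrollment_date"), rec.get("exit_date")
    if enroll and exit_ and exit_ < enroll:
        errors.append("exit_date precedes enrollment_date")
    return errors

record = {"participant_id": "A-001",
          "enrollment_date": date(2009, 1, 15),
          "exit_date": date(2008, 12, 1)}
print(edit_record(record))  # ['exit_date precedes enrollment_date']
```

Because the same edit code runs at every level of the system, a record rejected locally is rejected identically in the national file.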

6. Effective Performance Management Lowers Costs
• Upfront investment in well-defined measures, data elements, measure calculations, and standardized tools
• Investments are leveraged across all levels of the system
• Data become much more accurate, timely, and useful
• Careful initial planning reduces the need to redesign and rebuild systems, meaning fewer rounds of stakeholder input
• Inconsistent and unreliable data are not cost effective

7. Market-Related Goals of Performance Management
• Determine program effectiveness and the return on public investment
• De-fund ineffective programs
• Provide incentives for high performance

8. Limitations of Market Motives
• Competition, profit, and loss translate poorly to government program evaluation
• Defining goals is difficult
• Performance-based budgeting is the ultimate market mechanism
  • It requires very precise and accurate data
  • It provides the maximum incentive for inappropriate behaviors (creaming; manipulating enrollment, exit, and exclusion data)
• Public programs have natural geographic and political monopolies (it is hard to de-fund Ohio and send customers to Michigan)

9. Goals of Performance as a Management Tool
• Understand basic facts about programs:
  • Customers served
  • Services provided
  • Results
• Detect superior and inferior performance and the associated service delivery approaches
• Act on findings by implementing remedial steps
• Identify and assimilate best practices
• Analyze performance trends

10. Defining Measures
• Measures must generate rates of success, not counts (see the sketch below)
• Measures must support tracking performance trends over time and comparing performance across operating units
• Outcome measures are better than process measures
• Intermediate measures of progress are needed when customers remain in services for a long time
• Standards are needed to identify acceptable and unacceptable performance
• Standards must be adjusted to account for differences in customers and labor markets
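A toy illustration of the first point: counts reward size, while rates let operating units of different scales be compared. All cohort figures below are invented.

```python
# Entered-employment outcomes for two hypothetical operating units.
exiters  = {"Unit A": 1200, "Unit B": 150}
employed = {"Unit A": 780,  "Unit B": 120}

for unit in exiters:
    rate = employed[unit] / exiters[unit]
    print(f"{unit}: {employed[unit]} employed ({rate:.0%} of {exiters[unit]} exiters)")

# Unit A leads on the raw count (780 vs. 120) but trails on the rate (65% vs. 80%),
# which is why measures must generate rates of success, not counts.
```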

11. Obtaining Reliable Data
• ETA has a strong data validation system covering WIA, NFJP, TAA, ES, and UI
  • It builds on a long history of performance measurement and data validation in the Unemployment Insurance program
  • Uniform national standards and software to edit, calculate, and validate data
• It is hard to define and document what makes data valid (for example, how do you document a homeless youth?)
• UI has a data quality standard based on review of sample cases, incorporating the standard error (see the sketch below)
• Employment training programs have no data quality standards and no calculation of standard error
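A hedged sketch of a UI-style sample review: estimate the error rate from a random sample of validated cases and attach a standard error, so that any pass/fail standard accounts for sampling uncertainty. The sample counts and the 95% multiplier here are illustrative, not the actual UI methodology.

```python
import math

def sample_error_rate(cases_failed: int, cases_reviewed: int):
    """Point estimate and standard error of an error rate from a sample review."""
    p = cases_failed / cases_reviewed
    se = math.sqrt(p * (1 - p) / cases_reviewed)  # standard error of a proportion
    return p, se

p, se = sample_error_rate(cases_failed=7, cases_reviewed=120)
print(f"estimated error rate: {p:.1%}, standard error: {se:.1%}")
print(f"95% upper bound: {p + 1.96 * se:.1%}")  # judge this, not p, against the standard
```

Without the standard error, a grantee whose true error rate is acceptable could fail the standard on sampling noise alone.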

12. Manipulating Performance
• Enrollment, exit, employment, and earnings are difficult to define, yet these data elements drive the calculations
• Some states cut enrollment in response to WIA to manage the flow of customers into the performance measures
• Responsibility for self-service customers is an open issue: how valid is it to measure the impact of so small an intervention, given the large infrastructure costs?
• Many customers were never exited from JTPA; WIA created the "soft exit" (no services for 90 days) so that everyone would be counted (see the sketch below)
• Grantees try to negotiate the lowest possible goals to leave room for improvement
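A sketch of the soft-exit rule described above, assuming (as WIA practice is usually described) that a customer with no services for 90 days is treated as exited and the exit is dated back to the last service. Field names are illustrative.

```python
from datetime import date, timedelta

SOFT_EXIT_GAP = timedelta(days=90)  # the 90-day no-service window

def soft_exit_date(last_service: date, as_of: date):
    """Return the imputed exit date, or None if the customer is still active."""
    if as_of - last_service >= SOFT_EXIT_GAP:
        return last_service  # exit is imputed back to the last service date
    return None

print(soft_exit_date(date(2009, 6, 1), as_of=date(2009, 10, 1)))   # 2009-06-01
print(soft_exit_date(date(2009, 9, 15), as_of=date(2009, 10, 1)))  # None
```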

13. Accountability and Complexity
• Stakeholders do not want to be accountable for circumstances beyond their control
• Customers "disappear" and become negative outcomes
  • These situations should occur randomly and evenly across states or grantees
  • If one state had a significantly higher percentage, that might indicate flaws
• Exclusions from performance: death, illness, incarceration (see the sketch below)
  • Death is the simplest case: exclude the record from performance
  • Illness, including a family member's illness, is very subjective and hard to document; it is more prevalent and problematic in the older worker program
• All of these factors greatly increase the complexity of the measures, and stakeholders then complain that the measures are too complex
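An illustrative version of the exclusion logic: records with a documented exclusion reason are dropped from the cohort before the rate is computed, so a grantee is not penalized for outcomes it cannot control. The reason codes are hypothetical.

```python
EXCLUSION_REASONS = {"death", "incarceration", "health/medical"}

def performance_cohort(records):
    """Drop excluded records; everyone remaining counts in the measure."""
    return [r for r in records if r.get("exclusion_reason") not in EXCLUSION_REASONS]

records = [
    {"id": 1, "employed": True},
    {"id": 2, "employed": False, "exclusion_reason": "death"},
    {"id": 3, "employed": False},
]
cohort = performance_cohort(records)
rate = sum(r["employed"] for r in cohort) / len(cohort)
print(f"{len(cohort)} records in cohort, employment rate {rate:.0%}")  # 2 records, 50%
```

Every exclusion reason added to such a set is another definition to document and validate, which is exactly how the measures grow complex.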

14. Stakeholder Involvement
• Almost all measures derive from legislation; agencies must develop the operational definitions and calculations
• Input from states, grantees, and local areas is valuable
  • They have strong knowledge of issues with the data
  • Their buy-in is critical for accepting rewards and sanctions and for using performance data as a management tool
• Expect resistance to measures, especially where management capacity is deficient
• Strong centralized leadership and effective communication of goals and methods are essential

15. Fear and Burden
• There is considerable fear of performance measures; the first reaction is to complain about the burden
• The reporting burden is exaggerated: performance reporting uses data that agencies already track for program management
• Follow-up data collection is the largest burden; it can be replaced with wage records
• Data validation is a large burden for family income, homelessness, and health-related performance exclusions
• The real risk is shifting focus from service delivery to making the numbers

16. WIA Performance Measures
• UI wage records are key to objective measurement of program outcomes
  • Long lags are a problem for prompt feedback to program operators
  • Considerable effort went into building a national wage file that includes federal and military employment
• Measuring earnings gain has been problematic: the pre-to-post-program ratio is distorted by pre-enrollment earnings gaps (see the sketch below)
• Skill and credential attainment rates were ill-defined
  • Reluctance to develop precise definitions
  • No usable data
• The new measures (diploma or certificate; literacy and numeracy) are much better: standardized and well defined, though very complex to calculate and test
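A toy example of the earnings-gain distortion noted above: a worker laid off before enrollment shows a large pre-to-post "gain" that reflects the earnings gap, not program impact. All dollar figures are invented.

```python
def earnings_gain_ratio(pre_quarters, post_quarters):
    """Ratio of total post-program earnings to total pre-enrollment earnings."""
    pre, post = sum(pre_quarters), sum(post_quarters)
    return post / pre if pre else float("inf")

steady   = earnings_gain_ratio([9000, 9000], [9500, 9500])  # continuously employed
laid_off = earnings_gain_ratio([9000, 0],    [9500, 9500])  # zero-earnings quarter pre-enrollment

print(f"steady worker:   {steady:.2f}x")   # 1.06x, a modest real gain
print(f"laid-off worker: {laid_off:.2f}x") # 2.11x, driven by the gap, not the program
```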

17. Conclusion
• Measures and data elements are hard to define and validate
• It is risky to draw strong conclusions from performance data
• Emphasis on sanctions and de-funding may promote inappropriate behavior
• Emphasis on management information and detection of problem areas promotes improvement and cooperation
• Reliable and comparable results require investment in technical infrastructure and standardization
