
Performance Measurement TJDR Peer to Peer Meeting October 7 and 10, 2011 Washington, D.C.


Presentation Transcript


  1. Performance Measurement, TJDR Peer to Peer Meeting, October 7 and 10, 2011, Washington, D.C. Janet Chiancone, Associate Administrator for Budget and Planning; Kristen Kracke, Performance Measures Coordinator

  2. Agenda • History on Federal Performance Measurement • Current Efforts: Where We Are Now • What Are Performance Measures • OJJDP Core Measures and TYP Data Reports • Why It Matters

  3. Federal History on Performance and Accountability • Government Performance and Results Act (GPRA) • Shift from accountability for process to accountability for results • Programs must show effectiveness to justify funding • PART (Program Assessment Rating Tool) • Several State-level efforts also in place

  4. Current Administration • Performance/accountability a priority • Economic realities make this necessary • WH position: “Chief Performance Officer” • White House websites: Performance.gov and USAspending.gov • “If we believe the government can make a difference in people’s lives, we have the obligation to prove that it works – by making government smarter, and leaner and more effective...” (President Barack Obama, April 13, 2011)

  5. Performance Management: President Obama’s Administration Six identified strategies with the highest potential for achieving meaningful performance improvement within and across Federal agencies: • Driving Agency Top Priorities • Cutting Waste • Reforming Contracting • Closing the IT Gap • Promoting Accountability and Innovation through Open Government • Attracting and Motivating Top Talent

  6. Funding and Information Flows • Programs need to show effectiveness to justify funding. • [Flow diagram: funding ($) moves from Congress and OMB through OJJDP to grantees and programs; performance information flows back from grantees and programs through OJJDP to Congress and OMB.]

  7. What Is Performance Measurement? A system for tracking the progress of chosen activities in accomplishing specific goals, objectives, and outcomes. Performance measurement: • Is directly related to program goals and objectives • Measures progress of the activities quantitatively • Is not exhaustive • Provides a “temperature” reading—it may not tell you everything you want to know, but it provides a quick and reliable gauge of selected results

  8. Evaluation vs. Performance Measurement (each feature below lists performance measurement first, evaluation second) • Question: How much? vs. What does it mean? • Example: game score vs. game analysis • Offers: a tally vs. causality • Timeframe: continuous (ongoing) vs. interval (discrete) • Cost: less expensive vs. more expensive. Performance measurement is necessary, but not sufficient, for evaluation.

  9. What are OJJDP’s Performance Measures?

  10. Office of Juvenile Justice and Delinquency Prevention’s Charge • Authorizing legislation is the Juvenile Justice and Delinquency Prevention Act of 2002 • Focus is on helping States and localities to respond to juvenile risk behavior and delinquency • Primary function of the agency is to provide program grant funding, and support research and technical assistance/training

  11. Diversity of Programs • Formula and block grants for States • Tribal Youth Programs • Discretionary competitive programs • Enforcing Underage Drinking Laws (block and discretionary grants) • Victimization grants (Amber Alert, Internet safety) • Congressional earmark grants

  12. OJJDP Generally Funds Four Types of Programs/Projects: • Direct Service Prevention • Direct Service Intervention • System Improvement • Research and Development

  13. Operationalizing Core Measures for OJJDP Programs • A small number of measures that directly link to OJJDP’s core directives. • Comparability within and across programs • A focus on quality services and youth outcomes

  14. OJJDP’s “Core” Measures • Percent of Program Youth who offend or reoffend • Percent of youth who are victimized

  15. Percent of Program Youth who exhibit a desired change in the targeted behavior. • Substance use • School attendance • School achievement • Social competence • Parenting • Gang activity • Cultural skill building/cultural pride [Several options – select most relevant behavior]
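To make the core measures concrete, here is a minimal sketch of how a grantee might tally them from its own tracking records. This is not OJJDP's reporting tool or the DCTAT schema; the record fields and the compute_core_measures helper are hypothetical, illustrative names.

```python
# Hypothetical sketch: tallying OJJDP-style core measures from local program records.
# Field names (offended, reoffended, victimized, behavior_tracked, behavior_improved)
# are illustrative only, not the DCTAT schema.

def percent(numerator: int, denominator: int) -> float:
    """Return a percentage, guarding against an empty denominator."""
    return round(100 * numerator / denominator, 1) if denominator else 0.0

def compute_core_measures(youth_records: list[dict]) -> dict:
    total = len(youth_records)
    return {
        "percent_offended_or_reoffended": percent(
            sum(r["offended"] or r["reoffended"] for r in youth_records), total),
        "percent_victimized": percent(
            sum(r["victimized"] for r in youth_records), total),
        "percent_desired_behavior_change": percent(
            sum(r["behavior_improved"] for r in youth_records),
            sum(r["behavior_tracked"] for r in youth_records)),
    }

# Example: three tracked youth during one reporting period.
records = [
    {"offended": False, "reoffended": False, "victimized": False,
     "behavior_tracked": True, "behavior_improved": True},
    {"offended": True, "reoffended": False, "victimized": False,
     "behavior_tracked": True, "behavior_improved": False},
    {"offended": False, "reoffended": False, "victimized": True,
     "behavior_tracked": False, "behavior_improved": False},
]
print(compute_core_measures(records))
```

The point of the sketch is the denominators: offending and victimization are measured against all program youth tracked, while the behavior-change measure is computed only over youth for whom that target behavior was tracked.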

  16. Tribal Youth Program (TYP) Grant Core Measures Data

  17. TYP Core Measures Data: Evidence-Based Programs • Evidence-based programs and practices have been defined as “programs and practices that have been shown, through rigorous evaluation and replication, to be effective at preventing or reducing juvenile delinquency or victimization, or related risk factors.” • Figure 1 presents the percentage of the 115 reporting TYP grantees that are implementing evidence-based programs and/or practices under the TYP grant. • A significant number of TYP grantees are implementing evidence-based programs and/or practices, as shown by the increase in the percentage of evidence-based programs implemented across all reporting periods. • During the January–June 2011 reporting period, approximately 39% (n=44) of TYP grantees implemented evidence-based programs and practices, totaling $16,048,726. Figure 1. Percentage of Grantees Implementing Evidence-Based Programs and/or Practices

  18. TYP Core Measures Data: Youth Served • During the current reporting period (January to June 2011), 15,355 youth and/or families were served across the 115 TYP grants reporting; 83% of those served were youth (n=12,712). • Youth and families completed 68,841 service hours, with 92% completed by youth. • Regarding offending among program participants, 2% of youth offended in the short term and 17% of youth re-offended during the reporting period. • Reported victimization levels among youth served were also relatively low: approximately 1% of youth tracked were victimized during the reporting period (short-term). • Similarly, reported re-victimization levels were low: approximately 2% of youth tracked were re-victimized during the reporting period. Figure 2. Number of Program Youth Served
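As a quick plausibility check on the counts and percentages reported in the two data slides above (the counts come from the slides; the recomputation and rounding are ours), the shares can be reproduced directly:

```python
# Recomputing the reported January–June 2011 shares from the counts in slides 17 and 18.
# Counts are taken from the presentation; the rounding below is ours.

ebp_grantees, reporting_grantees = 44, 115       # slide 17: grantees with evidence-based programs
total_served, youth_served = 15_355, 12_712      # slide 18: youth and/or families served

print(f"EBP grantees: {ebp_grantees / reporting_grantees:.0%}")          # ~38%; slide reports ~39%
print(f"Youth share of those served: {youth_served / total_served:.0%}") # ~83%, matching the slide
```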

  19. TYP Core Measures Data: Behavioral Change • As shown in Table 1, TYP grantees were required to measure performance and track data for certain target behaviors in each program category. The table lists the short-term percentages for the specified target behaviors across all program categories. • During the January to June 2011 reporting period, 8,529 youth received services for the noted target behaviors. • Eighty-seven percent of youth exhibited a change in behavior (see Table 1). Table 1. Target Behaviors, January–June 2011

  20. Performance Measurement • Accurate and timely reporting of performance measures data is an important element of project management • Performance measurement information is used to • Improve the operation of the program • Provide hard proof of how, when, and what your program is doing Note: OJJDP will begin performance measurement validation and verification this year (TYP in the future).

  21. Procedures for Maintaining Data • Maintain written documentation: • Electronic files (Include back-up procedures) • Hard copies (attendance sheets, for example) • Keep records as proof of reported data. • Decide who has access to the data (limited) • Develop security procedures for protecting data • Password-protected electronic files • Locked drawers for paper files • Use ID# instead of names
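As one illustration of the last two points on the slide above (keeping records as proof of reported data and using an ID# instead of names), the sketch below shows a hypothetical intake step that assigns sequential participant IDs and stores the name-to-ID link in a separate, restricted file. The file names and the enroll helper are illustrative assumptions, not an OJJDP-prescribed procedure.

```python
# Hypothetical example: keep identifying names in a separate, restricted roster and
# store only an ID# in the working performance-measurement records.
import csv
from itertools import count

_next_id = count(start=1001)

def enroll(name: str, roster_path: str = "name_to_id_restricted.csv") -> str:
    """Assign a new participant ID and record the name-to-ID link in a restricted roster.

    The roster file should live in an access-controlled, password-protected location
    with limited staff access, per the security bullets on the slide.
    """
    participant_id = f"TYP-{next(_next_id)}"
    with open(roster_path, "a", newline="") as f:
        csv.writer(f).writerow([participant_id, name])
    return participant_id

# The working data file references only the ID, never the name.
pid = enroll("Jane Doe")
with open("program_data.csv", "a", newline="") as f:
    csv.writer(f).writerow([pid, "2011-06-30", "attended", 2.0])  # ID, date, status, service hours
```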

  22. Resources on DCTAT http://www.ojjdp-dctat.org/ For DCTAT questions contact ojjdp-dctat@csrincorporated.com Toll-free Technical Assistance Hotline Number: 1-866-487-0512

  23. Listening and Feedback: • What performance measures make sense? Which ones don’t? • What data is easiest for you to collect? What data is the hardest? • What’s missing? • What questions do you have?
