
Preparing for Evaluation:

Preparing for Evaluation: Information on Data Collection and Research Design for SCA Statewide Recidivism Reduction Grantees. February 14, 2013. Council of State Governments Justice Center. Mike Eisenberg, Senior Research Manager; Phoebe Potter, Policy Analyst.




Presentation Transcript


  1. Preparing for Evaluation: Information on Data Collection and Research Design for SCA Statewide Recidivism Reduction Grantees
  February 14, 2013
  Council of State Governments Justice Center
  Mike Eisenberg, Senior Research Manager
  Phoebe Potter, Policy Analyst

  2. Why start planning for evaluation now?
  • Put data collection procedures in place
  • Get agreement on the goals of the program and the methods for evaluating it
  • Be prepared to collect individual-level data on participants and the control group at the outset of the enrollment process

  3. BJA’s Expectations around Evaluation
  • “In applying for these grants, lead grantees and their sub-grantees agree to cooperate in any and all related research efforts and program evaluations by collecting and providing enrollment and participation data during all years of the project.”
  • “Applicants further agree to implement random or other modes of participant assignment, required by the evaluation design; cooperate with all aspects of the evaluation project; and provide comparable individual-level data for comparison group members.”
  • “Applicants are encouraged to consider a partnership with a local research organization that can assist with data collection, performance measurement, and local evaluations.”
  FY 2012 SCA Statewide Recidivism Reduction Grant Solicitation

  4. Presentation Overview:
  Issue 1: Identifying a Target Population and Comparison Group
  Issue 2: Defining and Tracking Recidivism Measures
  Issue 3: Conducting a Process Evaluation
  Summary and Q&A

  5. Issue 1: Identifying a Target Population and Comparison Group
  Target population discussion:
  • High or moderate-high risk
  • Identified needs
  • What is an adequate sample size for evaluation?
  • Time needed to build the sample and conduct the evaluation
  Relevant information from the solicitation: “target population should be based on documented groups of offenders that significantly contribute to increased recidivism rates”

  6. Issue 1: Identifying a Target Population and Comparison Group
  Comparison group discussion: tiers of research design quality
  • Strongest design: individuals who qualify for the program are randomly assigned either to participate or to a control group
  • With adequate controls, weaker designs may include:
  • Comparison to a similar population in a different jurisdiction
  • Comparison to the overall population in the jurisdiction not receiving services
  • Pre-program/post-program comparison of outcomes for the target population
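The strongest tier above, random assignment, can be sketched in a few lines of Python. This is an illustrative example, not part of the presentation; the identifiers, group sizes, and fixed seed are assumptions made for the sketch.

```python
import random

def randomly_assign(eligible_ids, seed=42):
    """Split individuals who qualify for the program into a program group
    and a control group by random assignment (the strongest design tier).

    eligible_ids: hypothetical identifiers for qualifying individuals.
    A fixed seed keeps the assignment reproducible for auditing.
    """
    rng = random.Random(seed)
    ids = list(eligible_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"program": ids[:half], "control": ids[half:]}

groups = randomly_assign(range(100))
```

Because assignment is random, the two groups should be comparable on both observed and unobserved characteristics, which is what makes this the strongest design.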

  7. Issue 1: Identifying a Target Population and Comparison Group
  Which is better – the “Real Good Reentry” program (20% recidivism rate) or the “So-so Reentry” program (40% recidivism rate)?
  Ask: Are the populations served in each program comparable? In this example they are not, so the two programs cannot be fairly compared on recidivism rates alone.

  8. Issue 1: Identifying a Target Population and Comparison Group
  Relevant information from the solicitation: “Applicants agree to provide comparable individual-level data for comparison group members”
  Example: the “So-so Reentry” program has a 40% recidivism rate, while a comparison group that did not participate in the program has a 65% rate.
  Discussion: in order to compare outcomes, you must control for factors such as risk level, demographics, and criminal history.
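As a rough illustration of the comparability point (not from the slides), an evaluator might first check that the two groups look alike on an observable such as a risk-assessment score before comparing their recidivism rates. The scores, scale, and tolerance below are invented for the sketch; a real evaluation would use proper matching or statistical controls.

```python
def similar_risk_profiles(scores_a, scores_b, tolerance=0.5):
    """Crude comparability check: are the mean risk scores of two groups
    within `tolerance` points of each other? A real evaluation would also
    match or control for demographics and criminal history.
    """
    mean_a = sum(scores_a) / len(scores_a)
    mean_b = sum(scores_b) / len(scores_b)
    return abs(mean_a - mean_b) <= tolerance

# Hypothetical risk-assessment scores (higher = higher risk).
program_scores = [6, 7, 8, 7, 6]       # mean 6.8
comparison_scores = [3, 4, 3, 4, 3]    # mean 3.4
ok_to_compare = similar_risk_profiles(program_scores, comparison_scores)
```

Here `ok_to_compare` is False: the program group is much higher risk than the comparison group, so a raw comparison of recidivism rates would be misleading.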

  9. Presentation Overview:
  Issue 1: Identifying a Target Population and Comparison Group
  Issue 2: Defining and Tracking Recidivism Measures
  Issue 3: Conducting a Process Evaluation
  Summary and Q&A

  10. Issue 2: Defining and Tracking Recidivism Measures
  1. What is the recidivism rate of program participants? You need to ask:
  • What cohort is being tracked? For example: probationers, parolees, program participants?
  • What measure of recidivism is used? For example: arrest? Conviction? Reincarceration?
  • Is there a uniform follow-up period, and what is it? For example: 1 year? 2 years? 3 years?

  11. Issue 2: Defining and Tracking Recidivism Measures
  1. No national standard exists for defining recidivism
  2. Agencies use a variety of definitions: arrest, conviction, return to incarceration
  3. Standard follow-up periods are necessary to calculate recidivism rates; the follow-up period matters – a one-year rate will be lower than a three-year rate
  (Charts: percent returned to prison for a new offense or revocation of supervision, one-year vs. three-year tracking periods)
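The effect of the follow-up period can be made concrete with a small sketch (illustrative only; the record layout, dates, and cohort are assumptions, and “recidivism event” stands in for whatever definition the agency uses):

```python
from datetime import date

def recidivism_rate(cohort, follow_up_days):
    """Share of a release cohort with a recidivism event (e.g. return to
    prison for a new offense or revocation of supervision) within the
    follow-up window. Each record is (release_date, first_event_date_or_None).
    """
    hits = sum(
        1 for released, event in cohort
        if event is not None and (event - released).days <= follow_up_days
    )
    return hits / len(cohort)

# Hypothetical three-person cohort released January 1, 2010.
cohort = [
    (date(2010, 1, 1), date(2010, 6, 1)),  # event within year one
    (date(2010, 1, 1), date(2012, 3, 1)),  # event in year three
    (date(2010, 1, 1), None),              # no event in tracking period
]
one_year = recidivism_rate(cohort, 365)        # 1/3
three_year = recidivism_rate(cohort, 3 * 365)  # 2/3
```

For the same cohort and measure, the one-year rate (1/3 here) can never exceed the three-year rate (2/3), which is why a uniform follow-up period is required before rates can be compared across programs.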

  12. Issue 2: Defining and Tracking Recidivism Measures
  Recommended definition of recidivism for evaluation:
  • One-year follow-up
  • Rearrest and reincarceration:
  • Misdemeanor or felony arrest
  • Reincarceration for a new offense or revocation for supervision violations
  • Discussion: what is the agency’s current recidivism definition?
  Relevant information from the solicitation: “For purposes of this solicitation, ‘recidivism’ is defined in accordance with the current definition utilized by the applicant agency”

  13. Issue 2: Defining and Tracking Recidivism Measures
  Recidivism data and follow-up period:
  • What data sources are available for tracking recidivism?
  • Are you tracking recidivism rates of participants over a standard follow-up period?
  • Discussion: differences between the recommended tracking rates and those in the Performance Measurement Tool (PMT)

  14. Presentation Overview:
  Issue 1: Identifying a Target Population and Comparison Group
  Issue 2: Defining and Tracking Recidivism Measures
  Issue 3: Conducting a Process Evaluation
  Summary and Q&A

  15. Issue 3: Conducting a Process Evaluation
  • A process evaluation can determine whether a program is implemented in a way consistent with proven successful interventions.
  • A process evaluation should examine:
  • Is the program utilizing a design that has previously demonstrated an ability to reduce recidivism?
  • Is the program being implemented as designed?
  • Are staff training and experience adequate to deliver the program as designed?

  16. Issue 3: Conducting a Process Evaluation
  • A process evaluation should examine (cont’d):
  • Are risks/needs assessed, and are services delivered consistent with those risks and needs?
  • Is the delivery of these services consistent over time?
  Discussion: grantee plans for conducting a process evaluation

  17. Presentation Overview:
  Issue 1: Identifying a Target Population and Comparison Group
  Issue 2: Defining and Tracking Recidivism Measures
  Issue 3: Conducting a Process Evaluation
  Summary and Q&A

  18. Summary
  • Target population – what matters:
  • Moderate-high/high risk
  • Identified needs
  • Control group – what matters:
  • Similar risk level, demographics, criminal history
  • Comparable measures of recidivism and follow-up period
  • Recidivism measure – recommendations:
  • One-year follow-up
  • Rearrest and reincarceration: misdemeanor or felony arrest; reincarceration for a new offense or revocation for supervision violations
  • Process evaluation – why it matters:
  • Ensure the program design is evidence-based
  • Ensure the program is being implemented as designed

  19. Key Quarterly Checkpoints

  20. Thank You! Questions?
  Mike Eisenberg, Senior Research Manager, meisenberg@csg.org
  Phoebe Potter, Policy Analyst, ppotter@csg.org
  The presentation was developed by members of the Council of State Governments Justice Center staff. Because presentations are not subject to the same rigorous review process as other printed materials, the statements made reflect the views of the authors and should not be considered the official position of the Justice Center, the members of the Council of State Governments, or the funding agency supporting the work.
