This document provides an in-depth exploration of performance indicators for community-oriented defense practices, as presented by Michael Rempel at the Community-Oriented Defense Network Conference in July 2009. It emphasizes the significance of defining project goals and objectives, particularly focusing on rehabilitation, community engagement, and mitigating the collateral consequences of convictions. Key concepts include SMART objectives, distinctions between process and impact indicators, and examples from drug courts and juvenile accountability programs. This overview serves as a guide for implementation and evaluation in community-oriented defense initiatives.
Community-Oriented Defense Performance Indicators: A Conceptual Overview
Michael Rempel, Center for Court Innovation
Presented at the Community-Oriented Defense Network Conference, "Performance Measures - Making the Case for Community-Oriented Defense," New York, NY, July 23, 2009
Project Goals
• Goals: define the overall mission or purpose of the project; provide a "definition of success."
• Community-Oriented Defense Examples:
  • Rehabilitation: address defendants' underlying problems
  • Collateral Consequences: mitigate adverse effects of conviction
  • Community Engagement: seek community input in programming
Project Objectives
Objectives: support the goals and explain exactly how they will be accomplished. Objectives are SMART!
• Specific: pertain to a certain task or program
• Measurable: quantifiable
• Achievable: doable within constraints (e.g., financial or staffing resources)
• Results-Oriented: focused on short-term activities to attain longer-term goals
• Time-bound: include a date by which the objective must be completed
Performance Indicators
Objectives translate into performance indicators.
• Quantitative:
  • Number (#)
  • Percent (%)
  • Yes or no (y/n): something happened or not
• Feasible:
  • Relevant data can be captured/tracked
  • Appropriate control group can be identified (if necessary)
Process vs. Impact Indicators
• Process Indicators: Did the intended program activities take place?
  • Failure of Implementation: the program model was not implemented as designed (few clients, intended services not delivered, best practices not followed, compliance not monitored, etc.)
  • Example: the Bronx Juvenile Accountability Court
• Impact Indicators: Did the program have the intended effects?
  • Failure of Design: the program model was implemented as designed, but the model's theory of change was flawed
  • Example: batterer programs for domestic violence offenders
Role of Control Groups
• Process Indicators: Control group unnecessary (just measure whether the program included the intended elements: volume, services, staffing, etc.)
• Impact Indicators: Control group essential (impact cannot be determined in the absence of a comparison)
  • Did the program reduce recidivism if participants were arrested less often than before participation began?
  • Did the program reduce recidivism if completers were arrested less often than dropouts?
  • Did the program reduce recidivism if participants were arrested less often than a control group?
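A small calculation can make this distinction concrete. The following is a minimal sketch, assuming hypothetical per-person records with a group label and a one-year re-arrest flag (none of these figures come from the presentation); only the participant-versus-control comparison supports a claim about program impact.

```python
# Hypothetical records, one per person (illustration only, not real data).
people = [
    {"group": "participant", "rearrested_1yr": False},
    {"group": "participant", "rearrested_1yr": True},
    {"group": "participant", "rearrested_1yr": False},
    {"group": "control",     "rearrested_1yr": True},
    {"group": "control",     "rearrested_1yr": True},
    {"group": "control",     "rearrested_1yr": False},
]

def rearrest_rate(records, group):
    """Percent (%) re-arrested within one year for a given group."""
    subset = [r for r in records if r["group"] == group]
    if not subset:
        return 0.0
    return 100.0 * sum(r["rearrested_1yr"] for r in subset) / len(subset)

# The impact question is whether participants were re-arrested less often
# than a comparable control group; before/after or completer/dropout
# comparisons lack a valid counterfactual and cannot answer it.
participant_rate = rearrest_rate(people, "participant")
control_rate = rearrest_rate(people, "control")
print(f"Participants: {participant_rate:.1f}%  Controls: {control_rate:.1f}%")
```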
Sample Goal: Rehabilitation
• Assessment: Determine each client's individual needs
  • Percent (%) of all clients screened or assessed for problems
• Referrals: Refer more clients to treatment/services
  • Total # and % of clients referred for onsite or outside services
  • Breakdowns by service type: e.g., substance abuse treatment, mental health treatment, employment services, GED classes
• Dosage: Increase the treatment dosage that clients receive
  • Show-up rate: Percent (%) of referrals at first appointment
  • Program completion rate (%)
• Crime/Delinquency: Produce recidivism reduction
  • Percent (%) re-arrested in one year: participants vs. controls
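To show how rehabilitation indicators like these might be tallied in practice, here is a minimal sketch; the client records, field names (screened, referred_to, attended_first, completed), and service categories are hypothetical stand-ins, not data from the presentation.

```python
# Hypothetical client records (illustration only).
clients = [
    {"screened": True,  "referred_to": ["substance abuse"],      "attended_first": True,  "completed": True},
    {"screened": True,  "referred_to": ["mental health", "GED"], "attended_first": True,  "completed": False},
    {"screened": False, "referred_to": [],                        "attended_first": False, "completed": False},
    {"screened": True,  "referred_to": ["employment"],            "attended_first": False, "completed": False},
]

def pct(numerator, denominator):
    """Percent, guarding against an empty denominator."""
    return 100.0 * numerator / denominator if denominator else 0.0

total = len(clients)
screened_pct = pct(sum(c["screened"] for c in clients), total)

# Referrals: total, percent, and breakdown by service type.
referred = [c for c in clients if c["referred_to"]]
referred_pct = pct(len(referred), total)
by_service = {}
for c in referred:
    for service in c["referred_to"]:
        by_service[service] = by_service.get(service, 0) + 1

# Dosage: show-up rate at first appointment and program completion rate.
show_up_pct = pct(sum(c["attended_first"] for c in referred), len(referred))
completion_pct = pct(sum(c["completed"] for c in referred), len(referred))

print(f"% screened/assessed: {screened_pct:.0f}%")
print(f"# referred: {len(referred)} ({referred_pct:.0f}%), by service: {by_service}")
print(f"Show-up rate: {show_up_pct:.0f}%   Completion rate: {completion_pct:.0f}%")
```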
Drug Courts: Framework
• Mission: Link drug-addicted defendants to court-supervised treatment as an alternative to incarceration.
• Content (typical):
  • Referral to community-based treatment
  • Ongoing judicial oversight (frequent judicial status hearings, drug testing, case management, sanctions, rewards)
  • Program duration of at least one year
• Major Goals (typical):
  • Recidivism reduction
  • Offender rehabilitation (primarily via reduced drug use)
  • Cost savings (to the criminal justice system, crime victims, etc.)
Drug Courts: Key Process Indicators
• Volume: Indicates "reach" (how many can benefit?)
  • # of cases referred to drug court
  • # and % of referrals becoming program participants
• Processing Speed: Indicates "immediacy" (are addicted defendants rapidly placed and engaged?)
  • Average # of days from arrest to intake
  • Average # of days from intake to first treatment placement
• Retention: Indicates engagement (what % became invested in recovery?) and dosage (was it sufficient?)
  • One-year retention rate (%): percent of participants who graduated or were still active one year after entering the program
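As a minimal sketch of how these process indicators could be computed from case-level dates and statuses: the field names below (arrest_date, intake_date, first_treatment_date, status_at_1yr) are hypothetical placeholders for whatever a court's case management system actually records.

```python
from datetime import date

# Hypothetical drug court cases (illustration only).
cases = [
    {"referred": True, "participant": True,  "arrest_date": date(2009, 1, 5),
     "intake_date": date(2009, 1, 20), "first_treatment_date": date(2009, 2, 1),
     "status_at_1yr": "graduated"},
    {"referred": True, "participant": True,  "arrest_date": date(2009, 2, 10),
     "intake_date": date(2009, 3, 1),  "first_treatment_date": date(2009, 3, 10),
     "status_at_1yr": "active"},
    {"referred": True, "participant": False, "arrest_date": date(2009, 3, 2),
     "intake_date": None, "first_treatment_date": None, "status_at_1yr": None},
]

def mean(values):
    return sum(values) / len(values) if values else 0.0

# Volume: reach of the program.
n_referred = sum(c["referred"] for c in cases)
participants = [c for c in cases if c["participant"]]
participation_pct = 100.0 * len(participants) / n_referred if n_referred else 0.0

# Processing speed: immediacy of placement and engagement.
arrest_to_intake = mean([(c["intake_date"] - c["arrest_date"]).days for c in participants])
intake_to_treatment = mean([(c["first_treatment_date"] - c["intake_date"]).days for c in participants])

# Retention: graduated or still active one year after entry.
retained = [c for c in participants if c["status_at_1yr"] in ("graduated", "active")]
retention_pct = 100.0 * len(retained) / len(participants) if participants else 0.0

print(f"Referred: {n_referred}; participants: {len(participants)} ({participation_pct:.0f}%)")
print(f"Avg days arrest to intake: {arrest_to_intake:.1f}; intake to first treatment: {intake_to_treatment:.1f}")
print(f"One-year retention rate: {retention_pct:.0f}%")
```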
Drug Courts: Key Impact Indicators
• Recidivism Rate: Did the program ultimately reduce recidivism?
  • Re-arrest rates after 1, 2, or 3 years
  • Re-conviction rates after 1, 2, or 3 years
• Drug Use: Did the program ultimately foster recovery?
  • Drug test results or self-reported use after 1, 2, or 3 years
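The same case-level approach extends to impact indicators, provided a control group is tracked over the same follow-up windows. Below is a minimal sketch, assuming hypothetical re-arrest dates measured from program entry; the records and dates are invented for illustration.

```python
from datetime import date

# Hypothetical follow-up records for participants and controls (illustration only).
followups = [
    {"group": "participant", "entry_date": date(2009, 1, 1), "rearrest_date": None},
    {"group": "participant", "entry_date": date(2009, 1, 1), "rearrest_date": date(2010, 6, 1)},
    {"group": "control",     "entry_date": date(2009, 1, 1), "rearrest_date": date(2009, 9, 1)},
    {"group": "control",     "entry_date": date(2009, 1, 1), "rearrest_date": date(2011, 2, 1)},
]

def rearrest_rate(records, group, years):
    """Percent re-arrested within `years` of program entry for the given group."""
    subset = [r for r in records if r["group"] == group]
    hits = sum(
        1 for r in subset
        if r["rearrest_date"] is not None
        and (r["rearrest_date"] - r["entry_date"]).days <= 365 * years
    )
    return 100.0 * hits / len(subset) if subset else 0.0

# Report re-arrest rates at each follow-up horizon, participants vs. controls.
for years in (1, 2, 3):
    p = rearrest_rate(followups, "participant", years)
    c = rearrest_rate(followups, "control", years)
    print(f"{years}-year re-arrest rate: participants {p:.0f}% vs. controls {c:.0f}%")
```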
Impacts #1: NYS Drug Courts on Recidivism (results chart). Source: Rempel et al. (2003).
Impacts #2: USA Drug Courts on Drug Use (results chart). Source: Rossman and Rempel (2009).
Funding Considerations
• General Rules:
  • Do anecdotes help? No
  • Must quantitative indicators be included? Yes
• Common Types of Indicators:
  • Bean Counting: actual vs. target volume
  • Fidelity Measures: % implemented of all proposed activities
  • Completion Rates: actual vs. target completion rates
  • Recidivism Rates: actual vs. target re-arrest rates
• The Court Perspective:
  • Recidivism, recidivism, recidivism
  • Cost savings
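As a small illustration of the actual-versus-target comparisons funders typically expect, here is a hedged sketch; the indicator names and all target figures are invented for the example and do not come from the presentation.

```python
# Hypothetical actual-vs-target report (figures invented for illustration).
indicators = {
    "clients enrolled":               {"actual": 85, "target": 100},  # bean counting
    "proposed activities delivered":  {"actual": 9,  "target": 12},   # fidelity
    "program completions":            {"actual": 40, "target": 60},   # completion rate
    "one-year re-arrests (lower is better)": {"actual": 18, "target": 25},
}

for name, figures in indicators.items():
    pct_of_target = 100.0 * figures["actual"] / figures["target"]
    print(f"{name}: {figures['actual']} of {figures['target']} targeted ({pct_of_target:.0f}%)")
```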
Exercise: Goals to Indicators
• Group Exercise:
  • Identify goals of Community-Oriented Defense Programs
  • State which objectives follow from the identified goals
  • Develop specific and quantifiable performance indicators
• Tip: Begin each indicator with words like "percent," "number," or "average" (i.e., make sure it is quantitative)
• Reality Check: Do your identified goals, objectives, and indicators relate to your actual program activities?
• Bonus Questions: What data do you need to obtain your indicators? How can you obtain it?