
Michigan Statewide HMIS



Presentation Transcript


  1. Michigan Statewide HMIS Performance Improvement using Outcome & Indicator Measures

  2. About Michigan Michigan Statewide HMIS • 2.5 years old • 57 CoCs including Detroit • 370+ agencies with 1,322 programs • 3 State Departments (DHS, DCH, DOE) • Over 120,000 unique clients

  3. Michigan Vertical Configuration

  4. Measurement: An Ongoing Collaboration • Funding Organizations • Program Leadership • Staff who collect information and enter data

  5. Selecting What to Measure • Mix of measures • Process Indicators: time to referral or placement in housing; percentage returning for ongoing services • Short-term or intermediate objectives (measure critical processes): positive housing placement at discharge; improved income at discharge; linked with needed supportive services; percent re-admissions • Outcomes (measure sustained change): stable housing; stable employment

  6. Coordinating the Process of Measurement • How do we know what success looks like? • Client Characteristics / Who is Served • When measurement occurs • Establishing Base Lines • Benchmarking / Comparing Performance • Improving Performance • The Environment / Contextual Variables • Evaluate Outcomes

  7. Defining the Population: Who is Served • Developing shared screening questions. Data Standards plus: • Have they had a lease in their own name? • Were they homeless as children? • What is their housing history? • What is their education history? • What is their employment history? • What is their health history? • Are there disabilities of long duration? • Using a Self Sufficiency Score

  8. When Measurement Occurs • Intermediate Outcomes (Processes/Indicators): Measured during care or at discharge; reflect stable and optimized program processes. Examples: length of time to critical referrals, % of staff turnover, # of re-admissions, session compliance, medication errors, critical incidents, customer satisfaction. • Outcomes: Measured after discharge for some period of time. Were changes achieved in care sustained? Did we impact other long-term primary outcomes? Examples: positive housing placement, improved income, employment/school.

  9. Base Lines & Targets • Critical for evaluating change and optimizing performance • Realistic / honest targets • Stabilized processes: percentiles are consistent across time • Systematic and routine measurement before, during, and after program change

  10. Benchmarking • Working with “like” programs to: • Identify realistic targets • Identify performance that is off norm • Share ideas to improve • Address measurement problems • Build transparency

  11. Example Control Chart: School Attendance
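A control chart like the one referenced on this slide can be approximated with a p-chart, the standard chart for tracking a proportion such as an attendance rate: each month's rate is plotted against a center line at the overall rate, with control limits three standard errors out. The sketch below uses made-up monthly values (days the plan was met, available education days), not Michigan figures:

```python
import math

# p-chart for a monthly attendance rate. The (met, available) pairs are
# made-up illustration data, not actual Michigan HMIS figures.
monthly = [(41, 50), (39, 50), (43, 50), (40, 50), (20, 50), (42, 50)]

total_met = sum(met for met, _ in monthly)
total_days = sum(n for _, n in monthly)
p_bar = total_met / total_days  # center line: overall attendance rate

signals = []  # months whose rate falls outside the 3-sigma control limits
for i, (met, n) in enumerate(monthly):
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    ucl, lcl = p_bar + 3 * sigma, p_bar - 3 * sigma
    rate = met / n
    if rate > ucl or rate < lcl:
        signals.append(i)

print(f"center line: {p_bar:.0%}; out-of-control months: {signals}")
```

Here the fifth month's 40% rate falls below the lower limit, so the chart flags it as a real process change worth investigating rather than routine month-to-month variation.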

  12. Defining the Measure • Purpose: To measure the Program's ability to support the educational plan of children in care. • A client is considered to have met their individual education plan for the day if they participated for 75% or more of the time they were scheduled to be involved in education activities. • Available education days reflect only days when it was possible for the client to participate in their individual education plan. For example, if the resident's educational plan is to attend school, only those days that the school was open during the reporting period are counted. Excused absences do not count as possible days. If a client was excused due to illness, a pass, or an appointment, these days are not considered available education days.
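The definition above translates directly into a small calculation: excused days drop out of the denominator, and a client meets the plan on a day they attended 75% or more of the scheduled time. A hypothetical sketch (record layout and field names are illustrative, not the HMIS schema):

```python
# Hypothetical daily records; field names are illustrative, not HMIS fields.
# "excused" covers illness, passes, and appointments, which the measure
# definition removes from the available-education-day denominator.
days = [
    {"scheduled_hours": 6, "attended_hours": 6.0, "excused": False},  # 100% -> met
    {"scheduled_hours": 6, "attended_hours": 4.0, "excused": False},  # 67% -> not met
    {"scheduled_hours": 6, "attended_hours": 0.0, "excused": True},   # excluded
    {"scheduled_hours": 6, "attended_hours": 5.0, "excused": False},  # 83% -> met
]

available = [d for d in days if not d["excused"] and d["scheduled_hours"] > 0]
met = [d for d in available
       if d["attended_hours"] / d["scheduled_hours"] >= 0.75]

rate = len(met) / len(available)
print(f"education plan met on {len(met)} of {len(available)} available days ({rate:.0%})")
```

With the excused day excluded, the rate is 2 of 3 available days rather than 2 of 4, which is the point of the careful denominator in the measure definition.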

  13. Creating Management Information • Cutting the performance rate to guide data-driven management decisions. • Example 1: Employment: The overall average is 50%, but for women the average is 65% and for men it is 30%. Do you revise services to men, or serve only women? • Example 2: Retention in housing: The overall average is 72%, but for those with substance abuse problems it is only 15%. You conduct a study to determine barriers for those clients.
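Cutting a rate by subgroup, as in the two examples, is a simple grouped tally. A minimal sketch of Example 1 with made-up discharge records (not real client data):

```python
from collections import defaultdict

# Made-up discharge records illustrating a subgroup cut; not real data.
clients = [
    {"gender": "F", "employed_at_discharge": True},
    {"gender": "F", "employed_at_discharge": True},
    {"gender": "F", "employed_at_discharge": False},
    {"gender": "M", "employed_at_discharge": True},
    {"gender": "M", "employed_at_discharge": False},
    {"gender": "M", "employed_at_discharge": False},
]

totals = defaultdict(lambda: [0, 0])  # group -> [successes, clients served]
for c in clients:
    bucket = totals[c["gender"]]
    bucket[0] += c["employed_at_discharge"]  # True counts as 1
    bucket[1] += 1

overall = sum(c["employed_at_discharge"] for c in clients) / len(clients)
print(f"overall: {overall:.0%}")
for group, (hits, n) in sorted(totals.items()):
    print(f"{group}: {hits}/{n} = {hits / n:.0%}")
```

The overall 50% masks the 67% / 33% split between groups, which is exactly the kind of difference the slide's examples use to prompt a service or study decision.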

  14. The Context of Services • Lack of affordable housing • Vacancy rates • Unemployment rates • Lack of access to supporting services • Long Waiting Lists • No Insurance or Pay Source • Barriers to success or community assets

  15. Supporting Performance Improvement? • Support the development of initial measures (indicators, objectives and outcomes) and targets through meetings with funders, program leadership and staff. • Provide a menu of measures for program leadership to select from.

  16. Project Activities Continued • Provide reports that may be run monthly at the click of a button on all selected measures. The reports will aggregate data viewable through the HMIS for Programs, Agencies, CoCs, and Statewide. • Convene routine Benchmarking meetings to discuss measurement issues as well as performance ideas.

  17. Project Activities Continued • Maintain, with local support, a database of contextual variables. • Share/problem-solve measurement issues. Support the evolution of measures. • Provide data for statewide analysis as defined in contracts.

  18. Contact Information • Barbara Ritter • 517-230-7146 • britter@mihomeless.org • MCAH WEB SITE: www.mihomeless.org
