
Results Not Demonstrated




Presentation Transcript


  1. AKA Results Not Demonstrated

  2. National Picture

  3. Highlights from the Feb 2008 SPP and APR review: Child Outcomes - Indicators C3 & B7; Family Outcomes - Indicator C4

  4. State Approaches to Measuring Child Outcomes

  5. All approaches have challenges . . .

  6. All approaches have challenges . . .

  7. All approaches have challenges . . .

  8. Number of Children Included in Feb '08 SPP/APR Data • Part C (52 states reporting), range 1-5,944: <30 children = 19 states; 30-99 = 15; 100-499 = 11; 500-999 = 4; 1,000+ = 3 • Preschool (53 states reporting), range 1-4,249: <30 children = 7 states; 30-99 = 13; 100-499 = 14; 500-999 = 8; 1,000+ = 11
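The per-bin state counts above can be cross-checked against the number of states reporting; a minimal sketch (bin labels taken from the slide):

```python
# Cross-check: the per-bin state counts on the slide should sum to the
# number of states that reported data (52 for Part C, 53 for Preschool).
part_c = {"<30": 19, "30-99": 15, "100-499": 11, "500-999": 4, "1000+": 3}
preschool = {"<30": 7, "30-99": 13, "100-499": 14, "500-999": 8, "1000+": 11}

print(sum(part_c.values()))     # 52
print(sum(preschool.values()))  # 53
```

Both distributions are internally consistent with the reported totals.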

  9. Part C- Trends across the 3 Outcomes

  10. Part C Trends including states with N>30

  11. Preschool – Trends across the 3 Outcomes (53 out of 60 States)

  12. Preschool Trends including States with N>30

  13. Assessment Tool Trends • Part C: HELP, BDI-2, AEPS, Carolina, ELAP • Preschool: Creative Curriculum, BDI-2, Brigance, AEPS, High Scope, WSS

  14. Populations Included • Part C: 40 states statewide; 6 phasing in; 6 sampling • Preschool: 23 states statewide; 14 phasing in; 6 sampling; 5 included children in other EC programs

  15. Definitions of “near entry” • Part C: a variety of starting points, with the initial IFSP the most common reference point; earliest: as part of intake or with eligibility determination; latest: within 6 months of enrollment • Preschool: wide variation, from 30 days to 4 months from entry; states using ongoing assessments used the “fall” data point for entry

  16. Definitions of “near exit” • Part C: about half defined near exit, typically within 30 to 60 days of exit • Preschool: about two thirds provided a definition, ranging from 30 days to 6 months and including spring assessment points and “end of the school year”

  17. Criteria for “same-aged peers” • COSF: a rating of 6 or 7 on the scale, by definition • Single tool statewide: variation in criteria across states; e.g., on the BDI, 1.3 SD below the mean for two states and 1.5 SD below the mean for another state • Publisher’s analysis of the data, intended to correspond to COSF summary ratings
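To illustrate how the SD-based criteria translate into score cutoffs, here is a minimal sketch assuming a standard-score metric with mean 100 and SD 15 (the slide names only the SD criteria; the mean/SD values are an assumption for illustration):

```python
# Convert "k SD below the mean" criteria into standard-score cutoffs.
# ASSUMPTION: standard-score metric with mean 100, SD 15 (not stated on slide).
MEAN, SD = 100, 15

def cutoff(sd_below_mean):
    """Lowest score still counted as comparable to same-aged peers."""
    return MEAN - sd_below_mean * SD

print(cutoff(1.3))  # 80.5 -> criterion used by two states
print(cutoff(1.5))  # 77.5 -> criterion used by another state
```

The point of the comparison: the same child could be classified differently across states purely because of the 1.3 vs 1.5 SD decision rule.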

  18. Caution – Interpreting Data • Data only represent children who have entered and exited since the outcomes system was put in place in the state • In a typical state, the data may represent children who participated in the program for only 6 to 12 months • The quality of data collection usually increases over time as guidance gets clearer and practice improves implementation

  19. Scanning Your Data for Unusual Patterns • First, focus on progress categories “a” and “e” • The percentages should reflect the characteristics of the children served in the state (e.g., the eligibility definition in Part C) • They should be fairly stable over time (when data are high quality and representative)

  20. Checking Category “a” • Are the percentages too high? Category “a” should represent children with very significant delays or degenerative conditions (any improvement in functioning puts a child into “b”) • Why it may be too high: decision rules based on a different interpretation of “no progress”; tools without enough discrimination to show small amounts of progress

  21. Checking Category “e” • Are the percentages too high or too low? Category “e” should represent children functioning at age expectations at entry and exit in each outcome area • Do your patterns make sense for each outcome, given the children served in the state? • Why it may be too high or low: decision rules based on a different interpretation of “age expectations”

  22. The validity of your data is questionable if . . . • The n is too small • The overall pattern in the data looks “strange” compared to what you expect, to other data, or to similar states • The data are not representative of all areas of the state, of all kinds of families, or of all kinds of children
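The screening steps above (small n, extreme category “a”, implausible category “e”) can be sketched as a simple data-quality check. The thresholds below (n ≥ 30, category “a” ≤ 10%, category “e” between 5% and 60%) are illustrative assumptions, not values from the presentation:

```python
# A minimal sketch of the screening described above: flag a state's outcome
# data when the n is small or a progress-category percentage looks extreme.
# ASSUMPTION: the numeric thresholds are placeholders, not official criteria.

def flag_issues(n, pct_by_category):
    """Return a list of warnings for one outcome area's progress data."""
    issues = []
    if n < 30:
        issues.append(f"n={n} is too small to interpret reliably")
    if pct_by_category.get("a", 0) > 10:
        issues.append("category 'a' unusually high: check 'no progress' rules")
    e = pct_by_category.get("e", 0)
    if not 5 <= e <= 60:
        issues.append("category 'e' outside plausible range: check "
                      "'age expectations' decision rules")
    return issues

# Example: a small-n state with a suspiciously large "no progress" group.
print(flag_issues(24, {"a": 18, "b": 30, "c": 25, "d": 15, "e": 12}))
```

A real check would also compare the state's pattern against prior years and against similar states, as the slide suggests.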

  23. Improvement Activities • Almost all states (Part C and 619) are conducting training and professional development on: assessment strategies; data collection procedures; data analysis and use; and (a little bit of) practices to improve child outcomes

  24. Improvement Activities • Improving infrastructure for providing TA and support • Conducting evaluation: reviewing data for accuracy and quality; reviewing and revising processes; identifying successes and challenges in the implementation of the outcomes system • Improving data collection and reporting

  25. Results Not Demonstrated

  26. Family Outcomes

  27. Part C Tools for Family Outcomes • 28 states (52%) reported that they provided translations and/or translation services to assist families

  28. Population

  29. Variations in Target Populations 5 states did not report on criteria for the population

  30. Representativeness

  31. Timeframes for Data Collection

  32. Overall Trends for Part C Family Outcomes

  33. Response Rate Variation

  34. Trends in Improvement Activities • Clarifying and developing policies and procedures (40 states): clarification of policies regarding family rights and family-centered services; modifications to procedures related to the implementation of family surveys • Providing training and professional support (28 states): to providers and service coordinators regarding family rights and procedural safeguards; effective practices relating to family-centered services; understanding the procedures for implementing the measurement of family outcomes; and understanding and using the family survey data for program improvement

  35. Trends in Improvement Activities • Conducting evaluation (27 states): evaluating the processes used to implement family outcome measurement in FFY 2005 (including distribution methods, follow-up, and methods of analysis); family focus groups or random interviews with families to validate outcomes data • Improving data collection and reporting (25 states): developing strategies for improving family survey response rates and the representativeness of the data

  36. Tools used for B8: Preschool Parent Involvement

  37. Results Not Demonstrated

  38. Themes and Agenda

  39. Preparing for the Future • Setting Targets • Improving Data Quality • Training & TA Capacity • Written policies and procedures • Analysis and interpretation of the data • Quality Assurance / Monitoring • Improvement Planning (for better data collection and for improved outcomes)

  40. Child Outcomes

  41. Family Outcomes Child Outcomes

  42. Teacher/Provider Skills Family Outcomes Child Outcomes

  43. Program/Classroom Quality Teacher/Provider Skills Family Outcomes Child Outcomes

  44. System Program/Classroom Quality Teacher/Provider Skills Family Outcomes Child Outcomes

  45. Themes of Agenda Sessions • Quality assurance: quality assessment data; reliable use of tools; quality of analysis and reporting • Training and TA (to address quality) • Collaboration: Part C and 619 Preschool; across early care and education

  46. Themes of Agenda Sessions • Challenges of particular approaches • Decision rules for “age expectations” and progress-category assignment for states using one tool statewide • Consistent interpretation and use of the COSF • Outcomes from the local and family perspectives
