
Christina Kasprzak Austin, Texas November 2010






Presentation Transcript


  1. Analyzing and Interpreting Child Outcomes Data. Christina Kasprzak, Austin, Texas, November 2010

  2. Objective for the day: To share with you ideas and resources for use in training and TA that will help districts analyze and use COSF data

  3. Agenda Looking at data—generally; national; state; regional Follow up discussion about assessment tools Communicating data results Public reporting requirements Framework for a quality outcomes system

  4. Recap from March – Assessment (more debrief on this after lunch): no assessment was created for this outcomes process; best practices on assessment = multiple data sources; types of assessment, including pros and cons; benefits of limiting assessments for COSF; selecting tools for the COSF process; activity – reviewing assessment tools and identifying strengths, weaknesses, and how each fits with the COSF process

  5. Recap from March • Promoting Data Quality – ECO Training Materials and Activities • COSF refresher training • quality review of COSF team discussion • involving families in outcomes process • written child example • reviewing a COSF for quality

  6. Why do a good job with COSF data? It’s hard to change attitudes! What motivates people? Altruistic? Fear? Logic? Money?

  7. Why do a good job with COSF data? Altruistic: Because you believe child and family outcomes are why you do your job! Fear: Because you can look bad! (to the state; to the public via public reporting) Logic: Because a program should be accountable for the results of its services! Money: Because OMB is using the data to make decisions – federal dollars are at stake!

  8. Why do a good job with COSF data? • Today’s focus on ‘looking at data’ will give you more tools and resources for changing attitudes!

  9. Looking at Data

  10. Continuous Program Improvement cycle: Plan (vision: program characteristics; child and family outcomes) → Implement → Check (collect and analyze data) → Reflect (Are we where we want to be?)

  11. Using data for program improvement = EIA: Evidence, Inference, Action

  12. Evidence • Evidence refers to the numbers, such as “45% of children in category b” • The numbers are not debatable

  13. Inference • How do you interpret the #s? What can you conclude from the #s? • Does the evidence mean good news? Bad news? News we can’t interpret? • To reach an inference, sometimes we analyze data in other ways (ask for more evidence)

  14. Inference • Inference is debatable – even reasonable people can reach different conclusions • Stakeholders can help with putting meaning on the numbers • Early on, the inference may be more a question of the quality of the data

  15. Action • Given the inference from the numbers, what should be done? Recommendations or action steps • Action can be debatable – and often is • Another role for stakeholders • Again, early on the action might have to do with improving the quality of the data

  16. Promoting quality data through data analysis

  17. Promoting quality data through data analysis: Examine the data for inconsistencies. If/when you find something strange, look for other data that might help explain it. Is the variation caused by something other than bad data?

  18. The validity of your data is questionable if… the overall pattern in the data looks “strange”: • Compared to what you expect • Compared to other data • Compared to similar states/regions/school districts

  19. Let’s look at some data …

  20. Remember: Part C & 619 Child Outcomes (see cheat sheet) 1. Positive social-emotional skills (including social relationships); 2. Acquisition and use of knowledge and skills (including early language/communication [and early literacy]); and 3. Use of appropriate behaviors to meet their needs

  21. Remember: COSF 7-point scale • 7 – Completely: Age-appropriate functioning in all or almost all everyday situations; no concerns • 6 – Age-appropriate functioning, but some significant concerns • 5 – Somewhat: Age-appropriate functioning some of the time and/or in some settings and situations • 4 – Occasional age-appropriate functioning across settings and situations; more functioning is not age-appropriate than age-appropriate • 3 – Nearly: Not yet age-appropriate functioning; uses immediate foundational skills most or all of the time • 2 – Occasional use of immediate foundational skills • 1 – Not yet: Not yet age-appropriate functioning or immediate foundational skills

  22. COSF Ratings – Outcome 1: Entry data (fake data)

  23. Frequency on Outcome 1 – Statewide Entry Data

  24. COSF Ratings – Outcome 1: Entry data (fake data)

  25. COSF Ratings – Outcome 1: Entry data (fake data)

  26. Comparison of two Groups

  27. Average Entry Scores on Outcomes

  28. Outcome 3: Appropriate Action (fake data)

  29. Remember: Reporting Categories – Percentage of children who: a. Did not improve functioning; b. Improved functioning, but not sufficient to move nearer to functioning comparable to same-aged peers; c. Improved functioning to a level nearer to same-aged peers but did not reach it; d. Improved functioning to reach a level comparable to same-aged peers; e. Maintained functioning at a level comparable to same-aged peers. 3 outcomes × 5 “measures” = 15 numbers
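The five categories above, applied to each of the three outcomes, yield the 15 reported numbers. A minimal sketch of that calculation, using hypothetical counts rather than real COSF data:

```python
# A minimal sketch (hypothetical counts, not real COSF data) of turning
# the number of children in each OSEP category (a-e) into the reported
# percentages for one outcome.

def category_percentages(counts):
    """Percent of all children falling in each progress category."""
    total = sum(counts.values())
    return {cat: round(100 * n / total, 1) for cat, n in counts.items()}

# Hypothetical exit data for Outcome 1: children per category.
outcome1 = {"a": 5, "b": 15, "c": 30, "d": 35, "e": 15}
print(category_percentages(outcome1))
```

Running this for all three outcomes produces the full set of 15 numbers a state reports.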

  30. Progress Data – Outcome 2: fake data

  31. Progress Data – Outcome 2: fake data

  32. Final results • Using the row percents, we know that 35% of children in Ms. Mary’s programs closed the gap in Outcome 1. • As a reference, we can compare this to the 20% of children across all programs who closed the gap in Outcome 1. • Why? Is this an important difference? • To answer that question, we would conduct additional analysis

  33. Questions to ask: Do the data make sense? Am I surprised? Do I believe the data? Some of the data? All of the data? If the data are reasonable (or when they become reasonable), what might they tell us?

  34. Examining COSF data at one time point • One group - Frequency Distribution • Tables • Graphs • Comparing Groups • Graphs • Averages
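The first analysis listed above, a one-group frequency distribution at a single time point, can be sketched as follows (the ratings are hypothetical, not real data):

```python
from collections import Counter

# A minimal sketch (hypothetical ratings, not real data) of a frequency
# distribution of COSF ratings for one group at one time point.

def rating_frequencies(ratings):
    """Count and percent of each COSF rating (1-7) across children."""
    counts = Counter(ratings)
    total = len(ratings)
    return {r: (counts.get(r, 0), round(100 * counts.get(r, 0) / total, 1))
            for r in range(1, 8)}

# Hypothetical entry ratings for Outcome 1 in one district.
entry_ratings = [3, 4, 4, 5, 5, 5, 6, 2, 3, 5]
for rating, (n, pct) in rating_frequencies(entry_ratings).items():
    print(f"rating {rating}: {n} children ({pct}%)")
```

The same table can then be graphed, or computed per program to support the group comparisons listed on the slide.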

  35. What we’ve looked at: Do outcomes vary by: • Unit/District/Program? • Rating at Entry? • Amount of movement on the scale? • % in the various progress categories?

  36. What else might you want to look at? Do outcomes vary by child/family variables or by service variables, e.g.: • Services received? • Age at entry to service? • Type of services received? • Family outcomes? • Education level of parent?

  37. Activity 1: Reviewing sample data

  38. Small Groups Break into small groups of ~5 Walk through the state example answering questions as you go Whole group: share highlights of your conversations

  39. Application How could you use this type of data discussion in your training and TA? What experiences or resources do you have with discussing outcomes data in your training and TA?

  40. Summary Statements

  41. Origin of the Summary Statements • States reported on the OSEP Progress Categories for a few years • States knew they would be asked to set targets • Using the progress categories would require setting 15 targets…

  42. Origin of the Summary Statements • ECO prepared papers with options • Convened stakeholders • Extensive discussion about pros and cons of various summary statements • See Options and ECO Recommendations for Summary Statements for Target Setting on the ECO web site: http://www.fpg.unc.edu/~eco/assets/pdfs/summary_of_target_setting-2.pdf

  43. Summary Statements • Of those children who entered the program below age expectations in each Outcome, the percent who substantially increased their rate of growth by the time they turned 6 years of age or exited the program. • The percent of children who were functioning within age expectations in each Outcome by the time they turned 6 years of age or exited the program.

  44. Summary Statements • Of those children who entered the program below age expectations in each Outcome, the percent who substantially increased their rate of growth by the time they exited the program: (c + d) / (a + b + c + d)

  45. Other Ways to Think about Summary Statement 1 • How many children changed growth trajectories during their time in the program? • The percent of children who entered the program below age expectations and made greater-than-expected gains, i.e., substantially increased their rates of growth (changed their growth trajectories)

  46. Summary Statements • The percent of children who were functioning within age expectations in each Outcome by the time they exited the program: (d + e) / (a + b + c + d + e)
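Both summary statements are simple ratios over the a–e category counts: SS1 = (c + d) / (a + b + c + d) and SS2 = (d + e) / (a + b + c + d + e). A minimal sketch with hypothetical counts:

```python
# A minimal sketch (hypothetical counts, not real COSF data) computing both
# Summary Statements from the a-e category counts:
#   SS1 = (c + d) / (a + b + c + d)
#   SS2 = (d + e) / (a + b + c + d + e)

def summary_statements(counts):
    """Return (SS1, SS2) as percentages from category counts a-e."""
    a, b, c, d, e = (counts[k] for k in "abcde")
    ss1 = 100 * (c + d) / (a + b + c + d)      # entered below age expectations, increased growth rate
    ss2 = 100 * (d + e) / (a + b + c + d + e)  # exited functioning within age expectations
    return round(ss1, 1), round(ss2, 1)

# Hypothetical counts for one outcome.
counts = {"a": 5, "b": 15, "c": 30, "d": 35, "e": 15}
print(summary_statements(counts))  # prints (76.5, 50.0)
```

Note the different denominators: SS1 excludes category e (children who entered at age level), while SS2 is computed over all children.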

  47. Other Ways to Think about Summary Statement 2 • How many children were functioning like same-aged peers when they left the program? • Percent of the children who were functioning at age expectations in this outcome area when they exited the program, including those who: • started out behind and caught up, and • entered and exited at age level
