
Measuring Child and Family Outcomes


Presentation Transcript


  1. Measuring Child and Family Outcomes Session Two Gathering Child Outcome Results in Maryland Anne Brager, MS, RN Program Supervisor Frederick County Infants and Toddlers Program

  2. Age of Accountability • The twenty-first century has been called the Age of Accountability, and the 2004 Reauthorization of the Individuals with Disabilities Education Improvement Act (IDEA), Part C, reflects this shift. • This legislation requires states to move to a new level of accountability for early intervention service systems.

  3. A Reminder: Why OSEP’s Focus on Early Childhood Outcomes? • The ultimate goal of an outcomes measurement system is to improve results for young children with disabilities and their families! • To address the Government Performance and Results Act (GPRA), Program Assessment Rating Tool (PART), and IDEA 2004 requirements

  4. Goal of Early Intervention • “…To enable young children to be active and successful participants during the early childhood years and in the future in a variety of settings – in their homes with their families, in child care, in preschool or school programs, and in the community.” (from Early Childhood Outcomes Center, http://www.fpg.unc.edu/~eco/pdfs/eco_outcomes_4-13-05.pdf)

  5. What Were the OSEP Reporting Requirements For Part C and Preschool Child Outcomes? Data is to be reported on the following outcomes: the percent of children who demonstrate improved: • Positive social emotional skills (including positive social relationships) • Acquisition and use of knowledge and skills (including early language/communication [and early literacy]) • Use of appropriate behaviors to meet their needs

  6. OSEP Reporting Categories Data from individual children will be aggregated and grouped into categories for reporting to the Office of Special Education Programs (OSEP): a. % of children who did not improve functioning b. % of children who improved functioning but not sufficient to move nearer to functioning comparable to same-aged peers c. % of children who improved functioning to a level nearer to same-aged peers but did not reach it d. % of children who improved functioning to reach a level comparable to same-aged peers e. % of children who maintained functioning at a level comparable to same-aged peers
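
The slide does not spell out how individual results are mapped into the five categories, but a rough sketch helps show how the aggregation works. The sketch below assumes each child carries an entry and an exit COSF rating on the 1-7 scale described later in this presentation (where 6 or 7 means functioning comparable to same-aged peers) plus a yes/no progress flag; the decision rules and field names are illustrative assumptions, not Maryland's published formula.

```python
def osep_category(entry_rating: int, exit_rating: int, made_progress: bool) -> str:
    """Place one child's entry/exit results into an OSEP reporting category (a-e).

    Ratings are on the COSF 1-7 scale, where 6-7 is taken to mean functioning
    comparable to same-aged peers. These rules are an illustrative assumption.
    """
    comparable_at_entry = entry_rating >= 6
    comparable_at_exit = exit_rating >= 6

    if comparable_at_entry and comparable_at_exit:
        return "e"  # maintained functioning comparable to same-aged peers
    if comparable_at_exit:
        return "d"  # improved to reach a level comparable to same-aged peers
    if exit_rating > entry_rating:
        return "c"  # improved, nearer to same-aged peers, but did not reach it
    if made_progress:
        return "b"  # improved, but not nearer to same-aged peer functioning
    return "a"      # did not improve functioning


def aggregate(children: list[tuple[int, int, bool]]) -> dict[str, float]:
    """Return the percent of children falling in each category a-e."""
    counts = {c: 0 for c in "abcde"}
    for entry, exit_, progress in children:
        counts[osep_category(entry, exit_, progress)] += 1
    total = len(children) or 1
    return {c: 100.0 * n / total for c, n in counts.items()}


# Example: three children's (entry rating, exit rating, progress) results
print(aggregate([(3, 6, True), (2, 3, True), (6, 7, True)]))
```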

  7. Why Measure Child Outcomes? We need to know how young children with disabilities are benefiting from early intervention and preschool special education: • Federal Reporting Requirements: all states must report data to OSEP; impact on federal funding • Program Effectiveness: evidence for state and local policymakers and sponsors to see the program as a good investment, or as important but with challenges that must be addressed • Program Improvement: find meaning in the data to make systemic changes; use data to improve IFSP services to individual children and groups of children

  8. How Will States Get the Data? States had many things to consider when developing their plan to gather the needed Child Outcomes data: • Who will provide the data? • What assessments will be used? • How often will data be collected? • When is data collected? • When is it reported? • Dealing with multiple sources? • Dealing with different assessments?

  9. A Review of the Origin of Child Outcomes • In response to the reporting requirements, OSEP funded the Early Childhood Outcomes (ECO) Center, a five-year project designed to help states answer those questions. • The ECO Center held stakeholder meetings followed by a public comment period • First, collected themes and ideas • Then, drafted and re-drafted outcome wording

  10. Some of the Common Themes from the Stakeholder Meetings • Early childhood outcomes should apply to the entire birth through 5 age span • They should apply to all children with disabilities; there should not be separate outcomes for blind, deaf, etc. • Assessing outcomes has the potential to influence practice in a positive way • Changing the way we write outcomes: a shift to writing functional outcomes, not domain-based outcomes • Best practice: the kind of outcomes recommended for IFSPs and IEPs • Reflects transdisciplinary service delivery • Challenge: not captured well in current domain-based assessment tools

  11. State Approaches Related to Assessment Tools • There is no ONE assessment tool that assesses the three OSEP Child Outcomes areas directly • Some states are choosing a single assessment selected by the state • Some states have developed a state-approved list of assessments that programs can pick from • Other states are allowing programs to use whatever they have been using • Maryland has provided a list of the five most commonly used assessments

  12. Dilemma: How do States Aggregate Data from Different Assessments? Remember, states are trying to use domain-based assessment data to report on the three outcomes: • ECO developed the COSF (Child Outcomes Summary Form) to provide a common metric • The form also provides a way to summarize multiple sources of information on a single child • The majority of states are using the COSF to produce state-level child outcomes data

  13. How does Maryland Aggregate Data? Status at Entry: when a child enters the local Infants and Toddlers Program (LITP), for example at 20 months, data is extracted from the Present Levels of Development entered into the Online IFSP. Progress at Exit: when the child exits the LITP at 36 months, data is extracted from a new section of the Online IFSP (similar to the current Present Levels of Development) and compared to the entry data to determine “progress.”

  14. Rationale for Maryland’s Approach • Desire to align outcome process with the IFSP process • Focus on improving evaluation and assessment practices (Tutorial) • Focus on ensuring data is collected in all domains (Monitoring) • Have a data system that collects Present Levels of Development (PLOD) • Can get started by generating electronic reports from data entered into PLOD • Response to local input

  15. How Do We Get The Data? (for Status at Entry and Progress at Exit) Data extracted from the Present Levels of Development is electronically linked to the 3 outcomes to produce answers. Alignment of broad outcomes to Present Levels of Development: • Is the child’s acquisition and use of knowledge and skills (including early language/communication) at the level expected for his or her age? ___ Yes ___ No • Are the child’s social-emotional skills (including social relationships) at the level expected for his or her age? ___ Yes ___ No • Based on assessment and other information, does the child use appropriate behavior to meet his or her needs at the level expected for his or her age? ___ Yes ___ No

  16. Protocols for Linking Age Levels/Age Ranges with Outcomes • For the outcome “acquisition and use of knowledge and skills (including early language/communication),” two domain categories (cognitive and communication) will be used. If both domains have quantitative data, the category that has the lowest range of data will be used. • When an age range has been entered, the midpoint of the range will be used.
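
As a rough illustration of these two protocols, the sketch below treats each domain's reported level as a (low, high) age range in months, uses the midpoint of any range, and reads "the category that has the lowest range of data" as the lower of the cognitive and communication results. The data shapes and function names are assumptions for illustration, not the actual Online IFSP extraction logic.

```python
def midpoint(age_range: tuple[float, float]) -> float:
    """When an age range (in months) has been entered, use its midpoint."""
    low, high = age_range
    return (low + high) / 2.0


def knowledge_skills_level(cognitive=None, communication=None):
    """Developmental level used for the 'acquisition and use of knowledge and
    skills' outcome: if both the cognitive and communication domains have
    quantitative data, the lower of the two is used (assumed reading of the
    protocol). Each argument is a (low, high) age range in months, or None.
    """
    levels = [midpoint(r) for r in (cognitive, communication) if r is not None]
    if not levels:
        return None  # no quantitative data entered for either domain
    return min(levels)


# Example: cognitive assessed at 16-18 months, communication at 12-15 months
# -> midpoints 17 and 13.5 -> 13.5 months is used for this outcome.
print(knowledge_skills_level(cognitive=(16, 18), communication=(12, 15)))
```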

  17. Frequency of Data Collection • Some states are measuring outcomes only at entry and exit (OSEP’s minimal requirement) • This is the approach Maryland is taking! • Some states are measuring more often (allows for more meaningful information) • The current status of a state’s data system is a huge factor in frequency of data collection

  18. Trends in Approaches to Measurement for Part C Child Outcomes • 40 states are currently using the ECO Child Outcome Summary Form: a 7-point rating scale based on multiple sources of data, often including assessment tools, observation, and family report • 8 states are using one assessment tool statewide: BDI-2 (3 states), state-developed tools (3 states), AEPS (2 states) • 3 states are using online assessment systems with the capacity to generate OSEP data reports • 5 states are using other unique approaches - Maryland is one of those states!

  19. How will MD’s Local Programs Measure and Report Status at Exit Data? • Exit assessment results will be entered on the IFSP form, Section II: Present Levels of Development (PLOD). The form will be revised so that providers can check off whether the PLOD are from an entry, interim, or exit assessment. • The name of the assessment(s) that were used will be documented on the IFSP form. This will help narrow the list of assessments and determine the variability of the data. • The results of the assessments at exit will be entered into the new screens in the IFSP database.

  20. Who/When/What Does Maryland Rate? • Entry ratings are needed in all outcome areas, even if a child has delays in only one or two of the three outcomes • Entry ratings on each of the three outcomes should be reported for EVERY child who qualifies for services, as soon as possible after the initial eligibility assessment is completed • Exit ratings should be reported on each of the three outcomes for every child who had a COSF completed at entry and who has been in the program for at least 6 months

  21. When are Other States Measuring Data? • It is important to know that all states are approaching this question differently. Each state has established its own points in time for capturing data. • The definitions of entry and exit on the previous slide are specific to Maryland and provide uniform points in time for capturing data. • Maryland will measure children at entry into and exit from the Maryland Infants and Toddlers system.

  22. Converting Assessment Data to OSEP Outcome Categories

  23. Measuring Progress Based on the Rate of Growth Between Entry and Exit • Maryland is working with an Evaluation and Assessment Consultant to identify a methodology for measuring developmental gains during participation in early intervention • Going by percent of delay alone isn’t an accurate picture of progress, because children who come in at, say, a 25% delay and leave the program still at a 25% delay have nevertheless made progress: a child who enters at 20 months with a developmental age of 15 months and exits at 36 months with a developmental age of 27 months shows the same 25% delay but has gained 12 developmental months.

  24. Measuring Progress Based on the Rate of Growth Between Entry and Exit • Maryland is working with an Evaluation and Assessment Consultant to identify a methodology for measuring developmental gains during participation in early intervention. • Currently, they have tested child data using an existing index: Intervention Efficacy Index (IEI)

  25. Intervention Efficacy Index • The intervention efficacy index relates changes in child capabilities to time spent in program • It describes individual and group progress in terms of developmental gains within and across domains for each month in an intervention program. (Bagnato & Neisworth)

  26. Intervention Efficacy Index (IEI) IEI = developmental gain in months ÷ time in intervention in months = (exit developmental age – entry developmental age) ÷ time in intervention. Example: IEI = (34 months – 20 months) ÷ 12 months = 1.17, i.e., an average of 1.17 months of developmental gain for each month of participation in the intervention.
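
The same calculation, expressed as a small sketch (developmental ages and time are in months; the function name is just for illustration):

```python
def intervention_efficacy_index(entry_da: float, exit_da: float,
                                months_in_intervention: float) -> float:
    """IEI = developmental gain in months / time in intervention in months."""
    return (exit_da - entry_da) / months_in_intervention


# The worked example from slide 26: a gain from 20 to 34 developmental months
# over 12 months of intervention.
iei = intervention_efficacy_index(entry_da=20, exit_da=34,
                                  months_in_intervention=12)
print(round(iei, 2))  # 1.17 months of gain per month of participation
```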

  27. Conclusion of Test • Conclusion: The IEI must be anchored to the child data to be meaningful.

  28. Linking Results to OSEP Categories To make the data meaningful, Maryland had to: • Test the IEI index with real-child data And then: • Determine numerical ranges that would provide the most accurate linkage to OSEP categories

  29. Maryland’s Conversion Formula • While there are typical stages of growth, babies and toddlers develop at different paces, and may develop more quickly in one area than another. • A child’s developmental age (DA) may be lower than the child’s chronological age (CA) but still be considered at age level, because children develop typical skills over a range of time. • Maryland selected a cut point of a 19% difference between DA and CA to report status at entry data in February 2007 (the % of children who entered at age level in each of the three outcomes). • To report progress at exit data, the same 19% allowance will be used in the formulas for each of the progress reporting categories.
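
A minimal sketch of how the 19% allowance could be applied when deciding whether a child is at age level for an outcome, assuming the difference is taken as a percentage of chronological age (the exact formula is not spelled out on the slide):

```python
CUT_POINT = 0.19  # Maryland's 19% allowance between DA and CA (slide 29)


def at_age_level(developmental_age: float, chronological_age: float) -> bool:
    """A child is counted as functioning at age level for an outcome if the
    developmental age is no more than 19% below the chronological age.
    Computing the difference as a fraction of CA is an illustrative assumption.
    """
    delay = (chronological_age - developmental_age) / chronological_age
    return delay <= CUT_POINT


# A 20-month-old with a developmental age of 17 months (15% delay) would be
# counted at age level; one with a developmental age of 15 months (25% delay)
# would not.
print(at_age_level(17, 20), at_age_level(15, 20))
```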

  30. The Big Question: Will Maryland’s Approach Work? • Can the currently collected data about present levels of development be used to provide valid information about the 3 functional outcomes?

  31. Answering the Question • In order to effectively answer this question, we need to validate the process by comparing the outcomes data generated from Present Levels of Development (PLOD) with an outcomes judgment derived directly from those who know the child using the COSF (Child Outcomes Summary Form)

  32. Need to Validate the Data: Compare Domain Results to Functional Results

  33. How will Maryland Validate the Results? • In December 2006, local ITPs began completing the Child Outcomes Summary Form (COSF) as soon as possible following initial evaluation and assessment. • Local programs are completing COSFs at exit for children: • Who were referred since December 2006 • Who received services for at least six months, and • For whom a COSF was completed at entry • COSF results will be entered into the IFSP database.

  34. Child Outcomes Validation: WHY? Maryland has elected to use the information from domain-specific assessment results to determine the results of functional child outcomes; therefore it is important that the results be validated. In other words, we are validating the following: “Are the responses derived from the electronically-extracted domain data consistent with direct responses from providers about a child’s functioning in the three outcomes?”
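
One simple way to picture this validation question is as an agreement check between the two sources, sketched below. It assumes a COSF rating of 6 or 7 counts as "at age level" (per the rating metric described later); this is an illustrative comparison, not Maryland's actual validation analysis.

```python
def agreement_rate(extracted: list[bool], cosf_ratings: list[int]) -> float:
    """Percent of children for whom the electronically extracted yes/no answer
    ('at age level?') matches the providers' COSF judgment for the same outcome.
    A COSF rating of 6 or 7 is treated as 'at age level' (assumption).
    """
    matches = sum(
        1 for extracted_yes, rating in zip(extracted, cosf_ratings)
        if extracted_yes == (rating >= 6)
    )
    return 100.0 * matches / len(extracted)


# Example: three children's extracted answers vs. their COSF ratings
print(agreement_rate([True, False, True], [7, 3, 5]))  # ~66.7% agreement
```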

  35. Why is There a Need for the Child Outcomes Summary Form? • The Child Summary process utilizes information from multiple sources to arrive at a single rating or score • This process allows data from different tools and sources to be comparable • Different programs will be using different assessment instruments • Outcomes data will need to be aggregated across programs within and across states

  36. Features of the Child Outcomes Summary Form • The summary form allows for a variety of different kinds of information to be brought together with input from families and professionals • It is not an assessment tool • It uses information from assessment tools and observations to get a global sense of how the child is doing at one point in time • Ratings are based on the child’s functioning: • What the child does across settings and situations • Compared with what is expected given the child’s age • The resulting rating reflects a level of functioning that compares the child to typically developing peers. The process also includes a statement of child progress • 7-point rating scale

  37. Another Benefit • Another benefit of the COSF is that it allows us to track progress over time • We do this by comparing the child’s functional status rating at entry and at exit to see how much progress has been made in those three outcome areas

  38. A Decision Making Process • Using this process, the local program gathers together all information on each child. It is not an assessment tool. Instead it is a decision making process: • Used to transform information of many types and from multiple sources into the same three OSEP outcomes • Using informed parent and professional judgment to develop a consensus on a rating for each child • Based on different types of age-referenced tools that can compare child to same age peers • Based on information about child in natural contexts • This information is then summarized into a common scale, using a rating process. The rating provides a way to reduce many different kinds of data to a common score

  39. This Requires Moving from an ‘Eclipsed’ View of Children… • Remember how we have been told to view a solar eclipse by making a small hole in a piece of black construction paper in order to view the sun? • Focusing only on progress toward IFSP goals, or on the skills acquired through Part C services to meet those goals, can be like looking at the child through a very small hole, or through the lens of an individual professional or discipline.

  40. To a More Global View of the Child Across All Situations and Settings • Positive Social Emotional Relationships • Acquisition and Use of Knowledge and Skills • Appropriate Use of Behaviors to Meet Needs

  41. A Global View Means: • Looking at child outcomes—what comes out of children’s participation in Part C supports and services—requires each team member to adopt a more global view of how the child is developing relative to both same-age peers and performance of lifelong skills in future situations and settings • That means taking a more panoramic view of the child’s function in the family’s home and community routines and activities with the standard of how same-age peers would perform in those situations and settings.

  42. Using Multiple Measures & Sources in Your Decision Making Process • In addition to an assessment tool, you will want to consider other sources of information including: • Screening information • Family/caregiver conversation and interviews • Observations across a variety of settings • Anecdotal records • The information gathered to document progress toward goals and outcomes on IFSPs can be part of the information related to child outcomes

  43. Family Input • There are several reasons why we want to use information from conversations with the family when considering our ratings for the COSF. The most important reason is the family will naturally have the most information about how their child takes part in everyday routines and activities in their home and usual settings.

  44. Summary Rating (1-7) • Reduces rich information from assessment and observation into a rating to allow a summary of progress across children • Does not provide information for planning for the individual child. Information at the rich, detailed level will be more helpful for intervention planning purposes

  45. COSF 1-7 Rating Metric • The rating provides a way to reduce many different kinds of data to a common rating • The rating is based on a 7-point scale that is anchored to typical functioning • Ratings of 6 and 7 represent functioning at the same level as a typically developing peer • Ratings of 1 to 5 reflect functioning below what is typical

  46. Definitions of Ratings • Each number between 1 and 7 contains a specific description to guide the decision making process considering the following: • How typical the behavior is in everyday situations • In comparison to expectations for age-matched peers • In terms of conditions or behaviors that interfere with the child’s ability to achieve age-expected behaviors and skills

  47. Completing the Child Outcomes Summary Form • For each of the three outcome areas, you will need to decide the extent to which the child displays behaviors and skills expected for his or her age • Ratings should reflect the child’s level of functioning using whatever assistive technology or special accommodations are present in the child’s typical settings

  48. The Two COSF Questions • To what extent does this child show age-appropriate functioning, across a variety of settings and situations, on this outcome? (Rating: 1-7) • Has the child shown any new skills or behaviors related to [this outcome] since the last outcomes summary? (Yes-No)
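
A minimal sketch of the record these two questions produce for each of the three outcome areas (field and class names are assumptions, not the actual IFSP database schema):

```python
from dataclasses import dataclass


@dataclass
class OutcomeSummary:
    """One outcome area on the COSF: the 1-7 rating plus the yes/no
    new-skills question from slide 48."""
    rating: int        # 1-7, extent of age-appropriate functioning
    new_skills: bool   # any new skills or behaviors since the last summary?


@dataclass
class ChildOutcomesSummary:
    """The three OSEP outcome areas rated on a single form."""
    social_emotional: OutcomeSummary
    knowledge_and_skills: OutcomeSummary
    meeting_needs: OutcomeSummary


example = ChildOutcomesSummary(
    social_emotional=OutcomeSummary(rating=5, new_skills=True),
    knowledge_and_skills=OutcomeSummary(rating=4, new_skills=True),
    meeting_needs=OutcomeSummary(rating=6, new_skills=True),
)
print(example)
```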

  49. Summary Ratings (1-7) • Provide an overall sense of the child’s current functioning in the 3 outcomes. They do not provide: • Information on the services provided themselves • The family’s satisfaction with services, or • An explanation of why the child’s functioning is at that level
