
Presentation Transcript


  1. May 6, 2010, Sacramento, CA. SELPA Data Committee

  2. Last month, we got a ___ of an email about CASEMIS. The way they went on, you would think we were a bunch of ___

  3. and ___. We knew it would be a ___ and we would get a lot of ___ if we didn't get a

  4. ___ - someone who could ___ the numbers and get us past this ___. Joseph recommended that we get

  5. ___ but they were caught in a scandal at the Club with a couple of ___.

  6. Steven recommended the ___, but after we interviewed them, they turned out to be ___.

  7. Andrew, the ___ that he is, went down to ___ and checked with his ___.

  8. They had ___ of suggestions. One turned out to be a real ___.

  9. She was no ___. She took one look at our problems and made a ___ to the data set.

  10. In no time at all, that got us to a new ___.

  11. Questions for Chris regarding 08-09 APRs:
  • Graduation: Want to confirm that this indicator does not include CAHSEE waiver and exempted students.
  • Dropout: Confirm that this data goes to DOR and not DOS.
  • Statewide Assessments:
    • What are the consequences of not meeting the target in successive years?
    • Since the data appears to be from blended sources, is there a way to review your actual data?
    • What should the 4 levels of technical assistance look like, and how will you know whether a district is in level 1, 2, 3 or 4? (Same question for the other performance indicators.)
  • Suspension and Expulsion:
    • Confirm that this data is rolled over from 07-08 and explain why. Will this always be rolled over from one year prior?
    • The disproportionality portion of this indicator was not included. Will it be calculated next year? Will the calculation for significant disproportionality for this item include three years of disparity data and then the use of the E formula with 8 standard errors? Will this process be similar for the other compliance indicators?
    • Since this is a compliance indicator, is there an identified process yet for identifying districts that would have monetary sanctions? Would this be a multi-year process? (Same question for the other compliance indicators.)
  • Least Restrictive Environment:
    • LCIs: For small rural SELPAs these placements will cause them to fail the separate facilities item yearly due to the ratio of LCI placements to district placements. Any ideas on how to address this?
    • For high school districts that have self-contained CBI programs for their transition students, they also get penalized for "separate facilities". Any idea on how to approach this?
  • Preschool Assessments:
    • Will there be targets included in this indicator in the future?
    • Will this data be separated out from "SELPA wide" and reported by district in the future? (Same for indicator 12.)
  • Part C to B Transition:
    • Will you continue to use hybrid data, and if so, how can the accuracy of data from other agencies be verified? Some districts felt their data was incorrect.

  12. What Do We Report to the Feds?
  • 618 Data Tables: Child Count, Personnel, Educational Environment, Exiting, Discipline, Assessment, Procedural Safeguards, and Maintenance of Effort and Coordinated Early Intervening Services (new; released on Monday)
  • EDFacts: 18 tables used for special education; state, student, and district level reports
  • Annual Performance Report (APR)

  13. APR Compliance Indicators
  • Local and State
  • State only

  14. What Do We Report to and About You?
  • Fourteen Annual Performance Report Indicators (posted on the web by federal requirement)
  • Compliance Determinations (sent to SELPAs and districts; not posted)
  • Significant Disproportionality (sent to SELPAs and districts; not posted)
  • Data Based Noncompliance Findings (sent to SELPAs and districts; not posted)

  15. What Are the Consequences of Data Findings?
  • APR Indicators: Performance Indicators guide the monitoring plan for SESR. Compliance Indicators are used for SESR, Special Self Reviews (for indicators 4, 9, and 10), and Compliance Determinations (indicators 4, 9, 10, 11, 12, 13).
  • Compliance Determinations: made up of compliance indicators, timely and complete reporting, timely correction of noncompliance, and audits. Each area is given a score of 1-4, and the scores are averaged for an overall determination (see the sketch below). Only the overall determination has consequences; these are outlined in federal regulations for each level below "meets requirements." A district that does not have a "meets requirements" determination cannot use the MOE flexibility.
  • Significant Disproportionality: requires the district to use 15% of IDEA grants to address disproportionality issues.
  • Data Based Noncompliance Findings: requires the district to complete a corrective action plan, with the same corrective actions as SESR.
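Because the averaging step above is easy to misread, here is a minimal Python sketch of the computation as the slide describes it: four areas, each scored 1-4, averaged into one overall determination score. The area names come from the slide; everything else (the function shape, the example scores) is illustrative, and the cut points that map an average back to a determination level are not given here.

AREAS = [
    "compliance indicators",
    "timely and complete reporting",
    "timely correction of noncompliance",
    "audits",
]

def overall_determination(scores):
    """Average the four 1-4 area scores into one overall score.

    Only this overall value, not any single area score, carries
    consequences under the determination process described above.
    """
    for area in AREAS:
        if not 1 <= scores[area] <= 4:
            raise ValueError(f"{area} must be scored 1-4, got {scores[area]}")
    return sum(scores[area] for area in AREAS) / len(AREAS)

# Example: a weak compliance-indicator score is softened by the averaging.
print(overall_determination({
    "compliance indicators": 2,
    "timely and complete reporting": 4,
    "timely correction of noncompliance": 3,
    "audits": 4,
}))  # 3.25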

  16. Graduation
  • Want to confirm that this indicator does not include CAHSEE waiver and exempted students.

  17. Dropout
  • Confirm that this data goes to DOR and not DOS.

  18. Statewide Assessments
  • What are the consequences of not meeting the target in successive years?
  • Since the data appears to be from blended sources, is there a way to review your actual data?
  • What should the 4 levels of technical assistance look like, and how will you know whether a district is in level 1, 2, 3 or 4? (Same question for the other performance indicators.)

  19. Suspension and Expulsion
  • Confirm that this data is rolled over from 07-08 and explain why. Will this always be rolled over from one year prior?
  • The disproportionality portion of this indicator was not included.
  • Will it be calculated next year?
  • Will the calculation for significant disproportionality for this item include three years of disparity data and then the use of the E formula with 8 standard errors? (A rough sketch of such a test follows this slide.)
  • Will this process be similar for the other compliance indicators?
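The slide deck does not define the E formula it asks about, so the Python sketch below is only an illustration of the general shape of such a test. It assumes a binomial standard error and assumes a district is flagged only when a group's rate exceeds the comparison rate by more than 8 standard errors in each of the last three years of disparity data; none of that should be read as CDE's actual calculation.

import math

def exceeds_8_se(group_count, group_n, comparison_rate):
    """Assumed test: is the group's rate more than 8 standard errors
    above the comparison rate? (The binomial SE is an assumption.)"""
    rate = group_count / group_n
    se = math.sqrt(comparison_rate * (1 - comparison_rate) / group_n)
    return rate > comparison_rate + 8 * se

def significantly_disproportionate(yearly_data):
    """Assumed rule: flag only if the threshold is exceeded in each of
    the three most recent years of (count, n, comparison_rate) data."""
    return len(yearly_data) >= 3 and all(
        exceeds_8_se(count, n, rate) for count, n, rate in yearly_data[-3:]
    )

# Hypothetical numbers: a group suspended at roughly 30% against a 10%
# comparison rate for three straight years is flagged under these assumptions.
years = [(60, 200, 0.10), (65, 210, 0.10), (70, 220, 0.10)]
print(significantly_disproportionate(years))  # True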

  20. Suspension and Expulsion
  • Since this is a compliance indicator:
  • Is there an identified process yet for identifying districts that would have monetary sanctions?
  • Would this be a multi-year process? (Same question for the other compliance indicators.)

  21. Least Restrictive Environment
  • LCIs: For small rural SELPAs these placements will cause them to fail the separate facilities item yearly due to the ratio of LCI placements to district placements. Any ideas on how to address this?
  • For high school districts that have self-contained CBI programs for their transition students, they also get penalized for "separate facilities". Any idea on how to approach this?

  22. Preschool Assessments
  • Will there be targets included in this indicator in the future?
  • Will this data be separated out from "SELPA wide" and reported by district in the future? (Same for indicator 12.)

  23. Part C to B Transition
  • Will you continue to use hybrid data, and if so, how can the accuracy of data from other agencies be verified? Some districts felt their data was incorrect.

  24. Other Stuff
  • Infant Discretionary Funds
  • CASEMIS Webinar
  • CASEMIS Software
  • CASEMIS Technical Assistance Guide
  • Table D

  25. Hi Chris & Andrew,

  After reviewing the proposed codes for Post-Secondary Program (D-18) and Post-Secondary Employment (D-19), I have concerns that several coding changes will create the potential for confusion and mis-coding of data. In some cases, code numbers have been changed; for example, in past years 220 was used for GED Programs and is now used for Vocational/Technical School. Another example is 900, which used to be Unknown and is now Incarcerated. I am recommending that no changes be made to the prior codes.

  Additionally, the old codings allowed for logical groupings. For example, the 200s referred to educational programs (GED, college, university) and the 300s referred to vocational training programs (technical school, ROP, etc.). The revised coding does not sort the data into these kinds of logical categories.

  I am also concerned that some SELPAs have already begun to collect this data. Therefore, it may be useful to provide some guidance to the field about reconciling already collected data with the new codes, if this is possible.
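The reconciliation guidance the email asks for could take the shape of an old-to-new translation table. The Python sketch below is hypothetical: only the two remapped codes named in the email (220 and 900) come from the source, their new equivalents are not given, and a real table would have to be completed from the final CASEMIS code set.

# Hypothetical reconciliation sketch for the old vs. new Post-Secondary
# codes (D-18/D-19). Only codes 220 and 900 are known from the email above
# to have changed meaning; their correct new values are not given, so they
# are mapped to None and forced into manual review rather than guessed.

OLD_TO_NEW = {
    220: None,  # old: GED Programs; 220 now means Vocational/Technical School
    900: None,  # old: Unknown; 900 now means Incarcerated
}

def reconcile(old_code):
    """Translate a record collected under the old coding into the new coding.

    Codes absent from the table are assumed unchanged; codes whose meaning
    shifted raise so they get human review instead of silent mis-coding.
    """
    if old_code in OLD_TO_NEW:
        new_code = OLD_TO_NEW[old_code]
        if new_code is None:
            raise ValueError(
                f"code {old_code} changed meaning; review this record manually"
            )
        return new_code
    return old_code

# Example: a 220 collected under the old scheme cannot pass through as-is.
try:
    reconcile(220)
except ValueError as err:
    print(err)
print(reconcile(330))  # assumed unchanged, passes through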

  26. Questions
