Planning for the New Measurable Objectives: 2 Critical District Issues with Examples & Background to Help You Address Them
KASB, Topeka, 4 April 2013
Kelly Spurgeon, KSDE; Tony Moss, KSDE

Presentation Transcript


  1. Planning for the New Measurable Objectives: 2 Critical District Issues with Examples & Background to Help You Address Them. KASB, Topeka, 4 April 2013. Kelly Spurgeon, KSDE; Tony Moss, KSDE

  2. What we hope to cover today: • How does the new federal accountability system compare with the old? • How can districts best prepare for these changes? • An integrated district Measured Objectives strategy • Communication strategies for main constituency groups • Why do districts need an integrated MO strategy? • Nuts & bolts of the Measured Objectives.

  3. Leaving the AYP Mentality
  What we had: • An overemphasis on state assessments and a single measure, percent proficient • Identifying “bubble kids” and getting them over the standards line • Sometimes teaching to identified test items, not concepts • Counting the same kids more than once across multiple subgroups
  What we will have: • 4 Measured Objectives from assessments • 5 areas in the new accreditation model, each counting 20% • Incentives to move every student to their highest possible proficiency level • Constructed response items that will require deeper conceptual mastery • One gap measure of the lowest 30%, with no duplicate counting • Overall, an emphasis on improved instruction and instructional support

  4. The big differences: • A much more nuanced set of measures (good schools can have different profiles) • College and Career standards that “raise the bar,” so that students graduate ready for college or a career • Why? So they can better compete in a very competitive world.

  5. 2 Critical District-Level Tasks: • Designing an integrated strategy for making the Measured Objectives; and • Developing a communication strategy for each of your 3 key constituencies: • Your staff • Your board and political stakeholders • Parents, the public, and the press.

  6. Why do districts need a Measured Objectives strategy to coordinate school strategies? • Children’s intellectual and social-emotional development is integrated and hierarchical: each stage depends on the stage that went before; • Only districts can put all the institutional segments together, from early childhood education and care, through primary, middle, and high school, and keep them all directed toward the same integrated goals.

  7. For example, consider the holes in children’s academic trajectories when measured only by state assessments: All the other necessary parts—early childhood, student engagement, relationships, school climate and culture, professional development—are missing.

  8. An Overly Simple Example of a District’s Integrated MO Strategy: • Collaboration with local pediatricians, early education and child care providers, and social services to improve child development and child-rearing skills before Kindergarten (ages 0 to 5); • In primary school A (ages 5 to 9), a focus on immediate teacher recognition of any student’s failure to master any competence, including behaviors, and immediate one-on-one work to move the child to competence; • In middle school B (ages 10 to 14), a focus on weekly career and academic coaching, Individual Plans of Study, and career shadowing; • In high school C (ages 15 to 18), a focus on improved social climate, social inclusion, continued Individual Plans of Study, project-based instruction, pathway completion, and higher API scores; • District investments in high-quality, focused, 50-hour PD training.

  9. Let’s now consider districts’ communication needs: A common question you can expect: Why all the changes? • To better prepare students for college or a career, the bar is being raised. • AYP was obsolete and had to be replaced.

  10. In 2015, with the higher standards in the new Kansas Career and College Ready assessments, we can expect scores to go down substantially. We have to explain this so it isn’t misinterpreted. (Chart annotation: “New assessment introduced.”)

  11. Points that Might Be Included in a Staff Communication Plan: We are in a period of rapid change in assessments and accountability: • ESEA Waiver replacing AYP • Career pathways assessments coming • School readiness measures coming • A broader set of accreditation measures (state assessment results only 20%; rigor, relevance, responsive culture, and relationships 20% each) • KEEP (Kansas Educator Evaluation Project) • Blended assessment in 2014 & Kansas College and Career Ready assessments in 2015

  12. Staff Communication (cont.) • The Waiver’s incentives are different from AYP’s—they reward moving every student to the highest proficiency possible; • Constructed response items will require greater student mastery of concepts; specific test items won’t be identified. • Constructed response items will have to be checked by staff. Educators will have a view of student proficiency across the grades when they check these items. • Many MOs will be reset in 2015 & 2016 because the assessments will be new.

  13. Points that Might Be Included in a Board or Parent Communication Plan: • The new assessments are raising the bar, so we should expect that scores will go down. • This decline is a result of the new assessments, not indicative of worse performance by students or teachers. • Assessments are designed so that all students miss some items. Some assessments are more difficult than others. • Different assessments have different plateaus or ceilings.

  14. Example: NAEP Scores. For 17-year-olds, NAEP reading scores have varied by only 5 points in 42 years.

  15. Now the details of the new MOs:

  16. Academic Performance Index

  17. How is the API calculated?
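
A minimal Python sketch of the general idea behind an index built from performance-level counts follows. The five performance levels and the 0 to 1000 point values are illustrative assumptions, not the official KSDE weighting; the official calculation is described in the resources on slide 34.

```python
# Illustrative sketch of an Academic Performance Index style calculation.
# The performance levels and point values are assumptions for illustration;
# consult KSDE documentation for the official weights.

POINTS = {
    "Academic Warning": 0,
    "Approaches Standard": 250,
    "Meets Standard": 500,
    "Exceeds Standard": 750,
    "Exemplary": 1000,
}

def api(level_counts):
    """Weighted average of point values over all assessed students."""
    total_students = sum(level_counts.values())
    if total_students == 0:
        return 0.0
    total_points = sum(POINTS[level] * n for level, n in level_counts.items())
    return total_points / total_students

# Example building with 200 assessed students.
counts = {"Exemplary": 40, "Exceeds Standard": 60, "Meets Standard": 70,
          "Approaches Standard": 20, "Academic Warning": 10}
print(round(api(counts)))  # an index on a 0-1000 style scale
```

Because every student contributes points at every level, moving any student up a level raises the index, which is the incentive described on slide 20.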

  18. Getting your API data from the Performance by Grade Report:

  19. Let’s compare 2 years of Math API:

  20. Why did Kansas need a new academic performance measure? • Not many students are left below proficient. • The API rewards schools for moving all students to higher proficiency. • It is more accurate than percent proficient. • It acknowledges the ceiling or plateau—it gets us away from the 100% proficient idea. • We could use historic rates of improvement to set realistic MO improvement goals.

  21. To differentiate between schools, we looked at the distribution of all schools’ API scores. Then we divided the distribution into quarters.
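
A brief sketch of how such quarter cut points might be computed is shown below; the building scores here are made up for illustration.

```python
# Sketch: dividing a distribution of building API scores into quarters.
# The scores below are fabricated for illustration only.
import statistics

building_apis = [512, 604, 655, 688, 701, 723, 748, 760, 779, 801, 822, 856]

# statistics.quantiles with n=4 returns the three cut points (Q1, Q2, Q3).
q1, q2, q3 = statistics.quantiles(building_apis, n=4)

def quarter(score):
    """Assign a building to quarter 1 (lowest) through 4 (highest)."""
    if score < q1:
        return 1
    if score < q2:
        return 2
    if score < q3:
        return 3
    return 4

print(q1, q2, q3, quarter(735))
```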

  22. Title (federal) & Non-Title Categories: • Category 4: Reward Schools (top 10%) • Categories 3 & 2: Making Progress / Not Making Progress (Title I Schools) • Category 1: Priority Schools (lowest 5%) and Focus Schools (10%)

  23. Reading AMOs are like the Standard of Excellence:

  24. Student Growth Percentiles imitate pediatricians’ growth charts. (Chart: Girls’ Length and Weight by Age, normed percentile bands from the 5th to the 95th.)

  25. Kansas Growth AMO: a relative measure. There are no consequences for not making a building’s growth measure.

  26. Advantages of the Student Growth Percentile Model: • SGPs map a student’s progress relative to all assessed students. • SGPs set realistic yearly goals based on each student’s academic peers—students with similar score histories.
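
The real SGP methodology fits quantile regression models over students' full score histories; the sketch below is only a conceptual illustration of the peer-group idea, using a single prior score and a simple score band to define "academic peers." All data and the band width are made up.

```python
# Highly simplified illustration of the student growth percentile idea:
# rank a student's current score against "academic peers" whose prior
# scores are similar. Real SGP models use quantile regression over full
# score histories; this banded percentile rank is only a sketch.
from bisect import bisect_right

def growth_percentile(student, all_students, band=10):
    """student and all_students entries are (prior_score, current_score) pairs."""
    prior, current = student
    # Peers: students whose prior score is within `band` points of this student's.
    peers = sorted(c for p, c in all_students if abs(p - prior) <= band)
    if not peers:
        return None
    # Percentile rank of the student's current score among the peers' current scores.
    return round(100 * bisect_right(peers, current) / len(peers))

data = [(410, 430), (415, 455), (412, 420), (408, 447), (405, 460)]
print(growth_percentile((411, 448), data))  # 60: outgrew 3 of 5 peers
```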

  27. How will growth MOs be displayed?

  28. The Gap: How are the lowest-performing 30% doing? • How are 270 of my 900 students doing? • Based on performance categories, not individuals • No subgroups • See the Performance by Grade Report • If the LP30 makes an API of 500, the Gap MO is also made

  29. How is the Gap calculated? 900 × 0.30 = 270 students in the lowest-performing 30%.
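
Slide 28 describes taking the bottom 30% by performance category rather than by individual ranking; a sketch of that idea, using the 900-student example and illustrative point values (not the official KSDE weights), is shown below.

```python
# Sketch of a lowest-performing-30% (LP30) calculation based on
# performance-category counts. Point values are illustrative assumptions.

POINTS = {"Academic Warning": 0, "Approaches Standard": 250,
          "Meets Standard": 500, "Exceeds Standard": 750, "Exemplary": 1000}
ORDER = ["Academic Warning", "Approaches Standard", "Meets Standard",
         "Exceeds Standard", "Exemplary"]          # lowest to highest

def lp30_api(counts):
    total = sum(counts.values())
    target = round(total * 0.30)                   # e.g. 900 * 0.30 = 270
    remaining, points = target, 0
    for level in ORDER:                            # fill from the lowest category up
        take = min(counts.get(level, 0), remaining)
        points += take * POINTS[level]
        remaining -= take
        if remaining == 0:
            break
    return points / target

counts = {"Academic Warning": 90, "Approaches Standard": 180,
          "Meets Standard": 330, "Exceeds Standard": 210, "Exemplary": 90}
print(round(lp30_api(counts)))   # per slide 28, the Gap MO is met if this reaches 500
```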

  30. Reducing the Non-Proficient

  31. Reduction of Non-Proficient AMOs • The goal: cut the percentage of non-proficient students (RNP) in half by 2017-2018 (a simple target sketch follows below) • Custom RNPs at the district, building, and subgroup levels • Traditional subgroups with an n ≥ 30 will have an AMO determination • No monitored students in the SWD or ELL subgroups • Only the All Students group is merged if less than 30 (this is still under consideration)
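
As a rough illustration of the halving goal, the sketch below produces straight-line yearly non-proficient targets from a baseline value down to half that value; the baseline percentage and the year range are assumptions for illustration, not official AMO targets.

```python
# Sketch: "cut the percent non-proficient in half" style annual targets,
# computed as a straight-line path from a baseline year to half the baseline.
# The baseline value and years are illustrative assumptions.

def rnp_targets(baseline_pct, start_year=2012, end_year=2017):
    """Yearly non-proficient targets, ending at half the baseline."""
    years = end_year - start_year
    step = (baseline_pct / 2) / years
    return {start_year + i: round(baseline_pct - step * i, 1)
            for i in range(years + 1)}

# A building where 24% of students were non-proficient in the baseline year.
print(rnp_targets(24.0))
# {2012: 24.0, 2013: 21.6, 2014: 19.2, 2015: 16.8, 2016: 14.4, 2017: 12.0}
```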

  32. Graduation AMOs

  33. Participation AMOs

  34. Resources • Video clips and fact sheets for Achievement, Growth, Reduction of Non-Proficient, and Gap AMOs can be found on the ESEA flexibility waiver webpage: http://www.ksde.org/Default.aspx?tabid=5075 • This page can also be accessed by clicking the logo on the KSDE home page

  35. Questions? Contact the Measurable Objectives Help Desk: Phone: (785) 296-2261 Email: mo@ksde.org

  36. Slides that deal with subtopics follow:

  37. The first big hole—early childhood—is where the largest unrealized gains are. The success of the whole educational enterprise is even more dependent on early childhood than was previously known: • Working memory and self-control—the ability to delay gratification and control impulses • Students’ social predilections—empathy and considering others • Students’ behaviors (persistence, engagement, motivation) • Language skills • Some disabling conditions—antisocial behaviors, ADHD. All of these have their origins in early interactions and environments.

  38. Some Implications for Districts: • Large improvements in K-12 education depend on overcoming fragmented services in early childhood and on improving the quality of family environments. • The social intelligence and responsiveness of teachers are important, sometimes as important as subject expertise. • An overemphasis on academics and test scores, to the exclusion of developmentally important events, e.g. fantasy play in the early school years and social integration, can be developmentally damaging.

  39. Another consequence of ignoring early childhood: large inefficiencies are built into later district and post-secondary results. (Chart: return per $1 invested.)

  40. Why did Kansas need a new academic performance measure? Relatively few students are available for moving over the proficiency line.

  41. Is the API more accurate than the Percent Proficient? School T has 91% at Standard or Above, but 75% in the top 2 categories, Exceeds & Exemplary. School P has 92% at Standard or Above, but only 46% in the top 2 categories.
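
To make this comparison concrete, the sketch below uses hypothetical performance-level distributions chosen to match the percentages on the slide; the exact splits between levels and the point values are assumptions for illustration.

```python
# Sketch: two schools with nearly identical percent-proficient figures but
# very different API-style indexes. Distributions and point values are
# illustrative assumptions consistent with the percentages on the slide.

POINTS = {"Academic Warning": 0, "Approaches Standard": 250,
          "Meets Standard": 500, "Exceeds Standard": 750, "Exemplary": 1000}

# Percent of students at each level (each sums to 100).
school_t = {"Academic Warning": 3, "Approaches Standard": 6,
            "Meets Standard": 16, "Exceeds Standard": 35, "Exemplary": 40}
school_p = {"Academic Warning": 3, "Approaches Standard": 5,
            "Meets Standard": 46, "Exceeds Standard": 26, "Exemplary": 20}

def api(dist):
    return sum(POINTS[lvl] * pct for lvl, pct in dist.items()) / 100

def pct_proficient(dist):
    return sum(dist[lvl] for lvl in ("Meets Standard", "Exceeds Standard", "Exemplary"))

for name, dist in [("School T", school_t), ("School P", school_p)]:
    print(name, pct_proficient(dist), round(api(dist)))
# School T: 91% proficient, index ~758; School P: 92% proficient, index ~638.
```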

  42. We step away from AYP’s 100% above standard and introduce the concept of a ceiling. (Chart: API Average Building Scores, All Students Group, All Public Schools, 2000 to 2012.)

  43. What is a realistic rate of improvement? Rates of improvement are larger at the beginning of a new testing cycle. They start to plateau when all of the variables in the system—alignment with standards, student and teacher skills, engagement—begin to reach their limits.

  44. (Chart: rates of improvement, with bands labeled 2 pts./yr, 5 pts./yr, 10 pts./yr, 15 or more, and 19 pts./yr.)

  45. Mathematics AMOs

  46. How can our measures be made more useful? • Please tell us what data and charts would help you communicate better with: your board, your parents, the local press, and your teachers and staff. • We need your feedback and suggestions. • To prompt your thinking, some examples follow.

  47. Possible Improvements within Current Assessment Reports: • Population trends • Cohort views of the API & Growth • Grade patterns across years • Adding comparison groups of “schools like mine” or “students like mine” • Bar graphs that show improvements across performance levels

  48. How about growth trends?

  49. Will there be comparative growth measures?
