
Your CIP Play Book – 1st Quarter





Presentation Transcript


  1. Your CIP Play Book – 1st Quarter. A Guide and Resource to Writing the 2007-2009 Continuous Improvement Plan Data Analysis. Presented and supported by the School Improvement Team of Southern Oregon ESD, 2007

  2. To Score Academic Touchdowns With Your District’s Continuous Improvement Plan: Self-Evaluation Summary, Practical Tips

  3. Goals • Feel excited about possibilities for using data to improve student success • Learn to use data to identify a focus for improvement • Experience collaborative inquiry in the use of data • Engage in data-driven dialogue • Gain insight into what school leaders can do to connect data to results • Combine an effective school improvement process with the compliance requirements of the CIP

  4. Powerful Words “Research has found that faculty in successful schools always question existing instructional practice and do not blame lack of student achievement on external causes.” — Carl Glickman, 2002

  5. Continuous Improvement Means: Collective inquiry – school teams constructing meaning of student learning problems and testing out solutions together through rigorous use of data and reflective dialogue – unleashes the resourcefulness of educators to solve the biggest problems schools face.

  6. Virtually All Education Researchers Agree: Collaboration is Key • Deborah Ball • Roland Barth • Carol Belcher • Louis Castenell • Jim Collins • Tom Corcoran • Linda Darling-Hammond • Lisa Delpit • Rick DuFour • Karen Eastwood • Richard Elmore • Susan Fuhrman • Carl Glickman • Asa Hilliard • Jackie Jordan Irvine • Ann Lieberman • Dan Lortie • Robert Marzano • Jay McTighe • Milbrey McLaughlin • Fred Newmann • Allan Odden • Doug Reeves • Deborah Schifter • Mike Schmoker • Dennis Sparks • James Stigler • Gary Wehlage • Grant Wiggins • and more…

  7. Warm-up Consensogram: What? A consensogram is a visual display of data generated by a group that provides an efficient way of gathering and displaying information that can be used immediately for processing. Using Data/Getting Results, Nancy Love; Data-Driven Dialogue, Bruce Wellman and Laura Lipton

  8. Consensogram: Why? • Icebreaker • Learn a tool to use in district/school • Demonstrate the 3-phase process for learning from any kind of data • Illustrate principles for data use

  9. Consensogram Directions • Complete the survey • Take one post-it for each question – note colors/numbers on the wall • Write your response on the post-it • Set aside for now

  10. Data-Driven Dialogue – Go Visual: Phase 1 Predict, Phase 2 Observe, Phase 3 Infer/Question. Adapted from Organizing Data-Driven Dialogue by Laura Lipton and Bruce Wellman, MiraVia LLC, 2001

  11. Data-Driven Dialogue • Phase 1 – Predict: surfacing experiences, possibilities, expectations. With what assumptions are we entering? What are some predictions we are making? What are some questions we are asking? What are some possibilities for learning that this experience presents us? • Phase 2 – Observe: analyzing the data. What important points seem to “pop out”? What are some patterns or trends that are emerging? What seems to be surprising or unexpected? What are some things we have not explored? • Phase 3 – Infer/Question: generating possible explanations. What inferences and explanations can we draw? What questions are we asking? What additional data might we explore to verify our explanations? What tentative conclusions might we draw? Adapted from workshop material by Laura Lipton and Bruce Wellman, MiraVia LLC (permission pending). Using Data Facilitator Toolkit – draft 7/18/03 © TERC 2003

  12. Data-Driven Dialogue, Phase 1: Predict – Starters: I predict… I assume… I wonder… I’m expecting to see…

  13. Go Visual • Large, visually vibrant displays of data • Color coding • Consensogram: place post-its on the corresponding wall chart to develop a bar graph
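For teams that want a quick digital version of the same display, here is a minimal sketch (the question wording, the 1-4 response scale, and the sample values are illustrative assumptions; matplotlib is assumed to be available) that tallies the post-it responses and draws the bar graph the wall chart would show:

```python
# Minimal sketch of a digital consensogram: tally each participant's
# rating for one survey question and display the counts as a bar graph,
# mirroring the post-it wall chart. Values below are invented examples.
from collections import Counter
import matplotlib.pyplot as plt

responses = [4, 3, 3, 2, 4, 1, 3, 4, 2, 3]  # hypothetical post-it values
counts = Counter(responses)

scale = [1, 2, 3, 4]
plt.bar(scale, [counts.get(v, 0) for v in scale], color="steelblue")
plt.xticks(scale)
plt.xlabel("Response (1 = strongly disagree, 4 = strongly agree)")
plt.ylabel("Number of post-its")
plt.title("Consensogram: How confident are we in our use of data?")
plt.show()
```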

  14. Building Your Data Wall – organize displays by level of analysis: aggregate trends, disaggregated results, cluster/strand, and item, with space alongside each display for observations and inferences/questions

  15. Data-Driven Dialogue, Phase 2: Observe • What important points seem to pop out? • What patterns or trends are emerging? • What is surprising, unexpected? • What questions do we have now? • How can we find out? Adapted from Lipton and Wellman, 1999

  16. BECAUSE

  17. Concept Attainment, Part 1 • YES examples: It’s 53 degrees out; 75% of our 4th graders scored below proficiency in mathematics problem solving; This student diagrammed each trip across the river • NO examples: It’s cold; Our teachers are not comfortable with mathematics content; The student must have used the diagram to generate the rule

  18. Data-Driven Dialogue, Phase 2: Observe – Starters: I notice that… I see that…

  19. Data-Driven Dialogue, Phase 3: Infer/Question • What inferences and explanations might we draw? • Record your possible explanations and questions. • What further data needs to be investigated? • What implications for student learning does this have?

  20. Shifts That Are Moving Schools from Resignation to Improvement (less emphasis → more emphasis) • Culture: external accountability → internal and collective responsibility, equity • Data Use: data to sort, opportunities for some → data to serve, opportunities for all, instructional improvement; carrot and stick, avoidance → feedback for continuous improvement, frequent and in-depth use • Collaboration: top-down, premature data-driven decision making → ongoing data-driven dialogue and collaborative inquiry • Leadership & Capacity: individual charismatic leaders as change agents → learning communities with many change agents

  21. Data-Driven Dialogue (Phase 1 Predict, Phase 2 Observe, Phase 3 Infer/Question, Go Visual): this is the process we will use for the rest of the day to analyze CIP data. Consider whether it will work to take back to your school sites for drilling down into the data. Adapted from Organizing Data-Driven Dialogue by Laura Lipton and Bruce Wellman, MiraVia LLC, 2001

  22. Are You Ready For Some Football?

  23. But first! Revisit our last session: key learnings from Dr. Robert Barr related to The Kids Left Behind

  24. All Students Must Achieve High Academic Performance • Or live out their life unemployed, underemployed, or unemployable • Education…the only door of Opportunity • Education…the ultimate Civil Right

  25. Schools fail poor/minority students when they: • Hold low expectations for achievement • Assign students to inexperienced teachers • Fail to teach reading / basic skills • Retain, track, pull out, and misassign to special education • Blame students’ families • Employ a “Bell Curve” mentality • Inequities in school funding

  26. Common Characteristics of High Performing Schools • Extensive use of state/local standards to design curriculum and instruction, assess student work, and evaluate teachers; • Increased instruction time for reading and mathematics; • Substantial investment in professional development for teachers focused on instructional practices to help students meet academic standards; • Comprehensive systems to monitor individual student performance and to provide help to struggling students before they fall behind; • Parental involvement in efforts to get students to meet standards; • State or district accountability systems with real consequences for adults in schools; and • Use of assessments to help guide instruction and resources, and as a healthy part of everyday teaching and learning. The Education Trust, Inc., 2001

  27. Catching Up the Kids Left Behind • Ensure effective district and school leadership • Understand poverty / hold high expectations for all students • Target low-performing students / schools…start with reading • Align / manage / monitor the curriculum • Create a culture of data and assessment literacy • Institute instructional improvements / capacity • Reorganize time & space • Engage parents, community and schools • Support effective teaching. Teaching the Children of Poverty: Catching Up the Kids Left Behind, Robert Barr & W.H. Parrett, 2005

  28. Now, Are You Ready For Some Football?

  29. Phases of Writing the CIP (and winning the game) • Pre-Game Prep • 1st Quarter • 2nd Quarter • 3rd Quarter • 4th Quarter • Post-Game Celebration and Analysis

  30. Tasks Completed During the Pre-Game • Determine your coaches • Recruit team – assign positions • Review last year’s stats (CIP) • Create a game plan

  31. Let’s Review the Rules of the Game • ODE PowerPoint on the Compliance Requirements of the C.I.P. (Continuous Improvement Plan) The CIP is a way to report your district’s effective progress toward systematic increases in student success and to comply with various ORS and federal requirements

  32. THE CONTINUOUS IMPROVEMENT PLANNING CYCLE – Continuous Improvement for Student Success • PLANNING LOGISTICS: select a planning team; establish a planning process • SELF-EVALUATION: assess district performance; assess district practices; identify achievements and priority concerns • STRATEGIC DECISION-MAKING: identify improvement goals and strategies; develop an action plan; allocate resources; write the plan • IMPLEMENTATION AND MONITORING: monitor implementation; evaluate implementation and impact; revise the plan. Review the rules of the game: the Continuous Improvement Cycle

  33. 1st Quarter Begins! SELF-EVALUATION • Assess district/school performance • Assess district/school practices • Identify achievements and priority concerns Did you score a first down toward achieving student success?

  34. What does a winning team do? Criteria from CIP Review Guide • Conclusions are made based on a deep analysis of relevant and current qualitative and quantitative data that connects to the priority, goal or task. • A system is in place where all district staff analyze data to inform decisions at all levels and analysis is shared with all stakeholders. • Analysis of disaggregated data for diverse populations is presented to school staff and stakeholders and used at both school and district levels in planning for improving student achievement. • Priority concerns are integrated with the Oregon Educational Performance Standards and Standards for District Success. Refer to the gold CIP Review Guide

  35. Prompt #1 and Data Analysis Prompt 1: Describe the district’s progress against each of the 10 Oregon Education Performance Standards relative to the conclusions drawn from the data analysis in the 2005 submission and against the goals stated in the 2005 action plan. Provide a description of any additional state and local data used in the Self-Evaluation.

  36. Data-Driven Dialogue • Write your previous CIP goals on chart paper • Review data used in the previous CIP (2003-2005) – create a linear chart with paper • Predict what the current data will indicate (2005-2007) • Go Visual: create a data wall to compare and contrast – create a linear chart with paper • Observe the data • Infer/Question/Implications for student learning success. Use the Oregon Performance Standards Worksheet
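As one illustration of the compare-and-contrast step, the sketch below lines up the data behind the previous CIP cycle with the current results so the change against each goal is visible on one chart. The standard names, column names, and percentages are invented for the example, and pandas is assumed to be available.

```python
# Rough sketch of the compare-and-contrast step: join previous-cycle
# (2003-2005) results to current-cycle (2005-2007) results and compute
# the change, so the weakest movement surfaces first. All figures are
# hypothetical placeholders, not real district data.
import pandas as pd

previous = pd.DataFrame({
    "standard": ["Grade 3 reading", "Grade 5 math", "Attendance"],
    "pct_meeting_2005": [68, 61, 92],
})
current = pd.DataFrame({
    "standard": ["Grade 3 reading", "Grade 5 math", "Attendance"],
    "pct_meeting_2007": [74, 60, 93],
})

wall = previous.merge(current, on="standard")
wall["change"] = wall["pct_meeting_2007"] - wall["pct_meeting_2005"]
print(wall.sort_values("change"))  # smallest (or negative) gains float to the top
```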

  37. Pause and reflect • What student learning problems seem to be emerging? • What strategy or process might be transferred to work within your district? • Did you find you had to remind yourselves of the step in the process you were in? What did that look like?

  38. Prompt 2 and Data Analysis Prompt 2: Describe the district’s strategies and programs that enabled your district to make progress against the 10 Oregon Education Performance Standards.

  39. Cheerlead your team! On your data wall, indicate success with a bright sticky note and a star. Ask every school to document what programs or strategies enabled this success. This is homework! Share the successes

  40. Prompt 3 and Data Analysis Looking at how we can improve our performance to win the game Prompt 3: Discuss the reasons (causes) that contributed to any expectations that were not realized in relation to the 2005-2007 goals. Use the Standards for District Success to guide this discussion.

  41. Identifying Causes • Recognize areas where goals were not achieved. • Where are the achievement gaps? State the learning problem with detail. • Analyze the causes using the Fish Diagram and the Six Standards for District Success • Review implementation of the action plan. Was it monitored regularly? • Could this have been a flawed action for the learning problem? • What additional data would help clarify here? • Summarize the 2007 priority concerns and their causes/contributing factors • No shame, no blame, no excuses

  42. 1 Data Source + 2 Data Sources = Triangulation: multiple measures – OSAT/TESA, AP tests, common assessments, classroom multiple choice, performance tasks, performance assessments, open-ended items, interviews, and SEC alignment data – converge on a single student learning problem

  43. Triangulate: What? Why? How? • Triangulation is a process of using three or more sources of data to “build confidence in the accuracy of particular data.” • The goal is to determine if nearly all persons believe that the data accurately measured what they said they had measured. • Triangulation provides a picture of students’ understanding through multiple measures • Can combine quantitative and qualitative data – data from three sources, data collected by three methods, or three different kinds of data. Carr & Artman, 2002
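As a concrete illustration of the idea, the sketch below checks whether three hypothetical data sources point at the same weak area. The source names, strand names, percentages, and the 40% cut-off are all assumptions made for the example, not part of the CIP materials.

```python
# Illustrative sketch of triangulation: three hypothetical data sources
# each report the share of students below proficiency by strand. A weak
# area flagged by all three sources earns more confidence than one
# flagged by a single assessment.
below_proficiency = {
    "state assessment":           {"number sense": 0.31, "math communication": 0.64},
    "district benchmark":         {"number sense": 0.28, "math communication": 0.52},
    "classroom performance task": {"number sense": 0.25, "math communication": 0.58},
}

threshold = 0.40  # hypothetical cut-off for calling a strand "a concern"
flagged = [
    {strand for strand, pct in source.items() if pct >= threshold}
    for source in below_proficiency.values()
]
confirmed = set.intersection(*flagged)
print("Weak areas confirmed by all three sources:", confirmed)
# -> {'math communication'}
```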

  44. Identify the Student Learning Problem – findings drill down through types of data: aggregate/summary reports, disaggregated results, cluster/strand/subscale, item analysis, and student work, converging on the student learning problem

  45. This is the type of further analysis that might be part of Prompt #4

  46. What we learn from triangulation… • Prepares the Data Team to formulate a learner-centered problem. • The task of the Data Team is to use the findings identified through triangulation to articulate a clear student learning problem. • The Data Team wants to make sure it is focusing on the right problems to solve and choosing areas likely to have a positive impact on student learning.

  47. Cautions for Triangulation • Need for rich data sources that include student work • Need for data sources that measure similar content; e.g., cluster data may not be the same on all assessments • Is the data norm-referenced or criterion-referenced? How can one support the other? • Is the learning problem supported in the other data? How? • Does the data deepen our understanding of student learning problems? • Look at data over time • Disaggregate when you can

  48. Student Learning Problem Statement: Criteria • Who? • Grade level of students • Percent of students • Achievement gap among students (if applicable) • What? • Performance level • Subject • Strand/concept/skill • Based on what evidence? • Types of assessments • Dates of assessments

  49. Student Learning Problem Template (Grade) students at (school/district) are (performance level) in (subject area). Weak areas are (skills, concepts) as evidenced by these data: {List all evidence in this manner} • (%) of students are (performance level) on (date and name of assessment) These performance gaps were noted: • (%) of (race, class, ethnicity, gender, special/regular) students are (performance level), while (%) of (race, class, ethnicity, gender, special/regular) students are (performance level) as evidenced by (dates and names of assessments)

  50. Student Learning Problem Statement: Example Fourth-grade students at Lincoln School are below proficiency in mathematics. Weak areas are mathematical communication and reasoning as evidenced by these data: • 64% of students are below proficiency on the 2004 Nebraska State Assessment • 52% of students are below basic on the 2004 district benchmark assessment • 37% of students scored a 1 on the 2004 school common assessment These performance gaps were noted: • 49% of African American students are below proficiency, while 33% of white students are below proficiency as evidenced by the 2004 Nebraska State Assessment
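Because the template on slide 49 is essentially a fill-in-the-blanks format, a district data team could generate the core statement directly from its figures. The sketch below is one possible way to do that; the function name, arguments, and sample values are hypothetical and simply echo the style of the Lincoln School example.

```python
# Small sketch that fills in the slide-49 template from supplied figures.
# School name, assessment names, and percentages are placeholders for
# illustration, not real results.
def learning_problem_statement(grade, school, level, subject, weak_areas, evidence):
    lines = [
        f"{grade} students at {school} are {level} in {subject}. "
        f"Weak areas are {', '.join(weak_areas)} as evidenced by these data:"
    ]
    lines += [
        f"- {pct}% of students are {level} on the {assessment}"
        for assessment, pct in evidence
    ]
    return "\n".join(lines)

print(learning_problem_statement(
    grade="Fourth-grade",
    school="Lincoln School",
    level="below proficiency",
    subject="mathematics",
    weak_areas=["mathematical communication", "reasoning"],
    evidence=[("2004 state assessment", 64), ("2004 district benchmark assessment", 52)],
))
```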
