
Value-Added Assessment: One Star in the Constellation of Organizational Development and Transformation






Presentation Transcript


  1. Value-Added Assessment: One Star in the Constellation of Organizational Development and Transformation Dr. Jim Lloyd, Assistant Superintendent, Olmsted Falls City Schools

  2. Advanced Organizers • Olmsted Falls is a SOAR District • Olmsted Falls will become part of BFK’s T-CAP • Lloyd (2008), DVAS, reported: the need for further PD related to using data to impact teaching and learning; the need to “fit” EVAAS in with other data sets; the need to use EVAAS as an improvement tool

  3. Objectives of the Presentation Understand the following points: • Value-added data is one very important component of the continuous improvement process. • EVAAS is a rear-view-mirror analysis. • The story behind the added value is what matters most. • Special programs do not lead to increases in student achievement or progress; changes in adult behavior do. • Play “small ball” rather than trying to hit a grand slam: get teachers to begin to do things differently and to share those experiences.

  4. What’s in your folder? • Part III of a presentation that I gave to our middle school staff last year, along with the exploration questions that were created for the groups • An article from the Principal Navigator • Chapter V • OFCS Power Walkthrough Template • OLAC Leadership Development Framework

  5. What did Sanders & others tell us?

  6. Factors related to student learning: District, School, and Teacher influence on student progress. The following inferences were shared at the Governors’ Education Symposium (2004). Based on 22 years of value-added study, Dr. Sanders draws the following conclusions about how variation in student academic progress can be attributed: • 5% to district quality • 30% to school quality • 65% to teacher quality

  7. Influences on student achievement • Socio-economic status • Early educational opportunities • Parents’ educational level • School factors

  8. Influences on student PROGRESS/GROWTH • Teacher quality: use of formative assessment, clear learning targets, quality instructional practices • School effects: clear mission/vision, goal setting • District effects

  9. Things People Will Say about EVAAS • Districts & schools with high achievement scores can’t make gains to demonstrate growth…this model isn’t fair. • This model isn’t reliable and valid…there is discrepant research in the field about it.

  10. How often do students score within the Top 3 Scaled Score Points two years in a Row?

  11. How did the suburban districts do, in particular? The highest percentage of students scoring within the top three scaled scores two years in a row was a little over 2%. Five wealthy Ohio suburban school districts had the following best rates of students scoring within the top three scaled scores two years in a row: • District A – 2/172 (1.16%): 8th gd. Reading • District B – 7/612 (1.14%): 5th gd. Math • District C – 5/266 (1.88%): 4th gd. Math • District D – 1/77 (1.30%): 4th gd. Math • District E – 1/58 (1.72%): 5th gd. Math. These were the highest rates these districts saw, for any grade, for students repeating top-3 scaled-score performances across years within an OAT subject.
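The district rates above are simple proportions of students tested. A minimal sketch, using only the counts quoted on the slide, reproduces the percentages:

```python
# Recompute the "top-3 scaled scores two years in a row" rates from slide 11.
# Counts are taken directly from the slide: (students repeating, students tested).
districts = {
    "District A (8th gd. Reading)": (2, 172),
    "District B (5th gd. Math)": (7, 612),
    "District C (4th gd. Math)": (5, 266),
    "District D (4th gd. Math)": (1, 77),
    "District E (5th gd. Math)": (1, 58),
}

for name, (repeats, tested) in districts.items():
    rate = 100 * repeats / tested
    print(f"{name}: {repeats}/{tested} = {rate:.2f}%")
```

Running this confirms the slide's figures (1.16%, 1.14%, 1.88%, 1.30%, 1.72%), all well under the "a little over 2%" ceiling.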

  12. Organizational Development Through Collaborative Exploration Work of the Ohio Leadership Advisory Council (OLAC) Things You Should Consider Establish a District Leadership Team Establish Building Leadership Teams Work on the work

  13. About exploration… Excellent with Distinction doesn’t mean much when you don’t know exactly why. We needed to look at data points in order to see our constellation.

  14. The Leadership for Learning Framework (Reeves, 2006)

  15. The Olmsted Falls “Effect” Constellation • End of Course Exams • SAT/ACT • OAT Data • Implementation Data • Classroom Walkthroughs • SOAR • Graduation Data • CASL Data • EVAAS Data • Perception Data

  16. We’re working on clearly defining the “Cause” constellation now

  17. Our exploration mechanism

  18. Our process • Conduct a cause-and-effect analysis • Use an array of data points, including both SOAR and ODE value-added information • Define a very limited number of goals • Our district foci: get better at two things, clarity of learning targets and student feedback

  19. OFCS Goal Stated in measurable terms: By 2011, OFCS will have experienced a 5% increase in proficient students in all buildings in each core subject area when compared to 2008 baseline performance, as measured by the OAT and OGT. (Specific, Measurable, Attainable, Results-oriented, Time-bound.) Increase student proficiency in all buildings in the core…does this mean we should only aim for proficiency? NO! J_Lloyd_2008
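The goal leaves "a 5% increase" open to two readings (five percentage points vs. a 5% relative gain). A quick sketch shows how the two targets differ; the 80% baseline here is purely illustrative, not district data:

```python
# Hypothetical 2008 baseline proficiency rate (illustrative only, not OFCS data).
baseline = 80.0

# Reading 1: "5% increase" as five percentage points above baseline.
target_points = baseline + 5.0

# Reading 2: "5% increase" as a 5% relative gain over baseline.
target_relative = baseline * 1.05

print(f"Percentage-point reading: {target_points:.1f}% proficient")
print(f"Relative reading:         {target_relative:.1f}% proficient")
```

Whichever reading is intended, stating it explicitly keeps the goal measurable against the 2008 baseline.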

  20. How will we accomplish this? Strategy: Deconstruct the most important learning targets by content area into degrees of cognitive complexity, then implement and monitor them, in order to articulate their meaning more clearly to students.

  21. What evidence do we need to measure our progress? • Make the learning targets clearer for students in the core curriculum in grades PreK–12. • Create an implementation system to determine whether or not the essential learning targets are clear to students prior to, during, and after instruction. • Develop a balanced assessment system that emphasizes formative feedback to students during learning and has points of data collection after learning. • Provide time and support for teachers to collaborate on student learning.

  22. Making the Learning Targets Clearer

  23. Clarity of Learning Targets

  24. Why clarity? It establishes where the learners are in their learning. It establishes where they are going. It provides them with advanced organizers on how to get there. If we don’t start with clear targets, we won’t end with sound assessments.

  25. What do we mean by clarity? • Start with considering all indicators • Identify PIs by content area for each grade level • Link PIs to course content and course descriptions • Write learning targets in student- and parent-friendly language • Unwrap learning indicators for the standards in order to identify concepts, skills, Essential Questions & Big Ideas • Use a learning taxonomy to identify the complexity of learning targets

  26. Benefits of Clarity • Research indicates students can hit targets they can see • Increases opportunities for formative assessment and student feedback • Teachers talking about and agreeing on targets makes them clearer to everyone • Posting targets in the classroom and talking about them before, during, and after instruction makes them more relevant • Breaking targets down by complexity makes them clearer to everyone

  27. PD Implications of Clarity • Identify Power Indicators and actually use them to make the learning targets clearer for students • Student-friendly learning targets prior to, during, and after lessons • Big Ideas and Essential Questions prior to, during, and after lessons • Asking students if the targets are clear • Monitor the implementation of our professional development to ensure it is changing instructional practice (classroom walkthroughs)

  28. The Power of Student Feedback

  29. High quality assessment is indistinguishable from high quality instruction

  30. What do we know about classroom assessment? Finding 1: Classroom assessment feedback should provide students with a clear picture of their progress on learning goals and how they might improve. Hattie (1992); Hattie & Timperley (2007); Bangert-Drowns, Kulik, Kulik & Morgan (1991): Telling students only whether they were correct or incorrect had a negative effect on their learning. Explaining the correct answer and having them refine their responses was associated with gains in learning (20 percentile points).

  31. What do we know about classroom assessment? Finding 1: Classroom assessment feedback should provide students with a clear picture of their progress on learning goals and how they might improve. Fuchs & Fuchs (1986), who analyzed 21 studies: Graphic displays of results enhance student learning. Results interpreted by a set of rules (like a rubric) enhanced student achievement by 32 percentile points.

  32. What do we know about classroom assessment? Finding 2: Feedback on classroom assessment should encourage students to improve. Kluger & DeNisi (1996): The manner in which feedback is communicated greatly affects whether it has a positive or negative effect on achievement. When feedback is negative, it decreases achievement by 5.5 percentile points.

  33. What do we know about classroom assessment? Marzano (2006) identified 2 characteristics of effective feedback. Feedback must provide students with a way to interpret even low scores in a manner that does not imply failure. Feedback must help students realize that effort on their part results in more learning.

  34. 65% D

  35. What do we know about classroom assessment? Finding 3: Classroom assessment should be formative Black & Wiliam (1998)—analyzed 250 studies Formative assessment done well results in student achievement gains of about 26 percentile points. It has the highest impact on those students who have a history of being low achievers.
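The "percentile point" gains quoted in these meta-analyses are conversions from standardized effect sizes: an effect size d moves the average (50th-percentile) student to the Φ(d) percentile of the comparison distribution. A minimal sketch, assuming normally distributed scores; the 0.4–0.7 range is the effect-size band commonly attributed to the Black & Wiliam review:

```python
from statistics import NormalDist

def percentile_gain(d: float) -> float:
    """Percentile-point gain for the average student given effect size d:
    the 50th-percentile student moves to the Phi(d) * 100 percentile."""
    return NormalDist().cdf(d) * 100 - 50

# 0.4-0.7 is the effect-size range commonly attributed to Black & Wiliam (1998);
# d = 0.7 corresponds to the roughly-26-percentile-point figure on the slide.
for d in (0.4, 0.7):
    print(f"d = {d}: about {percentile_gain(d):.0f} percentile points")
```

The upper end of that band (d = 0.7) converts to a gain of about 26 percentile points, matching the figure quoted on the slide.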

  36. Our definition FORMATIVE ASSESSMENT …is a planned process in which assessment-elicited evidence of students’ status is used by teachers to adjust their ongoing instructional procedures or by students to adjust their current learning tactics. Popham, W. J. (2008). Transformative assessment. Alexandria, VA: ASCD.

  37. What do we know about classroom assessment? Finding 4: Formative classroom assessments should be frequent. Bangert-Drowns, Kulik & Kulik (1991), a meta-analysis of 29 studies: The frequency of formative classroom assessments is related to student achievement.

  38. The Power of Feedback, gains in student achievement • For SPED students: 39 percentile points • Cues & corrective feedback: 37 percentile points • Cues, participation, reinforcement & corrective feedback: 27 percentile points • Reducing class size: 5 percentile points • Rewards & punishment: 5 percentile points • Teacher praise: 4 percentile points

  39. PD Implications of Feedback Establish data/learning teams and structure collaborative time Provide opportunities for teachers to learn and share feedback strategies Have teachers observe each other to see how it occurs Monitor the implementation of our professional development to see if it is changing instructional practice (classroom walkthroughs)

  40. Close Your Knowing-Doing Gap Implement and monitor the things that you’re already doing Provide people with time to reflect on the results
