
Standard 6 Pilot of the Analysis of Student Work (ASW) Process



Presentation Transcript


  1. May 28, 2013 Standard 6 Pilot of the Analysis of Student Work (ASW) Process NCDPI Reflection & Planning Retreat

  2. Welcome & Overview

  3. NC Student Growth Portfolio Training wikipage

  4. Comfort & Considerations • Restrooms & Breaks • Wireless Network • Electronic Devices • Power Strips & Extension Cords • NCDPI Retreat Planning Doc: http://bit.ly/NCDPIRetreat (for going green . . .)

  5. Agenda for May 28th • Welcome & Overview • Reflection on Standard 6 Pilot Feedback • Planning Groups • Form & Prioritize Year 1 Implementation Needs • Conduct Work Sessions • Debrief & Next Steps

  6. NCDPI Retreat Objectives • Reflect on the Standard 6 Pilot Feedback • Listen carefully • Discuss critically • Think creatively • Help plan and create professional development materials for Year 1 implementation: July 1, 2013 – June 30, 2014

  7. Introductions With the group, please share: • Name & Title(s) • Division • A decision, project, idea, etc. that you got a “green light” on this year (in 12 words or less)

  8. Overview of Spring 2013 Pilot • 100+ educators with 30-35 in each content area • 2/3 Submitters & 1/3 Evaluators • Over 80% of piloters . . . • Submitted work samples • Attended the webinar • Completed the evaluation survey (60 Submitters & 27 Evaluators)

  9. Pilot Progress & Planning Group → Analysis Teams

  10. Going Green . . . from a mind map

  11. Going Green – Jigsaw Approach Each Analysis Team will present a summary of the feedback on their topic from the pilot, with recommendations. Individually, we will consider the information just presented and brainstorm ways to Go Green.

  12. Going Green – Jigsaw Approach After the last presentation and reflection, we will take a break and then use the NCDPI Retreat Planning Doc (http://bit.ly/NCDPIRetreat) for Going Green. As we work, please join a planning group and start prioritizing for Year 1.

  13. Submission Process: Submission Survey Questions 1-7

  14. Feedback Analysis: What we can learn from Venture Labs

  15. Matrix: What will be Submitted and Evaluated

  16. Going Green – Jigsaw Approach Individually, consider the feedback and recommendations just presented and brainstorm ways to Go Green with the Analysis of Student Work (ASW) Submission Process

  17. Evaluator Survey and Descriptive Feedback • Evaluator Survey (Questions 1-9) • Descriptive Feedback (Rounds 1 and 2) • 27 respondents to the survey • Team members: Christie Lynch Ebert (Lead), Mike Martin, Nadine McBride, Tara Patterson

  18. Question 1 (N=27)

  19. Question 2 (74% spent 2-5 hours evaluating submissions)

  20. Question 3 (85% spent 0-3 hours)

  21. Question 4 (70.3% Confident/very confident)

  22. Question 5 • What signs should an Evaluator look for to score a submission as Exceeds Expected Growth? (25 responses) – see narrative • Summary indicators from narrative: • Exemplary/significant growth in relation to the CO • Demonstration of exceeding standards • Quality of submission, meaningful tasks, details in submission descriptions • Concordance between goals/standards, topic/resources/activities, and time and results • Comparison to unpacking documents • Connection from teacher to student's growth is very clear (extraordinary and beyond meeting the expectation of the CO – cannot be quantified in data alone) • Complete analysis of how teaching strategies impacted student growth (rubric or checklist) • Evidence of going above and beyond the scope of what was taught in the lesson

  23. Question 5 • What signs should an Evaluator look for to score a submission as Exceeds Expected Growth? (25 responses) – see narrative • Summary indicators from narrative (continued): • National Board-quality descriptions in writing and significant evidence in student performance • Aligned instruction and assessment with the objective, leading to the student surpassing normal expectations • Analysis of: pre-assessment and gaps; teaching strategies and articulation of their purposeful use; how teaching strategies impacted student growth (not just the end product as evidence, without meaningful, authentic analysis) • Depends on the CO – e.g., a move from Novice Mid to Novice High, not just within the same proficiency level • Depth to the evaluation • Goes beyond the norm according to the standard • Alignment: all required info included, especially the narrative; ID of the Clarifying Objective • Targeted objective was exceeded and was thoroughly explained and demonstrated with and without words

  24. Question 5 • What signs should an Evaluator look for to score a submission as Exceeds Expected Growth? (25 responses) – see narrative • Summary indicators from narrative (continued): • Videos, pictures, student work, self-assessments, and teacher rubrics pre and post • Comparison across more than one student – how students were pushed to grow, how the middle of the class grew, and how the low-performing student grew. Look at more than 2-3 students for credibility. It would be nice to see a whole-class rubric score, then pick out 3-4 students in the class to highlight and show their work. • Look for growth over time and compare to context information and work samples; view the submission holistically • Look at every aspect of Evidence 1 and 2 – how excellent they are and how clearly the growth can be determined

  25. Question 6 (85% spent 0-10 hours)

  26. Question 7

  27. Question 7 (What issues did you have with the submissions you evaluated?) • Narrative responses: • Examples did not adequately address the CO • Too many log-in issues; could not get to submissions • Everything needs to be on one page – need to be able to see it while reading descriptions; flipping back and forth was frustrating • Student sample had incorrect answers or information as a result of the teacher providing incorrect instruction • Too many COs identified to focus on growth • Able to see growth, but lack of meaningful analysis of how the teacher influenced the student's growth • Unclear measurement devices • Submitter not skilled in writing about evidence • Some COs not developmentally appropriate • Submitter only submitted one piece of evidence, which makes it hard to evaluate

  28. Question 8: guidelines • What additional information or materials would have helped you better evaluate the submissions? (23 responses – see narrative) • Guidelines • Complete background information on the duration of the course, minutes or hours per day in the course, number of days of instruction per week, and hours completed prior to collecting pre and post evidence • Concise instructions on what information to submit and how to submit it • Standard form of questions • A running monologue is not conducive to evaluation • Suggest specific questions asked of each teacher, with space for additional info as needed

  29. Question 8: guidelines • What additional information or materials would have helped you better evaluate the submissions? (23 responses – see narrative) • Guidelines • Training on what to focus on and a checklist for feedback – writing anything down as feedback allows too much leeway for untrained teachers/evaluators • More segments of student submissions (beginning, mid-point, end) • The information asked for is more than sufficient if the submitter provides enough data and detail; teachers should not expect to meet expectations based on narrative alone • Consistent tools to measure every student • Too vague, too many COs – helpful to have continuity of COs and proficiency levels

  30. Question 8: guidelines • What additional information or materials would have helped you better evaluate the submissions? (23 responses – see narrative) • Guidelines • Rubric with examples that meet, exceed, or do not meet expectations • Checklist of things to look for when evaluating • Clear criteria for submitters – work samples not enough – how did activities lead to student growth? Articulation of learning styles and complete checklist or rubric for post assessment (missing from samples) • Clearly defined parameters about what “exceed” would look like; more guidelines for “not met” • Printable guide on evaluations • Content-specific points related to scoring and agreement on how to score

  31. Question 8: format • What additional information or materials would have helped you better evaluate the submissions? (23 responses – see narrative) • Format • Samples need common format (jpg, doc, pdf, etc.) • Consistency in acceptable formats (e.g. length of submission and file formats) • Need video to see growth in dance (not pictures) • Easier format to look up submissions and evidence • All materials in one location • Samples labeled correctly • Samples need to be viewed side by side to judge growth • Accessibility to samples (blocked YouTube site) • Easier format

  32. Question 9 • Now that you have completed the pilot, what would you do differently to improve the evaluation process? (23 responses – see narrative) • Specific training for evaluators • Clear, concise instructions on what constitutes evidence; how in-depth should evidence be? • What duration of time is a fair measuring stick for a WL (Arts Ed, HL) teacher? • Can evidence of student growth include products with teacher feedback? • Software – make access easier, more linear and step-by-step • Process: clear examples needed throughout the process • Size and validity of evidence in the portfolio – even in the worst case, isn't it possible to show that 3 or so students have grown?

  33. Question 9 (continued) • Now that you have completed the pilot, what would you do differently to improve the evaluation process? (23 responses – see narrative) • Who will train principals/administrators? • Improve the platform, make the process anonymous, provide evaluator training, and provide specific feedback items • Utilize trained reviewers for success • Reviewers must be content-specific (e.g., you cannot ask a choral music teacher to review a band portfolio) • I am very concerned about K-8 teachers with heavy loads (700-900 students weekly) – how do we adjust for inequities across the state in student contact time, resources, etc.?

  34. Question 9 (continued) • Now that you have completed the pilot, what would you do differently to improve the evaluation process? (23 responses – see narrative) • The platform should have been evaluated prior to this pilot – very frustrating and upsetting; improve user-friendliness • It is all very subjective • Assign COs at the beginning of the year and mandate that all submissions for that level of class focus on those COs – the teacher could pick students and work samples for the pre-determined areas • Guidelines should be more in-depth; it would be nice to have a specific CO, or several different COs, to choose from • The process needs to be clearer and subject-specific

  35. Descriptive Feedback (Rounds 1 and 2) • Document created and posted on the Google site with the scoring guide and descriptive feedback • We have received numerous inquiries from piloters desiring feedback on their submissions, so one action should be to determine how and what might be shared • Based on analysis and conversation with the team, it is recommended that we adopt one of the following (see examples on following slides): • 1. Optional descriptive feedback • 2. No descriptive feedback (this was advised by legal counsel in TN) • 3. Pull-down menu of descriptive feedback or rationale (may need to continue to be built over time) – current data does not support generalized options, so this will need continued work • 4. Slider scale with criteria listed for rating
