
EFFECTIVE USE OF MULTIPLE MEASURES TO IMPROVE TEACHING EFFECTIVENESS








  1. EFFECTIVE USE OF MULTIPLE MEASURES TO IMPROVE TEACHING EFFECTIVENESS Jeff Watson Center for Data Quality and Systems Innovation UW Madison

  2. SUMMARY • Three aspects of designing multiple measure systems • Case study: Using Data to Improve Teacher Preparation • Core challenges

  3. MY BACKGROUND • Systems Engineering • Build systems to support quality and improvement programming • Decision Support Systems, Data Quality, Project Design and Leadership

  4. DEFINE, DESIGN, AND REFINE • Know your goal • What are you trying to do? • Design for that purpose • Methods, processes, models aligned to goals • Manage system quality • Build capacity to improve and maintain • Generate professional knowledge • Inform practice • Measure quality

  5. CASE STUDY: TEACHER PREPARATION • Archibald Bush Foundation’s Teacher Effectiveness Initiative • Improve teacher preparation • Improve how IHEs support their graduates • Inform IHE program improvement

  6. INITIATIVE DESIGN • Building PK-20 partnerships to improve teacher preparation • Recruit • Train • Place • Support • 14 Universities and Colleges; 3 states; approx. 50 LEAs • Data focus • MOUs to build trust • IHEs’ guarantee

  7. TEI: VALUE-ADDED DATA REQUIREMENTS

  8. BUT WHAT ABOUT… • Sense-making of Value-Added Data • How do you improve? • What are the root causes? • How do we support our graduates best? • How do we support our faculty and staff? … IHEs / K12s need more data

  9. ADDITIONAL MEASURES FOR PROGRAM IMPROVEMENT

  10. DATA EXAMPLES • Classroom VA rolled up to the IHE level • Sliced by type of certification, program, cohort, etc. • Benchmarked against the statewide average • State assessments, computer-adaptive assessments • Process data on both the IHE and K12 sides • Pre-service curriculum • Student teaching models (e.g., co-teaching) • Residency programming • Pre-service demographics • Labor supply/demand data
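The roll-up in the first bullets can be sketched minimally. This is an illustrative sketch only: the record shape, function name, and scores below are hypothetical, and a real system would also slice by certification type, program, and cohort as the slide notes.

```python
from collections import defaultdict
from statistics import mean

def roll_up_va(records):
    """Aggregate teacher-level value-added (VA) scores to the IHE level
    and benchmark each IHE against the statewide average.

    records: list of (ihe, va_score) pairs -- a hypothetical shape.
    """
    by_ihe = defaultdict(list)
    for ihe, score in records:
        by_ihe[ihe].append(score)
    # Statewide average computed over all teacher-level scores.
    statewide = mean(score for _, score in records)
    return {ihe: {"mean_va": mean(scores),
                  "vs_state": mean(scores) - statewide}
            for ihe, scores in by_ihe.items()}

# Toy data: two graduates each from two hypothetical IHEs.
records = [("IHE-A", 0.10), ("IHE-A", 0.30),
           ("IHE-B", -0.10), ("IHE-B", 0.10)]
print(roll_up_va(records))
```

The benchmark is stored as a difference from the statewide mean rather than a raw score, which matches the slide's framing of VA as most interpretable relative to a statewide baseline.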

  11. DATA EXAMPLES (CONT.) • Survey data • IHE entrance / exit surveys • Employer surveys • Transition-to-teaching survey (one year into placement) • Observational data • Teacher Performance Assessment (TPA) • IHE-developed observational rubric • K12 Educator Evaluation data?

  12. SOME CONDITIONS FOR SUCCESS • Shared Goals • Partnerships • Trust • Information [Diagram: an IHE at the center of its partner LEAs]

  13. PROGRAM VALUES / BELIEFS IHEs developed social contracts with each other and with their K12 LEA partners. • Teacher preparation is a PK-16 endeavor • Partnerships are required for improvement and yield reciprocal benefits • Supporting teachers is a core goal • Decision-making benefits from data • No one type of data says it all… [Diagram: an IHE at the center of its partner LEAs]

  14. CHALLENGE #1: ACHIEVING PROFESSIONALISM Teachers should be active agents of their own profession and able to make the decisions needed to shape and improve their efficacy. • Is the system we’re building going to support professionalism? • How will this set of measures help teachers achieve mastery? • Are we measuring what we should be measuring?

  15. CHALLENGE #2: RAPID & MASSIVE POLICY SHIFTS Even forward-thinking projects like the TEI have been awash in waves of policy changes that may or may not align with the project's goals and work. • Shifts toward unplanned uses • Policies outpacing technical and organizational capacity • How do we preserve local voice and autonomy?

  16. CHALLENGE #3: BUILDING FOR ACTION We expect that data will help people make better decisions, ask better questions, and better understand what their work entails. • Summative vs. Formative • Long-cycle vs. Short-cycle • Supporting sense-making • What is the planned action? How do we expect people to respond to our measures? How will that work be supported, facilitated, and managed?

  17. CHALLENGE #4: DEVELOPING GOOD MEASURES Measures have to be cultivated; good feedback doesn't just happen. • Measures should be unbiased (i.e., fair): • Metric design / methodology • Data quality / fidelity • Robust over time, space, content, grade level • Local assessments vs. standardized assessments • What are the predictive relationships between measures? • Are all of our measures available to all of our stakeholders?
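A first-pass check on the "predictive relationships between measures" question is a simple correlation between two measures scored for the same teachers. The measure names and values below are hypothetical; this is a sketch of the kind of diagnostic meant, not a validated methodology.

```python
from math import sqrt
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation between two lists of measure scores,
    paired by teacher. Hypothetical use: does an observation
    rubric track value-added estimates for the same teachers?"""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy paired scores for four teachers (illustrative only).
rubric_scores = [2.0, 3.0, 3.5, 4.0]   # observation rubric
va_scores = [-0.1, 0.0, 0.2, 0.3]      # value-added estimates
print(pearson(rubric_scores, va_scores))
```

A high correlation would suggest the two measures capture overlapping signal; a low one would raise exactly the fairness and robustness questions the slide lists, rather than settle them.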

  18. CHALLENGE #5: CAPACITY AND STABILITY Systems have to be built and maintained. Resources, vision, and leadership are key conditions for success. • How do we build systems across leadership changes? • How does our work change when leadership lacks the capacity and/or will to support us?

  19. ADDITIONAL INFORMATION • Bush Foundation’s Teacher Effectiveness Initiative (TEI) http://www.bushfoundation.org/education/network-excellence-teaching-next • Center for Data Quality and Systems Innovation (CDQSI) http://dataquality.wceruw.org/ Contact Information: Jeff Watson jgwatson@wisc.edu
