
Improving Teacher Quality Grants, Cycle 3: External Evaluation Report




  1. Improving Teacher Quality Grants, Cycle 3: External Evaluation Report December 8, 2006 University of Missouri-Columbia Evaluation Team

  2. Evaluation Team Principal Investigators: Sandra Abell, Fran Arbaugh, James Cole, Mark Ehlert, John Lannin, Rose Marra. Graduate Research Assistants: Kristy Halverson, Kristen Hutchins, Zeynep Kaymaz, Michele Lee, Dominike Merle, Meredith Park Rogers, Chia-Yu Wang

  3. Context of the Evaluation • Improving Teacher Quality grants program, Cycle 3, 2005-2006 • Focus on high-need schools • 9 professional development projects • Science and Mathematics, grades 4-8

  4. Evaluation Model Adapted from Guskey, 2000

  5. Purpose of Evaluation • Formative evaluation • PD environment evaluation • Summative evaluation • Participant reaction • Participant learning—content knowledge and inquiry • Participant use of knowledge • Organization change • Student learning

  6. Methods—Formative • Site visits • Interviews: teachers and staff • Observations • Formative feedback report

  7. Methods—PD Environment • Teacher Participant Data Questionnaire • Site visits • Interviews: teachers and staff • Observations • Surveys to PIs (Teaching Philosophy Survey and Seven Principles) • PI preliminary report

  8. Methods—Outcomes • Participant reactions • Site visits • Teacher Participant Survey 1 and 2 • Participant learning—content knowledge • Project-specific tests (all 9 projects) • Participant learning—inquiry • Teaching Philosophy Survey • Seven Principles • Participant use of knowledge • Teacher Participant Survey • Interviews • Seven Principles • Implementation Logs

  9. Methods—Outcomes • Organization change • Higher Education Impact Survey • Student learning • Teacher-assessed (3 projects) • Teacher Participant Survey • MAP analyses

  10. Participant Summary • 252 participants • 86% female; 81% white • 40% held a master's degree or higher • 76% held their first bachelor's degree in a field other than science or math • Represented 76 different Missouri school districts, 6 private schools, and 2 charter schools • Directly impacted 16,747 students in 2005-2006

  11. Assigned Teaching Levels and Subjects of Participants

  12. Change in Teaching Assignment

  13. Teaching Experience

  14. Elem/Middle/Junior High Certification Status

  15. High School Certification Status

  16. PD Hours Completed in Past 3 Yrs

  17. Experience with Web-based PD in Past 3 Years

  18. Percentage of Participants from High-Need Districts

  19. PD Coverage – Schools and Teachers

  20. Results • PD Environment • Participant Reactions • Outcomes • Participant Content Knowledge • Participant Knowledge of Inquiry • Participant Use of Knowledge of Inquiry • Organization Change • Student Learning

  21. PD Environment—Projects by Standards Area

  22. PD Environment—PI Beliefs (n=19) least constructivist response = 1, neutral = 3, most constructivist = 5

  23. Participant Reactions 1-5 scale

  24. Participant Performance on Content Knowledge—Post/Pre Tests Posttest scores presented as a percent of pretest scores.
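The post/pre metric on this slide is straightforward arithmetic; a minimal sketch of the calculation, using made-up scores (the report's project-level test data are not reproduced here):

```python
def post_as_percent_of_pre(pretest_mean: float, posttest_mean: float) -> float:
    """Express a posttest mean score as a percentage of the pretest mean.

    Values above 100 indicate gains in content knowledge from pre to post.
    """
    return posttest_mean / pretest_mean * 100.0

# Hypothetical illustration (not actual project data):
print(post_as_percent_of_pre(64.0, 80.0))  # prints 125.0, i.e. a 25% gain
```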

  25. Participant Change in Inquiry Knowledge *p < .05. **p < .01. ***p < .001.

  26. Participant Change in Inquiry Usage *p < .05. **p < .01. ***p < .001.
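The starred significance levels on the two slides above reflect standard pre/post comparisons; a minimal sketch of the paired t statistic that such tests rest on, with hypothetical teacher scores (the report's actual analyses and data are not reproduced here):

```python
import math

def paired_t(pre, post):
    """Paired-samples t statistic for matched pre/post scores.

    The p-value behind the *, **, *** markers would come from the
    t distribution with len(pre) - 1 degrees of freedom.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_diff = sum(diffs) / n
    # Sample variance of the differences (n - 1 in the denominator)
    var = sum((d - mean_diff) ** 2 for d in diffs) / (n - 1)
    return mean_diff / math.sqrt(var / n)

# Hypothetical 1-5 scale ratings for five teachers (not actual project data):
pre = [2.0, 3.0, 2.5, 3.5, 2.0]
post = [3.0, 3.5, 3.0, 4.0, 3.0]
print(paired_t(pre, post))  # a positive t indicates an increase from pre to post
```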

  27. Participant Use of Knowledge Based on PD Components n=116 0-4 scale

  28. Impact on Participant Use of Knowledge (cont.)

  29. Organization Change--Impact on Higher Education • Team members from five projects responded to HEI Survey • Establishment of new science courses related to the PD projects • Establishment of new education courses • Redesign of courses to include more inquiry-based labs • New or strengthened collaborations between education and science • Increased grant writing activity on campus

  30. School-Level Performance on MAP • MAP Index and % in top 2 levels • Served vs. not-served schools, by high-need status • Science: 2005-06 compared to prior years' average performance • Math: no historical comparison possible; examined performance levels by group

  31. Overall Impact of PD Projects

  32. Performance Levels in Science

  33. Performance Levels in Science

  34. Changes in MAP Science Performance – Index Scores

  35. Changes in MAP Science Performance -- Proficiency

  36. Average MAP Index Scores by Grade Level

  37. Average MAP Math Proficiency by Grade Level

  38. Summary of Results • Overall, teachers were satisfied with their PD experiences • Valued most: staff, engaging in activities as students would, opportunities to improve content knowledge, working with other teachers • Valued least: lectures, activities geared toward a different grade level or subject matter than they taught, loosely structured follow-up sessions with no clear purpose

  39. Summary of Results (cont.) • Assessment components were less emphasized than content and inquiry components. • Teachers gained content knowledge. • Evidence of some improved teacher practice attributed to projects. • Student learning data were mixed. • Evidence of impact on higher education is limited but promising in some projects.

  40. Conclusions: Effective Project Design Features • Projects demonstrated effective practice to varying degrees. • Alignment of content emphasis areas between projects and teacher/school needs is critical. • Shared vision/collaboration with team implemented in a variety of ways. • Effective emphasis areas: learning science/math through inquiry; collegial learning with teachers; long-term PD activities; sense of community.

  41. Conclusions (cont.) • The "smorgasbord" approach, while well intentioned, seemed difficult to carry out. • Emphasis on mathematics in the overall Cycle 3 ITQG program was somewhat limited. • Individual projects improve over time. • Balancing the evaluator role between program and projects continues to be an issue.

  42. Limitations • Necessity of sampling. • Instruments align with the overall program, not specific projects. • Low overall response rates • Implementation Logs • End-of-project instruments • Higher education impact • Overall evaluation vs. project-specific evaluation. • Lack and misalignment of student achievement data. • Impact on evaluation due to ongoing team collaboration with PIs and K-12 partners.

  43. Recommendations Project Directors: • Continue to build strong relations among PIs and instructional staff. • Build stronger K-12 partnerships. • Balance content and pedagogy. • Emphasize and provide opportunities for practice and feedback on classroom assessment. • Encourage participation in evaluation activities. • Take advantage of formative feedback. • Use literature on best practice when designing and implementing PD.

  44. Recommendations External Evaluators: • Explore ways to reduce participant time on evaluation. • Be proactive in working with PIs and K-12 organizations. • Continue to work with PIs through all phases of evaluation. • Work with MDHE to examine our roles as evaluators.

  45. Recommendations MDHE: • Continue funding multi-year projects. • Encourage true partnerships via RFP wording and reward systems. • Require that the majority of participants are from high-needs districts. • Require minimum hours of PD per project. • Support PI cross-fertilization of best practices.

  46. Questions. Copies of the report and Executive Summary are available at: www.pdeval.missouri.edu
