
Assessing the Continued Efficacy of a University Center Model of Teacher Preparation Program








  1. Assessing the Continued Efficacy of a University Center Model of Teacher Preparation Program Diana Lys Kristen Cuthrell Laura Bilbro-Berry

  2. Purpose of the study • Builds upon • Strand 2 – Demonstrating Effectiveness and Relevance by engaging in the process of continuous program improvement. • Prior research which noted the need for more valid and reliable assessments upon which to base program pathway comparisons. • As a new teacher performance assessment – edTPA – is implemented, do candidates in different program pathways continue to have comparable outcomes?

  3. Antecedent Research: University Center Model • Boyd, Lankford, Loeb, and Wyckoff (2005) – geography of first employment relative to hometown and/or college • Lorenzo (2005) – co-location models provide level access • Grady (2005) and Vaughan (2006) – barriers to community college transfer • Locklear (2007) and Locklear et al. (2009) – university center models as comparable preparation programs • Bill and Melinda Gates Foundation, With Their Whole Lives Ahead of Them (2009) – reasons for leaving college early; lack of persistence

  4. Antecedent Research: Teacher Performance Assessments • When assessing preservice teachers, it is important to evaluate their knowledge and skills, student learning, professional dispositions, and reflective practices (Darling-Hammond, 2006). • Performance assessments provide documentation of the teacher's performance, note progress toward reaching program goals, and dissect the program's strengths and weaknesses (Darling-Hammond, 2006). • Portfolio assessments are one type of performance-based assessment, used for formative, summative, and predictive assessment (Bannink, 2009). • Portfolio assessment may be beneficial for certain aspects of the teaching certification process, such as documentation of planning and examples of instruction, but may not be valid for the assessment of teacher competencies (Yao, Thomas, Nickens, Downing, Burkett, & Lamson, 2008).

  5. Traditional Elementary Education Program Overview • Includes face-to-face coursework in pedagogy, knowledge, and skills. • The majority of junior- and senior-level coursework includes practica experiences, many supervised by faculty. • Spiraled instruction is woven throughout program courses in observational skills, planning for diverse learners, research-based instructional strategies, the Common Core curriculum, classroom assessment, and differentiation. • Emphasis on classroom management occurs at the senior level. • Candidates participate in a year-long internship: Senior 1 (1 day a week) and Senior 2 (5 days a week)

  6. University Center Program Overview • Five hub site community colleges, each with several spoke site CCs, creating regional consortia • Each hub site has an IHE employee who works full-time on the CC campus, recruiting, advising, and marketing in the region • Cohort model used; 14 current cohorts exist • Online delivery; same program taught by the same faculty • Part-time delivery model requires 3 ½ years to finish “2” • 443 graduates; 77% teaching within N.C.; 95% teaching in rural eastern N.C.

  7. Implementing edTPA in a Large Elementary Education Program Comparing Program Pathways

  8. Briefly, What is edTPA? • Capstone, summative performance assessment portfolio • Links theory to practice • Includes 3-4 tasks requiring candidates to plan, instruct, and assess • Candidates must video record themselves teaching lessons they plan for a specific group of students. • Nationally validated instrument developed at Stanford University as a measure of teaching proficiency at the pre-service level • Results are summative for candidates and formative for programs • Currently over 25 states and 180 teacher preparation programs have adopted or are considering adoption of edTPA

  9. How is edTPA scored? • 2012 TPA Field Test Handbooks • Evaluators rate candidates’ performances on planning, instruction, assessment, analyzing teaching, and academic language across 12 rubrics. • Each item is rated on a 5-point scale: • 5 = stellar candidate • 4 = solid foundation of knowledge & skills • 3 = acceptable level to begin teaching • 2 = some skills, more practice needed • 1 = struggling candidate, not ready

  10. How were candidates prepared for edTPA? • Revised teacher education curricula were aligned with: • Common Core State Standards • 21st Century Skills • ISLES modules developed as part of the TQP Grant Curriculum Reform • ISLES 3 aligns with edTPA Task 2 • Instructional Coaching Support for Candidates in Partner Districts

  11. Sample and Data Analysis

  12. Study Sample and Methods • Utilized preexisting integrated assessment system databases on candidate performance, competence, and descriptive characteristics. • JMP Pro 9 provided the ability to compare the two sections using matching student ID analysis. • Sample included 132 elementary education degree completers (74 Non-WPE and 58 WPE) from the Fall 2012 semester. • Dataset included teacher performance data: edTPA assessment scores, test scores, GPA, internship grades, and demographic data. • 90 participants were randomly selected for analysis, 45 WPE and 45 Non-WPE candidates.
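The balanced draw described above (45 candidates per pathway from a larger completer pool) can be sketched in a few lines. This is a minimal Python illustration only; the study itself used JMP Pro 9, and the record IDs and seed below are hypothetical:

```python
import random

# Hypothetical completer records standing in for the Fall 2012 dataset:
# each is (student_id, pathway); 58 WPE and 74 Non-WPE, as in the study.
completers = [(i, "WPE") for i in range(58)] + \
             [(100 + i, "Non-WPE") for i in range(74)]

def balanced_sample(records, per_group=45, seed=2012):
    """Draw an equal-size random subsample from each pathway."""
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    groups = {}
    for rec in records:
        groups.setdefault(rec[1], []).append(rec)
    # random.sample draws without replacement within each group
    return {g: rng.sample(members, per_group) for g, members in groups.items()}

sample = balanced_sample(completers)
print({g: len(members) for g, members in sample.items()})
```

Equal group sizes keep the pathway comparison balanced, at the cost of discarding some Non-WPE records.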

  13. Age Characteristics

  14. Ethnicity Characteristics

  15. Final Internship Grade Differentials

  16. Academic Characteristics

  17. edTPA Rubric Scores • Assessed candidates’ performance in each content area by comparison analysis of each rubric score. • Evaluation includes mean, standard deviation, and percent total. • Significant results: • Rubric 8 - Assessment: Using feedback to guide further learning (WPE M = 3.60; Non-WPE M = 3.36) • Rubric 12 - Academic Language: Developing students’ academic language and literacy (WPE M = 3.56; Non-WPE M = 3.26)

  18. WPE vs. Non-WPE - All Rubrics

  19. Rubric 12 - Academic Language: Developing Students’ Academic Language - All Candidates

  20. WPE vs. Non-WPE, with “not met” removed • Further analysis shifted from program pathway comparison to proficiency of candidates. To focus on proficiency, candidates who were not proficient on the edTPA were removed from the analysis. • Excluded 3 “not met” candidates to avoid skewing the data. • WPE: N = 44 • Non-WPE: N = 43 • Significant results: Rubrics 8, 10, & 12

  21. Rubric 8 - Assessment: Using Assessment to Inform Instruction - Proficient Candidates • Competence Scale t-Test Results (95% CI): d = .0263, df = 84.49, p = .048, χ² = 2.813

  22. Rubric 10 - Academic Language: Understanding Students’ Language Development and Associated Language Demands - Proficient Candidates • Competence Scale t-Test Results (95% CI): d = .225, df = 76.75, p = .062, χ² = 3.718

  23. Rubric 12 - Academic Language: Developing Students’ Academic Language - Proficient Candidates • t-Test Results (95% CI): d = .034, df = 83.58, p = .0017, χ² = 5.639
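The fractional degrees of freedom on slides 21-23 (e.g., df = 84.49) are characteristic of Welch's unequal-variance t-test with the Welch-Satterthwaite df correction. A minimal stdlib-only Python sketch of that computation follows; the score vectors are hypothetical stand-ins, not the study's data:

```python
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's unequal-variance t statistic and its
    Welch-Satterthwaite degrees of freedom."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2   # sample variances
    se2 = va / na + vb / nb                 # squared std. error of the mean difference
    t = (mean(a) - mean(b)) / se2 ** 0.5
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical 5-point rubric scores standing in for the WPE / Non-WPE groups.
wpe = [4, 3, 4, 3, 4, 3, 4, 4, 3, 4]
non_wpe = [3, 3, 3, 4, 3, 3, 3, 4, 3, 3]
t, df = welch_t(wpe, non_wpe)
print(round(t, 2), round(df, 1))  # → 1.9 17.3
```

The p-values reported on the slides would then come from the t distribution with that (non-integer) df; when the two groups have equal variances and sizes, the formula reduces to the familiar df = n₁ + n₂ − 2.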

  24. Implications for Other Programs

  25. Issues to Consider when Addressing Program Pathways • Impact of online delivery • Feasibility of supervised practica? • Level of support from faculty • How do we provide DE students the supports offered in face-to-face forums? • Candidate development • Are DE students more effective independent learners in working through edTPA handbooks than non-DE candidates? If so, what traits could be leveraged/taught in face-to-face instruction?

  26. Issues to Consider When Utilizing edTPA Data • Program gateways • Should we have stronger gateways in junior-level classes that are aligned with edTPA? • Candidate characteristics • What is occurring such that WPE candidates outperform non-WPE candidates on the Academic Language rubrics? • Individual rubric scores vs. total average • What considerations should be made when using edTPA scores? • How will this analysis change with the shift to Operational Handbooks with 15 rubrics?

  27. Our Next Steps

  28. What’s NEXT? • Future research should continue to study efficacy of the model with larger population samples. • Future research should investigate the validity and reliability of our performance measures and assessments. • Future research should expand to address other recruitment and retention factors that may influence enrollments.

  29. Questions?

  30. Contact Information • Dr. Diana B. Lys, 252-328-2037, lysd@ecu.edu • Dr. Kristen Cuthrell, 252-328-5748, cuthrellma@ecu.edu • Ms. Laura Bilbro-Berry, 252-328-1123, bilbroberryl@ecu.edu • For a copy of this PowerPoint presentation, please email Dr. Diana Lys
