Explore the process and strategies for improving assessment outcomes in the Department of Education, presented by Helen Thumann, focusing on Praxis exams and Teacher Work Samples that support students' success in the program and licensure. Learn about data analysis, student support, and ongoing program revisions.
Using Assessment Data Helen Thumann Department of Education
Some Background • The Dept of Ed has 8 accredited programs • 7 Undergraduate • 1 Graduate • Each accredited by its own professional association • Each professional association has its own standards • Each program has 6 to 8 required assessments that we use for our accreditation review
Dept of Ed Assessments • These include • Praxis exams (1 and 2) • Teacher Work Samples • Content assessments • Planning Rubric • Student Teaching Evaluation • Disposition Assessment
The Process • We have 4 transition points where each student must submit a portfolio and/or complete performance assessments • Acceptance to program • Acceptance to practicum • Acceptance to student teaching • Completion of program
The Process • Each candidate is reviewed and evaluated by at least 2 (usually 3) faculty members at each point • Cut scores for each assessment are set for each transition point • Students who meet the cut scores move on • Students who don't are usually given a chance to redo the assessment • Students can be dismissed from the program or graduate without a recommendation for a teaching license
The Process • At least once a year we review the scores for all assessments at the transition points to look for trends and/or problems • We use this information to make program revisions • Two Examples: • The Praxis exams • Teacher Work Samples
The Praxis Exams • Praxis 1 exams – Pre-Professional Skills Tests • Reading • Writing • Mathematics • Our accrediting groups require an 80% pass rate on each of the three exams • In the 2001-2002 academic year our pass rate was 47%
What we did • Began requiring students to take the exam • Began providing support for students • Workshops on preparing for the Praxis • Study groups • Tutoring • We aggregated and analyzed the scores • Found that students who scored below a specific range did not usually pass by graduation
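The score analysis described on this slide can be reproduced with a short script. The sketch below is a minimal, hypothetical version: it assumes a CSV of first-attempt scores and eventual pass/fail outcomes (the file name and column names are invented for illustration) and looks for the lowest score band whose eventual pass rate reaches the 80% target set by the accrediting groups.

```python
# A minimal sketch, assuming a hypothetical CSV of first-attempt Praxis 1 scores
# with eventual pass/fail outcomes. File and column names are illustrative only.
import pandas as pd

scores = pd.read_csv("praxis1_reading.csv")
# assumed columns: student_id, first_score, passed_by_graduation (0/1)

# Bin first-attempt scores into 5-point bands.
scores["band"] = (scores["first_score"] // 5) * 5

# Eventual pass rate within each band.
pass_rate_by_band = scores.groupby("band")["passed_by_graduation"].mean().sort_index()
print(pass_rate_by_band)

# A provisional admission threshold: the lowest band whose eventual pass rate
# meets the 80% target.
meets_target = pass_rate_by_band[pass_rate_by_band >= 0.80]
if not meets_target.empty:
    print("Candidate cut score:", meets_target.index.min())
```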
What we did • We began requiring students to score above that threshold on at least 2 of the exams to enter the program • We began requiring students to pass at least 2 of the Praxis 1 exams before student teaching • In 2006-2007 our Praxis 1 pass rate was 82% • This spring our pass rates for student teachers were • Reading 80% • Writing 100% • Math 100%
Next Steps • We are working to provide Praxis preparation support for students BEFORE they enter the program • Focusing on the Praxis 2 pass rates and working with the content areas
Teacher Work Sample (TWS) • Students must complete a Teacher Work Sample based on an assess – teach – assess cycle in their internship site • The TWS has 7 factors • Contextual Factors • Goals & Objectives • Assessment Plan • Design for Instruction • Instructional Decision Making • Analysis of Learner Progress • Reflection & Self Evaluation
TWS (continued) • We began piloting the TWS with student teachers in 2006 • That year our data was incomplete • In fall 2006 we began requiring pre-TWS assignments to be completed in courses prior to student teaching • At least 3 to 4 semesters prior • Students must complete at least 2 pre-TWS before student teaching • We required students to attend a workshop about the TWS
TWS (continued) • In summer 2008 TWS scores were compiled and analyzed for 19 students • Average scores ranged from 74% to 90% across the factors • Factors with the lowest scores • Factor 3 (Assessment Plan) • Factor 6 (Analysis of Learner Progress) • Factors with the highest scores • Factor 1 (Contextual Factors) • Factor 2 (Goals & Objectives)
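A compilation like the one above can also be scripted. The sketch below is hypothetical: it assumes one row per student per factor with earned and maximum rubric points (file and column names are invented) and reports each factor's cohort average as a percentage.

```python
# A minimal sketch, assuming hypothetical TWS rubric data: one row per student
# per factor, with earned points and the factor's maximum points.
import pandas as pd

tws = pd.read_csv("tws_scores.csv")
# assumed columns: student_id, factor (1-7), points, max_points

sums = tws.groupby("factor")[["points", "max_points"]].sum()
pct_by_factor = (100 * sums["points"] / sums["max_points"]).round(1)
print(pct_by_factor)  # low averages flag factors needing attention
```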
TWS (continued) • We analyzed scores by graduate/undergraduate status, hearing status, major, program, evaluator, and a few other categories • Found that • Graduate students did better than undergraduate students • Deaf faculty scored more stringently than hearing faculty • One program had 100% inter-rater reliability
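The group comparisons and the inter-rater check can be scripted in the same way. The sketch below is hypothetical: it assumes a ratings table with one row per rater per student per factor (column names are invented), compares mean scores across groups, and reports the share of student/factor cells on which all raters agreed exactly, a simple stand-in for inter-rater reliability.

```python
# A minimal sketch, assuming a hypothetical ratings table with one row per
# rater per student per factor. Column names are illustrative only.
import pandas as pd

ratings = pd.read_csv("tws_ratings.csv")
# assumed columns: student_id, factor, rater_id, rater_hearing_status,
#                  student_level, program, score

# Mean rubric score by student level and by rater hearing status.
print(ratings.groupby("student_level")["score"].mean())
print(ratings.groupby("rater_hearing_status")["score"].mean())

# Exact inter-rater agreement: share of (student, factor) cells where
# every rater assigned the same score.
agreement = (
    ratings.groupby(["student_id", "factor"])["score"]
    .nunique()
    .eq(1)
    .mean()
)
print(f"Exact inter-rater agreement: {agreement:.0%}")
```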
Next Steps • Continue to address struggles with Factors 3 and 6 • Work on inter-rater reliability for assessments • Continue monitoring all data to inform future program changes