
2.3.4 MVNU Annual Education Unit & Program Assessment


Presentation Transcript


  1. 2.3.4 MVNU Annual Education Unit & Program Assessment 2006-07 Part I

  2. MVNU Performance Assessment System
  • The MVNU Conceptual Framework, SPA, and Ohio Standards drive expected competency levels.
  • Competencies are demonstrated by selected artifacts, collected at Program Gates.
  • Artifacts are evaluated by department-approved rubrics, which generate assessment scores.
  • Scores are checked for assessment validation & reliability.
  • Data are collected and analyzed in Chalk & Wire and the university database.
  • Results are aggregated for program/unit review & improvement, reported by research questions driven by the Conceptual Framework and Ohio Standards.
  • Results are used for candidate admissions, retention, and program completion.
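  To make the flow above concrete, here is a minimal, hypothetical sketch of one artifact score moving through the system: collected at a gate, scored with a department-approved rubric, and kept for later aggregation and review. The field names and values are illustrative assumptions, not the actual Chalk & Wire or university-database data model.

```python
from dataclasses import dataclass

# Minimal sketch of one record in the flow described above.
# Field names are illustrative assumptions, not the Chalk & Wire schema.
@dataclass
class ArtifactScore:
    candidate: str
    program: str         # licensure/program area
    gate: int            # transition point where the artifact is collected
    assessment: str      # key assessment aligned to the Ohio Standards
    rubric_score: float  # generated with the department-approved rubric

# Scores like these are checked for validity/reliability, aggregated for
# unit review, and inform admission, retention, and completion decisions.
records = [
    ArtifactScore("Candidate A", "AYA Mathematics", 2, "Unit Plan", 3.5),
    ArtifactScore("Candidate B", "Early Childhood", 2, "Unit Plan", 3.0),
]
gate_2_mean = sum(r.rubric_score for r in records) / len(records)
print(f"Gate 2 mean rubric score: {gate_2_mean:.2f}")
```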

  3. Reviewing How the System Works with Additions & Changes
  • Unit Assessments
    • Includes both Departments
    • Only Key Assessments are reported
    • Assessments are aggregated from multiple measures at different transition points (Gates); see the sketch after this list
  • Program Assessments
    • Assessments are disaggregated by program area
    • Inform the Unit of candidate performance by program
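  The aggregation (Unit view) and disaggregation (Program view) described above can be illustrated with a short sketch, assuming a flat export of key-assessment rubric scores. The column names and scores below are hypothetical, not the actual Chalk & Wire report format.

```python
import pandas as pd

# Hypothetical export of key-assessment scores; columns are assumptions.
scores = pd.DataFrame({
    "candidate": ["A", "A", "B", "B", "C", "C"],
    "program":   ["Early Childhood", "Early Childhood",
                  "AYA Mathematics", "AYA Mathematics",
                  "Intervention Specialist", "Intervention Specialist"],
    "gate":      [1, 2, 1, 2, 1, 2],
    "score":     [3.0, 3.5, 2.5, 3.0, 3.5, 4.0],
})

# Unit view: scores aggregated across programs at each gate (transition point).
unit_by_gate = scores.groupby("gate")["score"].agg(["mean", "count"])

# Program view: the same scores disaggregated by program area.
program_by_gate = (scores.groupby(["program", "gate"])["score"]
                         .mean()
                         .unstack("gate"))

print(unit_by_gate)
print(program_by_gate)
```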

  4. Logistics
  • 2005-2006: Transition from assessments based solely on Department Outcomes to INTASC Standards
  • 2006-2007: Transition from INTASC Standards to Ohio Standards for the Teaching Profession
  • 2007-2008: All assessments on Chalk & Wire or CARS!

  5. How will last year’s (2006-07) report look?
  • Answer: much like the reports for Fall Semester 2006.
  • The most significant difference is the linkage to the Ohio Standards for the Teaching Profession.
  • Let’s look at the major Unit Assessment tables (without the data).

  6. Unit Key Assessments by Gates (Transition Points)

  7. Unit Performance Triangulation Table

  8. Candidate Performance by Program

  9. Additional Reports by Program
  • Candidate Demographics (C&W)
  • Interrater Reliability (C&W)
  • Rubric Scores by Percentage, Mean, Median, Std. Dev. (C&W)
  • Scores by Licensure Area; by Gate (C&W)
  • Customized Reports upon request
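  Reports like these are generated inside Chalk & Wire, but the underlying calculations are simple to sketch. Assuming two raters score the same set of artifacts (the ratings below are made up), the descriptive statistics and one common interrater-reliability measure look like this; how C&W computes its own interrater statistic is not specified in these slides.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings from two scorers on the same eight artifacts.
rater_1 = np.array([3, 4, 2, 3, 4, 3, 2, 4])
rater_2 = np.array([3, 4, 3, 3, 4, 3, 2, 3])

# Descriptive statistics of the kind listed above.
print("mean:",   rater_1.mean())
print("median:", np.median(rater_1))
print("std:",    rater_1.std(ddof=1))

# Interrater reliability: simple percent agreement, and Cohen's kappa
# (exact agreement between the two raters, corrected for chance).
print("agreement:", (rater_1 == rater_2).mean())
print("kappa:", cohen_kappa_score(rater_1, rater_2))
```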

  10. Are Candidates Ready for the Gates? Chalk & Wire is Ready… CARS is Ready…

  11. Department action following candidate review at each Gate:
  • Pass: The candidate progresses to the next Program Gate or completes the program.
  • Pass/Conditional: The candidate continues in the Program while all remaining Gate requirements are completed.
  • Pending: The candidate remains at the Program Gate until the successful completion of a Department Action Plan.
  • Fail: The candidate is dismissed from the Program.
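  The four outcomes above amount to a small decision rule. Here is a minimal sketch, assuming the review produces a list of unmet requirements plus flags for an action plan or dismissal; the actual criteria the Departments apply are not spelled out in the slides.

```python
def gate_decision(unmet_requirements, needs_action_plan=False, dismissed=False):
    """Illustrative mapping of a gate review to the four outcomes above.

    The inputs are assumptions about what a review produces; the real
    Department criteria are not defined in this presentation.
    """
    if dismissed:
        return "Fail"             # candidate dismissed from the Program
    if needs_action_plan:
        return "Pending"          # remains at the Gate until the Action Plan is done
    if not unmet_requirements:
        return "Pass"             # progresses to the next Gate or completes
    return "Pass/Conditional"     # continues while remaining requirements are met

print(gate_decision([]))                             # Pass
print(gate_decision(["outstanding requirement"]))    # Pass/Conditional
print(gate_decision([], needs_action_plan=True))     # Pending
```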

  12. Next Step: Part II
  • An analysis and review of Unit data for:
    • 2005-06
    • 2006-07
  • Preparation for 2007-08

  13. MVNU Annual Education Unit & Program Assessment 2006-07 Part II

  14. 2006-07 Unit Data Report
  • Points to consider:
    • Within-Gate Data
    • Across-Gates Data
    • Across-Years Data
    • Unusually High Scores
    • Unusually Low Scores
  • Review: what does the Triangulation Table tell us?
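  One way to operationalize the "unusually high/low" points above is a simple deviation check across gates and years. This is only a sketch with made-up unit means and an arbitrary 1.5-standard-deviation cutoff; the real Unit data live in Chalk & Wire and the university database.

```python
import pandas as pd

# Made-up unit mean scores by year and gate, purely for illustration.
unit_means = pd.DataFrame({
    "year":  ["2005-06"] * 3 + ["2006-07"] * 3,
    "gate":  [1, 2, 3, 1, 2, 3],
    "score": [3.1, 3.3, 3.4, 3.2, 2.4, 3.9],
})

# Flag scores more than 1.5 standard deviations from the overall mean.
# The 1.5-sigma cutoff is an arbitrary illustrative choice.
mean = unit_means["score"].mean()
std = unit_means["score"].std(ddof=1)
unit_means["flag"] = pd.cut(
    (unit_means["score"] - mean) / std,
    bins=[-float("inf"), -1.5, 1.5, float("inf")],
    labels=["unusually low", "typical", "unusually high"],
)
print(unit_means)
```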
