
Sterling Practices in Design & Scoring of Performance-Based Exams #156


Presentation Transcript


  1. Sterling Practices in Design & Scoring of Performance-Based Exams #156 F. Jay Breyer Jay.breyer@thomson.com Presented at the 2005 CLEAR Annual Conference September 15-17 Phoenix, Arizona

  2. The Players • F. Jay Breyer, PhD Thomson Prometric • Ron Bridwell, PE National Council of Examiners for Engineering & Surveying • Beth Sabin, DVM, PhD American Veterinary Medical Association • Ron Rodgers, PhD CTS/Employment Research & Development • Elizabeth Witt, PhD American Board of Emergency Medicine Presented at the 2005 CLEAR Annual Conference September 15-17 Phoenix, Arizona

  3. Scoring Procedures for STRUCTURAL II Ron Bridwell, P.E. September 2005

  4. Introduction • The Exam • Before the Scoring Session • The Scoring Process • The Cut Score (Passing Point) Process Presented at the 2005 CLEAR Annual Conference September 15-17 Phoenix, Arizona

  8. The Exam Scoring Protocol Development • Need to standardize the STR II scoring guidelines using a benchmark holistic method. • Scoring can drift due to fatigue or anger. Scoring Criteria Development • Developed by the exam committee as the problems are developed. • Candidates may respond differently. Presented at the 2005 CLEAR Annual Conference September 15-17 Phoenix, Arizona

  9. Before the Scoring Session Tasks Identifying Scoring Committee Members • Most familiar with problems • Coordinators work with staff • Empowered to modify criteria as needed. Presented at the 2005 CLEAR Annual Conference September 15-17 Phoenix, Arizona

  10. Before the Scoring Session Tasks Identify Sample Papers • 5 benchmarks for training • Range finders for training • 5 benchmarks for certification Presented at the 2005 CLEAR Annual Conference September 15-17 Phoenix, Arizona

  11. The Scoring Process Tasks Training the Scorers • Scorers should be skilled at assigning scores to specific problems • Scorers are trained with benchmark papers Certifying Scorers • 5 benchmark papers are given to scorers as a test • Pass or Fail • Scorers have two chances to be certified Presented at the 2005 CLEAR Annual Conference September 15-17 Phoenix, Arizona
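
To make the certification step concrete, here is a minimal Python sketch. The presentation only says that five benchmark papers are scored as a pass/fail test with two attempts allowed; the specific pass rule below (at most one benchmark score off by more than one point) and the function name `certify` are assumptions for illustration, not the actual NCEES procedure.

```python
# Hypothetical sketch of scorer certification against the 5 benchmark papers.
# The pass rule (at most one score off by more than one point) is an assumption
# for illustration; the presentation does not state the exact criterion.

def certify(scorer_scores, benchmark_scores, tolerance=1, max_misses=1):
    """Return True if the scorer's scores track the committee's benchmark scores."""
    misses = sum(abs(s - b) > tolerance
                 for s, b in zip(scorer_scores, benchmark_scores))
    return misses <= max_misses

# Five benchmark papers: one score is two points off, which this rule still allows.
print(certify([3, 4, 2, 5, 1], [3, 4, 4, 5, 1]))  # True
# Two scores off by two points each would fail; the scorer gets one more attempt.
print(certify([1, 4, 2, 5, 1], [3, 4, 4, 5, 1]))  # False
```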

  12. The Scoring Process Tasks Scoring • Care is taken to ensure the scorers do not know the names or jurisdictions of the examinees • Papers are scored blind, as if by machine • Each paper is scored by two scorers • If the scores agree or differ by no more than 1 point, the average of the two scores is assigned • If they differ by more than 1 point, the coordinator adjudicates • Any scorer can be replaced by any other and the same score would result • The database provides feedback Presented at the 2005 CLEAR Annual Conference September 15-17 Phoenix, Arizona
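
The two-reader resolution rule lends itself to a short code illustration. The Python sketch below implements only what the slide states: average the two scores when they agree or differ by at most one point, otherwise route the paper to the coordinator. The function name and return shape are hypothetical, not part of any actual scoring system.

```python
# Minimal sketch of the two-scorer resolution rule described on the slide.

def resolve_score(score_a, score_b, tolerance=1):
    """Return (final_score, needs_adjudication) for one paper.

    Scores that agree or differ by no more than `tolerance` are averaged;
    larger discrepancies are flagged for the coordinator to adjudicate.
    """
    if abs(score_a - score_b) <= tolerance:
        return (score_a + score_b) / 2, False
    return None, True

print(resolve_score(3, 4))  # (3.5, False): within one point, averaged
print(resolve_score(2, 4))  # (None, True): coordinator adjudicates
```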

  13. Kinds of Information: Monitoring Solution for Fair Scoring, Report Components (the road to fair & quality scores)
  • Adjudication Resolution Training: Number of papers to be adjudicated • Total required adjudications by scorer • Re-training may be necessary if too many
  • Discrepancy: Shows how many papers scored • Shows consistency • Shows consistency of each scorer paired with all partners • Useful for scoring reliability • Aggregate & separate agreement
  • Summary: Number, Mean, SD read by each scorer & coordinator by problem • Number, Mean, SD across the entire test for each scorer and coordinator • Bookkeeping • Records
  Presented at the 2005 CLEAR Annual Conference September 15-17 Phoenix, Arizona
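
The report components above reduce to descriptive statistics and agreement counts, which the following Python sketch illustrates. It assumes each reading is stored as a (paper, scorer, score) record and uses a one-point adjacency tolerance; both assumptions are for illustration and do not describe the actual monitoring database.

```python
# Illustrative per-scorer summary (N, mean, SD) and pairwise agreement rate,
# assuming each reading is recorded as (paper_id, scorer_id, score).

from collections import defaultdict
from statistics import mean, pstdev

def scorer_summary(readings):
    """Number of papers read, mean, and SD for each scorer."""
    by_scorer = defaultdict(list)
    for _, scorer, score in readings:
        by_scorer[scorer].append(score)
    return {s: {"n": len(v), "mean": round(mean(v), 2), "sd": round(pstdev(v), 2)}
            for s, v in by_scorer.items()}

def agreement_rate(readings, tolerance=1):
    """Share of doubly-read papers whose two scores fall within `tolerance`."""
    by_paper = defaultdict(list)
    for paper, _, score in readings:
        by_paper[paper].append(score)
    pairs = [v for v in by_paper.values() if len(v) == 2]
    return sum(abs(a - b) <= tolerance for a, b in pairs) / len(pairs) if pairs else None

readings = [(1, "A", 3), (1, "B", 4), (2, "A", 2), (2, "C", 4), (3, "B", 5), (3, "C", 5)]
print(scorer_summary(readings))
print(agreement_rate(readings))  # 2 of 3 papers within one point -> about 0.67
```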

  14. Overview of Standard Setting Process • 3 uniform solution samples selected • Definition of competence • Training undertaken • Practice session • Assign candidates to PASS/FAIL status based on comparison of total performance to the standard • Real rating • Report results Presented at the 2005 CLEAR Annual Conference September 15-17 Phoenix, Arizona
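
The last step of the process, comparing each candidate's total performance to the standard, is a simple threshold decision. The sketch below assumes numeric totals and a numeric cut score; the values and the function name `classify` are invented for illustration.

```python
# Illustrative pass/fail classification against a cut score set by the panel.

def classify(total_scores, cut_score):
    """Map candidate id -> 'PASS' if the total meets the standard, else 'FAIL'."""
    return {cand: ("PASS" if total >= cut_score else "FAIL")
            for cand, total in total_scores.items()}

# Hypothetical totals and cut score, for illustration only.
print(classify({"C001": 27.5, "C002": 31.0, "C003": 24.0}, cut_score=28.0))
# {'C001': 'FAIL', 'C002': 'PASS', 'C003': 'FAIL'}
```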

  15. Questions? Presented at the 2005 CLEAR Annual Conference September 15-17 Phoenix, Arizona
