
“Performance Reports for Failing Candidates”






Presentation Transcript


1. “Performance Reports for Failing Candidates” — Carol O’Byrne, Pharmacy Examining Board of Canada. Presented at the 2005 CLEAR Annual Conference, September 15-17, Phoenix, Arizona

2. What failing candidates want to know
• How close was I to passing?
• What did I do wrong? What did I miss?
• How many such errors and omissions lead to a failing result?
• In which area(s) do I need to improve?
• What does PEBC expect in these areas?
• Why am I expected to perform at a higher level than what I see some pharmacists doing?

3. PEBC rationale for providing feedback to candidates
• Supports PEBC’s mandate: to certify candidates who demonstrate that they have the knowledge, skills, abilities and attitudes required for practice
• Increases candidates’ awareness of practice requirements
• Supports the cooperative but arm’s-length relationship between credentialing bodies and training bodies

4. Rationale… Benefits all parties:
• Assists candidates to recognize and address their weaknesses
• Improves efficiency of PEBC processes and lessens the potential threat to exam security by reducing the number of retakes
• Benefits the profession and the public by supporting further development of the qualifications of those preparing to enter practice
• Addresses manpower needs – guides remediation and bridging efforts, facilitating earlier entry into the profession for those who may not yet have received adequate training

5. Why only to failing candidates?
• No demand from passing candidates
• Resource issues with the issuance of reports
• Failing candidates often retake the exam without appropriate preparation and clog the system

6. PEBC Qualifying Examination
• Based on national competencies and standards
• Offered in English and French
• PEBC certification required for licensure in 9 of 10 provinces
• Mobility enabled by mutual recognition (if PEBC certified)
• Part I (MCQ) – 200 scored items
• Part II (OSCE) – 15 scored stations
  • 12 SP/HP interactions + 3 non-client stations
  • 7 minutes/station
  • 1 assessor/station
  • 2 sets of scores/station
    • Analytical checklist
    • Holistic scales

7. Competencies assessed

8. Test format
• Interactive client stations
  • Standardized patients
  • Standardized health professionals
• Non-client stations
  • Technical, e.g.:
    • Screening prescriptions for appropriateness
    • Checking dispensed prescriptions
  • Written short answer, e.g.:
    • Responding to drug information requests – evaluating and interpreting drug information from several / conflicting sources
    • Medication management – reviewing patient data and recommending therapeutic options, along with a rationale

9. Assessor scoring sheet – ratings
Three 4-point scales:
• Communications – generic scale
  • Rapport
  • Organization and flexibility (adaptive to the client/situation)
  • Verbal and nonverbal skills (including language proficiency)
• Outcome (problem solving) – station-specific scale
  • Based on critical checklist items
• Overall Performance – inclusive, global scale
  • Communications and outcome
  • Process quality and thoroughness (critical and noncritical items)
  • Accuracy (vs. misinformation)
  • Risk (occurrence, degree)

10. Assessor scoring sheet – checklist
• ‘Critical’ items
  • essential to solve the problem & meet station objective(s)
  • each linked to a competency assessed in the station
• ‘Noncritical’ items
  • represent good practice & contribute to effective outcome(s)
  • each linked to a competency
• Risk and misinformation
• Unique response (UR) – for scoring & QA purposes
• Comment boxes – to record evidence to support scores (used for QA purposes)

11. Scoring the examination
Analytical scores
• Each checklist item relates to one competency
• Competency sub-scores = percent of items related to each competency to which the candidate responds
• Frequency of risk and misinformation tabulated
Holistic scores
• Each scale 1 to 4 points
• 12 points per client station (Comm, Outc, Perf) x 12 stations
• 8 points per non-client station (Outc, Perf) x 3 stations
• Raw score = sum of all stations’ holistic scale scores
• Holistic cut score set for each scale in each station
• Cut score = sum of all stations’ holistic cut scores
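Both layers of scoring described on this slide are simple arithmetic. A minimal Python sketch, using an invented mini-checklist, invented station ratings and assumed per-station cut scores (the real PEBC items and cut values are not reproduced here):

```python
from collections import defaultdict

def competency_subscores(checklist, responses):
    """Analytical sub-score per competency: the percent of checklist
    items linked to that competency that the candidate performed."""
    hit, total = defaultdict(int), defaultdict(int)
    for item, competency in checklist.items():
        total[competency] += 1
        if item in responses:
            hit[competency] += 1
    return {c: 100.0 * hit[c] / total[c] for c in total}

def holistic_raw_score(station_ratings):
    """Sum the 1-4 holistic ratings across stations: client stations
    carry three scales (Comm, Outc, Perf = 12 possible points),
    non-client stations two (Outc, Perf = 8 possible points)."""
    return sum(sum(scales.values()) for scales in station_ratings)

def passes(station_ratings, station_cut_scores):
    """Exam-level cut score = sum of the per-station holistic cut scores."""
    return holistic_raw_score(station_ratings) >= sum(station_cut_scores)

# Illustrative data only: a tiny checklist and three stations.
checklist = {
    "asks about allergies": "Pharmaceutical care",
    "identifies drug interaction": "Pharmaceutical care",
    "explains dosing clearly": "Communications",
}
responses = {"asks about allergies", "explains dosing clearly"}

ratings = [
    {"Comm": 3, "Outc": 2, "Perf": 3},  # client station
    {"Comm": 4, "Outc": 3, "Perf": 3},  # client station
    {"Outc": 2, "Perf": 2},             # non-client station
]
cuts = [8, 8, 5]  # assumed per-station holistic cut scores

print(competency_subscores(checklist, responses))
# {'Pharmaceutical care': 50.0, 'Communications': 100.0}
print(holistic_raw_score(ratings), passes(ratings, cuts))  # 22 True
```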

12. Mean scores & alphas

Holistic scales     Coefficient   Stations / competencies
Communications      .83           12 stns – competency 4
Outcome             .66           15 stns – all competencies
Performance         .73           15 stns – all competencies

Analytical scores   n items   Coefficient
Pharm care          107       .80
Ethics              7         .43
Drug information    8         .24
Communications      14        .33
Drug distribution   8         .57
Management          8         .55
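The coefficients in this table are internal-consistency reliability estimates. A minimal sketch of Cronbach's alpha, the standard such estimate (whether PEBC used exactly this formula is an assumption), computed over a made-up candidates-by-items score matrix:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a candidates-by-items score matrix
    (a list of candidates, each a list of item scores)."""
    k = len(scores[0])  # number of items

    def variance(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(variance([cand[i] for cand in scores]) for i in range(k))
    total_var = variance([sum(cand) for cand in scores])
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Four candidates scored on three dichotomous checklist items (made up).
scores = [[1, 1, 1], [0, 0, 0], [1, 0, 1], [1, 1, 0]]
print(round(cronbach_alpha(scores), 2))  # 0.63
```

Low alphas for the shorter sub-scores (e.g. drug information, 8 items) are expected: with few items, the item-variance term dominates, which is exactly the point slide 13 makes.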

13. Factors affecting competency sub-score reliabilities
• Candidate variability (or lack thereof)
• Number and context of stations in which the competency was assessed
• Number of non-critical vs. critical items (importance of their performance to the task at hand)

14. Reports to candidates
• Results: pass/fail status (all candidates)
• Feedback (for failing candidates, on request):
  • Individual score breakdown
    • by major skill – mean Communication, Outcome and Performance ratings, aggregated across all stations
    • by competency – mean percent scores, aggregated across all stations in which the competency was assessed
    • by critical incident – frequency of risk, misinformation
  • Comparative data
    • ‘Reference group’ mean scores and frequencies for score comparison with a stable population
    • to show where performance needs to improve
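The comparative part of the report can be sketched as a side-by-side listing that flags competencies below the reference-group mean; the competency names and all numbers below are invented, not actual PEBC data:

```python
def feedback_lines(candidate, reference):
    """Flag each competency where the candidate's mean percent score
    falls below the reference-group mean."""
    lines = []
    for competency, ref_mean in reference.items():
        score = candidate[competency]
        flag = "  <-- needs improvement" if score < ref_mean else ""
        lines.append(f"{competency:<20} you: {score:5.1f}%  ref: {ref_mean:5.1f}%{flag}")
    return lines

# Illustrative scores only.
candidate = {"Pharmaceutical care": 54.0, "Communications": 48.5, "Drug information": 70.0}
reference = {"Pharmaceutical care": 68.0, "Communications": 66.0, "Drug information": 64.0}

for line in feedback_lines(candidate, reference):
    print(line)
```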

15. Assessor scoring sheet

16. OSCE feedback report

17. Candidate findings
• Most candidates understand the information provided but want more guidance (content information – where they went wrong)
• Some do not accept the exam results and feedback information – may request hand-scoring
• Failing candidates generally score low in Communications (rating scale and competency 4) and/or Pharmaceutical Care (competency 1 – clinical role)
• Many failing candidates lack clinical training in Canada (or the US), though many have some technical training/experience (as a pharmacy technician)

18. Are we really helping candidates?
• Anecdotally, yes – some do not know where to start or what to focus on
• Skills scores and competency sub-scores are consistent enough to be meaningful in areas that are weighted more heavily
• All candidates who fail show weaknesses in one or more of these areas (low scores relative to the reference group)

19. What questions do (can) we answer?
• What area(s) do I need to improve?
• What does PEBC expect in these areas?
• What did I do wrong? What did I miss?
• Why am I expected to perform at a higher level than what I see some pharmacists doing?
• How many errors and omissions lead to a failing result?
• How close was I to passing?

20. What other strategies are (or may be) helpful?
• Provide information about training and/or remedial resources, e.g.:
  • Clear expressions, including visual exemplars, of good practice in each competency area
  • Recognized training programs and resources
  • Practice exams (e.g. ‘mock OSCEs’) for format familiarization
• Provide general tips, e.g.:
  • Typical performance errors/deficits in each competency
  • Competency-related descriptions of candidates who are clearly qualified, borderline qualified and unqualified

21. Contact information
Carol O’Byrne
Pharmacy Examining Board of Canada
415 Yonge Street, Suite 601
Toronto, ON M5B
T: 416-979-2431, ext. 226
Email: obyrnec@pebc.ca
Website: www.pebc.ca
