“Improving Communication Assessment”

Presentation Transcript


1. “Improving Communication Assessment” John Pugsley and Carol O’Byrne, The Pharmacy Examining Board of Canada (PEBC)

2. Session Objectives
• Discuss the importance of assessing oral Communication
• Explain how PEBC assesses Communication
  • to infer, from ratings of candidates’ communication, that candidates have (or do not have) the communication ability needed to positively influence client outcomes
• Discuss problems encountered
• Involve you in assessing Communication
• Review Communication assessment research
• Discuss future plans and options to improve Communication assessment

3. Why is Oral Communication Important?
• In general, client outcomes are directly related to the effectiveness of communication
• In pharmacy, most of the communication with patients and other health professionals is oral
• The pharmacist communicates with patients (and other health professionals) in order to:
  • ensure that medications to be used by a patient are appropriate for that patient
  • enable patients to use their medications safely and effectively, to achieve optimal health outcomes

4. Communication in Pharmacy Practice
To impact positively on a patient’s optimal health and well-being, a pharmacist has a responsibility to:
• Establish an effective rapport with the patient
• Develop mutual trust and confidence – with the patient, caregivers and other professionals
• Consult with the patient to determine the patient’s needs, limitations and values
• Collaborate with the patient and other health professionals to determine the optimal treatment plan
• Educate and support the patient:
  • to use medications and other therapies safely and effectively
  • to adopt healthy lifestyles
  • to self-monitor the effectiveness of therapy and adverse effects
  • to obtain further help if needed

5. How does Poor Communication Impact Patients/Clients?
• Failed communication between pharmacists and patients may lead to unsafe medication use
• Unclear messaging between pharmacists and other health care providers may lead to errors in medication therapy management

6. Why Examine Oral Communication?
• To observe and evaluate candidates’ Communication skills
  • in a standardized manner (interactions, scoring)
  • in selected contexts (settings, clients, challenges)
• To make valid inferences…
  • If certification/licensure exams test competencies that are critical, including Communication, and
  • if scores are reliable and generalizable and pass/fail decisions are dependable,
  • then we can infer that successful candidates have what is needed to achieve safe and effective practice outcomes

7. WHAT IS AN OSCE?
Objective Structured Clinical Examination
• Simulations of common, critical professional tasks
• Series of stations through which all examinees rotate
  • examinees perform professional tasks
  • behaviours are observed & evaluated
• Stations / tasks involve interactions with standardized clients/patients/health professionals
  • individuals trained to portray an important interpersonal situation consistently and repeatedly

8. The PEBC OSCE
Based on national competencies (NAPRA):
• Pharmaceutical care (29%)
• Ethical, legal, professional issues (9%)
• Drug information (5%)
• Communication (43%)
• Drug distribution (9%)
• Practice management (5%)
Fifteen 7-minute “stations” – common/critical situations:
• 10 patient encounters
• 2 health professional interactions
• 3 non-client interactions
• 1 pharmacist assessor per station
Administered twice yearly:
• 650+ Canadian grads
• 900+ foreign-trained pharmacists
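As a quick sanity check on the numbers above, here is a minimal sketch (the dictionary layout is illustrative only, not PEBC's internal format) confirming that the six competency weights total 100% and the station mix totals fifteen stations:

```python
# Illustrative only: the blueprint weights and station mix quoted on this slide.
blueprint = {
    "Pharmaceutical care": 29,
    "Ethical, legal, professional issues": 9,
    "Drug information": 5,
    "Communication": 43,
    "Drug distribution": 9,
    "Practice management": 5,
}
stations = {
    "patient encounters": 10,
    "health professional interactions": 2,
    "non-client interactions": 3,
}

assert sum(blueprint.values()) == 100   # competency weights cover the whole exam
assert sum(stations.values()) == 15     # fifteen 7-minute stations per administration
```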

9. [Image-only slide; no transcript text]

10. Candidates’ Scores
Scored with 3 holistic scales, 4 anchors each:
Unacceptable / Marginally Unacceptable / Marginally Acceptable / Acceptable
• Communication – same scale for all interactive stations
• Outcome – global scale guided by case-specific critical checklist items (required to solve the station problem)
• Performance – global scale reflecting:
  • Communication
  • Outcome
  • Thoroughness (non-critical items)
  • Accuracy of information (misinformation)
  • Risk to patient (if not performed adequately)
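To make the scale concrete, here is a minimal sketch of how one assessor's three holistic ratings for a station might be recorded, assuming the 1-to-4 numeric mapping implied by the “(2/4)” ratings quoted on later slides. The class and field names are illustrative assumptions, not PEBC's scoring software.

```python
from dataclasses import dataclass

# The four anchors of each holistic scale, mapped to the 1-4 points
# used in the candidate examples later in the deck (e.g. 2/4).
ANCHORS = {
    1: "Unacceptable",
    2: "Marginally Unacceptable",
    3: "Marginally Acceptable",
    4: "Acceptable",
}

@dataclass
class StationRating:
    """One assessor's ratings for one candidate at one interactive station (illustrative)."""
    communication: int   # same scale across all interactive stations
    outcome: int         # global scale guided by case-specific critical checklist items
    performance: int     # global scale reflecting communication, outcome, thoroughness, accuracy, risk
    comments: str = ""   # supportive written comments, especially for low ratings

    def __post_init__(self) -> None:
        for value in (self.communication, self.outcome, self.performance):
            if value not in ANCHORS:
                raise ValueError(f"ratings must be on the 1-4 scale, got {value}")

r = StationRating(communication=2, outcome=3, performance=3,
                  comments="not client-centred; very disorganized")
print(ANCHORS[r.communication])   # -> Marginally Unacceptable
```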

11. Rating Communication – 4 Anchored Subscales
• Client centredness, e.g. sets client at ease, establishes rapport and trust; shows genuine interest in client’s concern; listens and makes responses specific to client; involves client in discussion/decision making
• Process, e.g. stays on track, coherent, makes smooth transitions; flexible, adjusts to client’s input, etc.
• Non-verbal skills, e.g. no distractions; appropriate body posture (faces SP/MD)
• Verbal skills, e.g. uses suitable vocabulary, grammar, pronunciation, volume, pace, and tone; expresses ideas clearly, in lay terms

12. Supportive Written Comments
Rating: Unacceptable
Comments: relevant to three or four factors
• Client-centredness: not client-centred – should respond more specifically to client
• Process: very disorganized
• Non-verbal: looked at books while speaking
• Verbal: didn’t use lay terms – said “prophylaxis”, other medical words; read out of brochure; choppy, stopped speaking mid-sentence

13. PEBC Prior Research – Scoring
On a 15-station OSCE, across all scales:
• Negligible error due to pharmacist-assessors, using either holistic or analytical (checklist) scoring schemes
• Holistic ratings are more consistent than checklist ratings
• One pharmacist-assessor per station yielded consistent, generalizable, dependable holistic scores (alpha, G and D ~ 0.9)
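The alpha cited above is an internal-consistency coefficient computed over candidates' station scores. Below is a minimal sketch of that calculation using made-up ratings rather than PEBC data; the G and D coefficients come from a generalizability analysis and are not shown here.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a candidates x stations matrix of scores."""
    n_stations = scores.shape[1]
    station_vars = scores.var(axis=0, ddof=1)      # variance of each station's scores
    total_var = scores.sum(axis=1).var(ddof=1)     # variance of candidates' total scores
    return (n_stations / (n_stations - 1)) * (1 - station_vars.sum() / total_var)

# Illustrative only: 6 hypothetical candidates x 15 stations, rated 1-4;
# with random data the resulting value is meaningless, it just shows the mechanics.
rng = np.random.default_rng(0)
scores = rng.integers(1, 5, size=(6, 15)).astype(float)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```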

14. PEBC Prior Research – Scoring
• Rating consistency was higher between two pharmacist-assessors than between a pharmacist-assessor and the SP simulator when rating Communication
• SP simulators should not replace pharmacist-assessors for rating candidates’ performance (including Communication)
• Consistency of rating scales: Communication showed the lowest consistency

15. Ratings & Comments – Candidate A
Assessor 1 Rating: ‘Unacceptable Marginal’ (2/4)
• Comment: “Candidate did not understand the sequence of events, suggesting solving the problem by calling the doctor; minimized the seriousness of drug allergy.”
Assessor 2 Rating: ‘Unacceptable Marginal’ (2/4)
• Comment: “Quote: Did the doctor prescribe this for the rash? This kind of drugs can make XXX allergy. You have sensitivity/allergic rashes...”
SP (simulator) Rating: ‘Unacceptable Marginal’ (2/4)
• Comment: “Did not listen very well and told me XXX was prescribed for YYY. Smiled a lot when talking about symptoms (inappropriate).”

16. Ratings & Comments – Candidate B
SP Rating: ‘Unacceptable Marginal’ (2/4)
• Comment: “I was not sure what I was supposed to do, when to see the doctor, whether or not to continue the medication or when I should be concerned.”
Assessor Rating: ‘Unacceptable Marginal’ (2/4)
• Comment: “…was chewing gum when talking to patient and rocking the chair.”

17. Ratings & Comments – Candidate C
SPR Rating: ‘Unacceptable Marginal’ (2/4)
• Comment: “Asked many questions, rambled, cut patient off, shifted in seat throughout interaction.”
PA Rating: ‘Acceptable Marginal’ (3/4)
• Comment: “Rushed, hard to follow his recommendations.”

18. PEBC’s Communication Assessment Challenges
Communication assessment tools:
• May lack clarity – communication behaviours are not easy to describe
• Descriptors are ‘indicators’ – they do not capture the whole construct
• Training tool differs from the exam tool
• Multi-factorial – difficult to recognize and assimilate into one rating (‘connoisseurship’)

19. PEBC’s Communication Assessment Challenges
Assessors and raters differ in their ability to:
• Interpret and apply the assessment criteria, e.g. an Acceptable rating with the comment “Spoke very quickly; SP could not get a word in”
• Ignore non-relevant behaviours, e.g. “Seemed nervous”
• Evaluate a multi-dimensional construct (processes & impact)
• Separate Communication (process and impact) from Outcome (station-specific content and outcome)

20. How Does PEBC Assess Communication?
Tools:
• Communication scoring sheet – factor scales with descriptors, summative scale
• Station scoring sheet – checklist, rating scales, comments
Orientation and Training:
• Assessor web site: videos, station scoring sheet, scoring key
• Pre-exam orientation & exam day: PPT, DVD for practice, Communication and Station scoring sheets, feedback
Exam Administration:
• Station scoring sheet only

21. Let’s try it!
• Brief orientation and training
• Review Communication rating criteria
• Score DVD performances
• Compare & deliberate results (each table)
  • Communication factor ratings
  • Communication overall rating
  • Similarities and differences (report)
• PEBC scoring key and rationale

22. Assessor Orientation & Training
• Review Communication rating guidelines (handout)
• Review case (handout)
• Watch performance #1 (DVD)
• Complete station checklist (station scoring sheet)
• Complete Communication rating form
• Assign an overall Communication rating (rating form)
  • Consider all four factors to be equivalent, unless the station requires a particular Communication skill or poses a particular challenge
  • If the rating is Unacceptable/Marginal or Unacceptable, write the reason(s) in the Comments box

23. Rating Communication – 4 Anchored Subscales (rating criteria repeated from slide 11: Client centredness, Process, Non-verbal skills, Verbal skills)

24. Client Centredness: attends/responds to client’s needs/feelings/concerns, with a professional manner that treats clients respectfully
Acceptable behaviours, e.g.:
• acknowledges/greets client in a timely/professional manner
• focuses on client’s specific concerns
• encourages questions, checks client’s understanding
• makes responses specific to client
• invites client’s responses in discussion / decision making
• acknowledges/validates client’s feelings & needs (empathy)
• is non-judgemental, unbiased / sensitive to cultural differences
Unacceptable behaviours, e.g.:
• does not greet client in a timely/professional manner
• does not focus on client’s specific concerns
• does not invite questions; generalizes, monopolizes, lectures
• does not involve patient in discussion/decision making
• ignores / does not validate client’s feelings and needs; is glib
• is judgemental, biased / insensitive to cultural differences

25. Candidate #1
• Score independently
• Fill in the station checklist as you watch the DVD
• Rate each factor on the Communication scoring sheet afterward
• Based on these ratings, assign an overall Communication rating

26. Candidate #2
• Score independently
• Complete the station checklist
• Rate Communication – lower left box (without completing the Communication rating form)
• If the rating is low (UM or U), write comments on:
  • what you observed
  • possible impact on the patient

27. Deliberate…
• Communication ratings for Candidate #1
  • Factor ratings
  • Overall Communication rating
  • Comments
• Communication ratings for Candidate #2
  • Overall Communication rating
  • Comments
• Value of the Communication rating form?

28. What is your experience?
• What thoughts do you have about communication assessment and using these tools as a result of doing these exercises?
• How important is it to assess communication in your profession? If you are assessing communication, how are you doing it?

29. Our Ongoing Challenges
Defining Communication:
• Break factor 2 out into pharmacists’ communication tasks? e.g. interview patients, counsel on medication use, promote healthy lifestyle, etc.
• Add ‘setting the stage’ – ‘concordance’?
  • developing a shared understanding of the patient’s goals, the value of the medication, and how to implement therapy (including monitoring and lifestyle changes if appropriate)

30. 2007-2008 Research Questions
Can we improve Communication rating consistency by:
• clarifying the criteria?
• separating communication and content assessment?
• using separate tools?
• using other raters? SPs? SP raters?
• enhancing training?
  • communication and what it looks like
  • more practice with feedback
• other?

31. 2007-2008 Research

32. Fall 07 Results
Raters (observers):
• Mean ratings: PA (3.05); RPA (3.42); SPR (3.31)
• PA1 ratings were generally the lowest
• PA2 and SPR ratings were generally higher than PA1 ratings
• When rounded to the nearest scale point, they were the same (in 10 of 12 comparisons and overall)
Inter-rater consistency (Communication rating) between:
• PAs & SPRs: low in 3 of 4 stations (0.04 to 0.82)
• SPs & SPRs: generally higher (0.41 to 0.80)
• PAs & PAs: highest (0.54 to 0.77)
More than one pharmacist-assessor would be required to achieve consistency in Communication ratings on par with that of Outcome and Performance.
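The pairwise figures above are consistency coefficients between rater types. As a minimal sketch, one common way to compute such a coefficient is a Pearson correlation over the candidates that both raters scored; the exact statistic PEBC used is not specified on the slide, and the ratings below are invented for illustration.

```python
import numpy as np

def interrater_consistency(ratings_a, ratings_b) -> float:
    """Pearson correlation between two raters' scores for the same candidates."""
    a = np.asarray(ratings_a, dtype=float)
    b = np.asarray(ratings_b, dtype=float)
    return float(np.corrcoef(a, b)[0, 1])

# Hypothetical 1-4 Communication ratings for eight candidates at one station
pa  = [3, 2, 4, 3, 1, 2, 4, 3]   # pharmacist-assessor
spr = [3, 3, 4, 2, 2, 2, 4, 4]   # standardized-patient rater
print(f"PA vs SPR consistency: {interrater_consistency(pa, spr):.2f}")
```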

33. Rater Considerations
Rater training and experience:
• Assessors and RAs had previous experience rating Communication; SPs and SPRs did not
• Some SPs and SPRs were involved in teaching Communication – different perspectives?
• Pharmacists are not formally trained in Communication
Focus and complexity of exam tasks:
• SPs are active participants; others are observers
• Assessors completing the station checklist and all ratings:
  • may confound Communication and content
  • may miss some visual cues

34. Spring 2008 Study – Refinements
• Rating criteria
  • Clarified, describing observable behaviours
  • Factors more clearly distinguishable
• Assessor and rater selection
  • All had prior experience, except for one SPR and a few SPs (simulators)
  • Most SPRs were the same as in Fall 07
• Training tools – criterion wording
• Assessor and rater training
  • Participants oriented to study purpose and method
  • New DVDs, more discussion / reflection
  • Examples of comments reflecting performance in each factor
  • Increased emphasis on comments – observations and impact

35. Preliminary Spring 08 Findings
Assessor/Rater/Coordinator feedback:
• Much smoother than Fall 07 (training and exam day)
• Prior rating experience gave SPs and SPRs confidence
• Training enhancements & new resources were useful to experienced assessors and raters
• Rating criteria were clearer
Comments:
• More comments were documented
• Comments were more specifically related to factors in the scale and impact on the patient

36. Research Comments
• This study was based on small sample sizes (34 to 74 candidates).
• Further studies of non-pharmacists’ ability to rate candidates’ Communication are needed before non-pharmacist raters can take the place of pharmacist-assessors in rating Communication.

37. Recommendations
The consistency and reliability of Communication ratings should be improved further:
• Improve instrument design & rater training
  • clarify rating criteria; relate to Canadian pharmacy practice
  • more practice and feedback
• Consider having raters complete a Communication checklist (to inform the rating) and a Communication rating
• Consider combining SP and Assessor ratings (SPs’ ratings may be more independent)

38. Recommendations
Training strategies might include:
• Continue to focus on standardizing the trainers (who train raters)
• Involve communication experts
• More benchmarking exemplars and exercises
• Refined training materials for assessor/rater training
• Additional rater training
  • more practice – training sessions, online
  • reflection – documentation of rationale for all ratings
  • feedback – compare with rating key and rationale
• Online Communication training (assessors, raters and candidates?)

39. References
• Quero-Munoz L, O’Byrne C, Pugsley J, Austin Z (PEBC). Reliability, validity and generalizability of an objective structured clinical examination (OSCE) for assessment of entry-to-practice in pharmacy. Pharmacy Education. 2005;5(1):33-43.
• Hodges B, et al. Analytic global OSCE ratings are sensitive to level of training. Medical Education. 2003;37:1012-1016.
• Humphrey-Murto S, Smee S, Touchie C, Wood TJ, Blackmore DE. A comparison of physician examiners and trained assessors in a high-stakes OSCE setting. Academic Medicine. 2005;80(10 Suppl):S59-S62.
• Makoul G, Curry R. The value of assessing and addressing communication skills. JAMA. 2007;298(9):1057-1059.
• Schneider B. Clarity in context: rethinking misunderstanding. Technical Communication. 2002;49(2).
• Tamblyn R, et al. Physician scores on a national clinical skills examination as predictors of complaints to medical regulatory authorities. JAMA. 2007;298:993-1001.

40. Speaker Contact Information
Dr. John Pugsley, Registrar-Treasurer
Carol O’Byrne, Manager, PEBC QE-II (OSCE)
Pharmacy Examining Board of Canada
717 Church Street, Toronto ON M4W 2M4
Email: obyrnec@pebc.ca
Tel: 416-979-2431, ext. 226
Web site: www.pebc.ca
CLEAR 2008 Annual Conference, Anchorage, Alaska
