
360 Degree Evaluation


Presentation Transcript


  1. 360 Degree Evaluation Craig McClure, MD May 15, 2003 Educational Outcomes Service Group

  2. Description • Use of rating forms to report the frequency of observed behaviors • Multiple people in contact with the resident act as evaluators • Often a survey-type form • Ratings summarized by topic (see the sketch below) • Includes goal-setting
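
The slides do not specify a scoring scheme, so the following is only a minimal sketch of how ratings from multiple raters might be rolled up by topic. The rater roles, topics, and the 1-5 frequency scale are assumptions invented for illustration, not details from the presentation.

```python
# Hypothetical sketch: summarizing 360-degree ratings by topic.
# Rater roles, topic names, and the 1-5 scale are illustrative assumptions.
from collections import defaultdict
from statistics import mean

# Each rating: (rater role, topic, score on an assumed 1-5 frequency scale)
ratings = [
    ("nurse", "communication", 4),
    ("nurse", "professionalism", 5),
    ("faculty", "communication", 3),
    ("clerical staff", "professionalism", 4),
]

# Group scores by topic, discarding rater identity in the summary.
by_topic = defaultdict(list)
for role, topic, score in ratings:
    by_topic[topic].append(score)

# Report each topic as a mean score and rater count, the kind of
# roll-up a program might feed back to the resident.
for topic, scores in sorted(by_topic.items()):
    print(f"{topic}: mean {mean(scores):.1f} from {len(scores)} raters")
```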

  3. Background • Originated in human resources practice in business • ACGME found no published reports of its use in GME

  4. Use for “Soft” Areas • More accurate for formative than summative feedback • Interpersonal & communication skills • Professional behavior • Limited use for: • Patient care • Systems-based practice

  5. Decision to Utilize • Will it be accepted and used by residents, faculty, and staff? • Develop or purchase? • Cost? • Who are the raters? • How will the tool be used?

  6. Decision to Utilize (2) • To whom is the information available? • What core competencies will be evaluated with this tool? • How will trust be nurtured so that the process remains confidential? • What platform will the evaluation use?

  7. Acceptance • Will all potential evaluators fully participate? • Will raters be fair & honest? • Will residents accept the feedback from non-faculty?

  8. Develop or Purchase • Development permits tailoring • Development time may be considerable • Purchasing gives a ready-made product • Purchased tools are typically computer-based

  9. Developing • Requires an expert in educational testing • Programming expertise • A pilot period

  10. Purchase • Are the items measured appropriate? • Does it perform as claimed? • What is the inter-rater reliability? • What degree of support and ability to customize is offered?

  11. Cost • If purchasing, monetary cost • If developing, personnel support • Data management system • Personnel time to complete forms • Annual development plan

  12. Cost (2) • Addressing EEOC/grievance complaints • Handling disputes over data • Can be divisive & counterproductive for those resistant to the process

  13. Personnel Evaluation Time • 5 to 10 nurse evaluators per resident are needed to give reproducible results (see the illustration below) • More are needed for faculty raters • More still for patient raters
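
The slide gives the rater counts without explaining them. As a purely illustrative aside, the sketch below applies the standard Spearman-Brown prophecy formula (a general psychometric result, not something given in the presentation) with an assumed single-rater reliability of 0.30 to show why averaging over several raters makes scores more reproducible.

```python
# Illustrative only: the Spearman-Brown prophecy formula shows how the
# reliability of an averaged score grows with the number of raters.
# The single-rater reliability of 0.30 is an assumed example value,
# not a figure from the presentation.

def composite_reliability(single_rater_reliability: float, n_raters: int) -> float:
    """Reliability of the mean of n parallel ratings (Spearman-Brown)."""
    r = single_rater_reliability
    return n_raters * r / (1 + (n_raters - 1) * r)

single_r = 0.30  # assumed reliability of one rater's scores
for n in (1, 5, 10, 20):
    print(f"{n:>2} raters -> reliability of the averaged score: "
          f"{composite_reliability(single_r, n):.2f}")
```

With the assumed r = 0.30, the formula gives roughly 0.68 at 5 raters and 0.81 at 10, which is consistent in spirit with the 5-to-10 range the slide cites for nurse evaluators.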

  14. Identify Raters • Patients (how will the process be explained to them?) • Nursing staff • Clerical staff members • Physician faculty members • Non-physician faculty members • Residents

  15. Identify Raters (2) • Medical students • Allied Health Personnel • Self-assessment

  16. Patients as Raters • Literacy • Language • Culture (medical and otherwise) • Personality

  17. Intended Utility • Intervals: monthly, quarterly, yearly • Summative versus formative • To support high-stakes decisions?

  18. Access to Information • Resident • Advisor • Program Director

  19. Confidentiality & Trust • Raters require anonymity • Residents require confidentiality • Both need the process to be positive & constructive • Prior history shapes expectations • Education about the process aids current participation

  20. Platform of Evaluation • PDA • Paper • Computer

  21. Challenges • Securing appropriate instruments for a variety of evaluators • Managing data successfully

  22. Advantages • Electronic database for documentation • Ease of access for raters • Rapid turnaround for feedback • “Gap” analysis: self-perception versus how others perceive the resident (sketched below)
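
As a minimal sketch of the “gap” analysis mentioned above, the example below compares assumed self-ratings against the mean of other raters' scores for each competency. The competency names, scores, and 1-5 scale are invented for illustration and are not part of the presentation.

```python
# Hypothetical "gap" analysis: self-perception versus the averaged
# perception of other raters. Competency names, scores, and the
# 1-5 scale are assumptions made for this example.
from statistics import mean

self_ratings = {"communication": 4.5, "professionalism": 4.0}
others_ratings = {
    "communication": [3.5, 4.0, 3.0],   # e.g. nurse, faculty, peer
    "professionalism": [4.5, 4.0, 4.5],
}

for competency, self_score in self_ratings.items():
    others_mean = mean(others_ratings[competency])
    gap = self_score - others_mean
    # A positive gap means the resident rates themselves higher than others do.
    print(f"{competency}: self {self_score:.1f}, "
          f"others {others_mean:.1f}, gap {gap:+.1f}")
```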

  23. Disadvantages • Hardware/software costs • Lack of validation in GME • Potential information overload • Selection bias • Discoverability • Potential for invalid feedback

  24. References • Hobgood CC, et al. Assessment of Communication and Interpersonal Skills Competencies. Academic Emergency Medicine 2002;9:1257-69. • ACGME/ABMS Joint Initiative Toolbox of Assessment Methods, September 2000.

  25. References (2) • Rodgers KG, et al. 360-degree Feedback. Academic Emergency Medicine 2002;9:1300-1304. • Letter from ADFM listserv, Goldsmith to Kikano.
