Now That We Have an Institutional Report Card---How Do We Use It?


Presentation Transcript


  1. Now That We Have an Institutional Report Card---How Do We Use It? Conference Session: SES010 2012 ACGME Annual Education Conference Ann Dohn, MA, DIO, and Nancy Piro, PhD, Program Manager/Education Specialist Department of Graduate Medical Education Stanford Hospital & Clinics

  2. Disclosure • No conflicts of interest to report

  3. Session Objectives At the end of this session, participants will be able to: 1) Effectively use objective data in balanced report cards to make institutional and programmatic decisions. 2) Understand strategies for gaining institutional administration (C-Suite) support for DIO/GMEC decisions based on balanced report cards. 3) Predict future ACGME review outcomes based on empirical data from balanced report cards.

  4. Setting the Stage • How does GME administration function in 2012? • What are the barriers? • What are the challenges? • How can we be heard by the C-Suite? • What language should we speak? • How do we optimize success?

  5. How does GME Administration Function? • How do we know? • What do we measure? • What management strategies do we use?

  6. Historically… • GME has not used objective data in making many decisions • Instead, decisions were often made based on: • Politics • Financial considerations • Special deals • Hunches • Ability to scream loudly and frequently • Fear

  7. But what is going on in the environment? • Evidence-based practice • Balanced scorecards • Increased accountability for resources • Transparency

  8. Fitting GME within the overall business culture • Remember… your “C-Suite” colleagues often have that magical MBA… • They think differently • They use a different language

  9. Fitting GME within the overall business culture • We need to understand how the C-Suite thinks • AND learn to think the same way… AND speak their language… To gain their support!!!

  10. What Considerations does the C-Suite have with Respect to Data? • Data must have: • An Organizational focus • Regular Means of Data Collection • Include External Assessments as well as Internal Assessments • Data used must be • Standardized (comparable) / lean • Capable of transformation • Aligned with mission statement

  11. DIOs need to be able to make decisions based on comparative data • Data must: • Be organizationally focused • Have regular means of data collection • Include external as well as internal assessments • Be standardized • Be capable of transformation • Be aligned with the mission statement

  12. Conclusion • CEOs and DIOs have the same requirements with respect to data

  13. But what models for data do we have? • Few data models exist for GME • The concept of Institutional Accountability is relatively new. • Until the ACGME Outcome Project, there was no centralized curriculum oversight in GME, unlike medical schools / UME.

  14. The “Report Card” Vision • In 2005, Stanford hired its first PhD in GME • The vision was to develop tools to support evidence-based decision-making for Graduate Medical Education, consistent with our mission • “We needed a Report Card”…

  15. SIGH… • It wasn’t as easy as first thought!

  16. Our First Attempt …

  17. Background on Institutional Report Cards • Government and Industry Models • Multiple models exist and can be chosen to fit a specific purpose: • GPRA (Government Performance and Results Act) • Organizational Report Cards • Balanced Scorecard • Benchmarking • Program Evaluations • Social Indicators • No Child Left Behind • CBEST • College Performance Testing

  18. Which Model Should We Use? • We needed a model that: • was organizationally focused and managed • had a track record of effective use • fit our existing structure with multiple programs and organizations • was flexible enough to be used on an annual basis (regular data collection), not just once per accreditation cycle • was “easily digestible” • had internal and external measurement dimensions

  19. Our Choice • The Balanced Scorecard framework in an organizational and/or report card tool • The best of both worlds

  20. The Balanced Scorecard Approach • The Balanced Scorecard is a performance measurement and performance management system developed by Robert Kaplan and David Norton (1992, 1996) • It has been adopted by a wide range of leading-edge organizations, both public and private. (“The Balanced Scorecard: Measures That Drive Performance,” Harvard Business Review, Jan–Feb 1992; The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press, 1996)

  21. Stanford’s Report Card – Three Programs

  22. Key to the Report Card

  23. Balanced Scorecard Strategic Perspectives (centered on Mission / Vision / Strategy): • Resident – How do our residents see us? • Program Processes – Are our programs excelling? • Institutional / Financial Growth – Are we putting our resources in the right places? • Learning – Do we continue to improve (outcomes)?

  24. Selection of Balanced Report Card Measures • GME Internal HS Survey: Recommendation of Program; Overall Satisfaction • Resident Evaluation of Program • Match Depth • ACGME Survey Responses: Teaching; Supervision; Scholarship; Non-Hostile/Intimidating Environment; Address Concerns Confidentially • ACGME Cycle Length • # ACGME Citations • Board Pass Rates • Career Placement • Training Exams • Grants
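As an illustration of how such a measure set might be organized for analysis, here is a minimal Python sketch; the grouping and field names are assumptions for this example, not Stanford's actual schema:

```python
# Hypothetical grouping of the report card measures by data source.
# Measure names follow the slide; the structure itself is illustrative.
REPORT_CARD_MEASURES = {
    "GME Internal HS Survey": ["Recommendation of Program", "Overall Satisfaction"],
    "Resident Evaluation": ["Resident Evaluation of Program"],
    "Recruitment": ["Match Depth"],
    "ACGME Survey": [
        "Teaching", "Supervision", "Scholarship",
        "Non-Hostile/Intimidating Environment", "Address Concerns Confidentially",
    ],
    "ACGME Accreditation": ["Cycle Length", "# Citations"],
    "Outcomes": ["Board Pass Rates", "Career Placement", "Training Exams", "Grants"],
}

# One scorecard row per program: measure name -> value (filled in each year).
program_scorecard = {m: None for group in REPORT_CARD_MEASURES.values() for m in group}
```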

  25. Balanced Scorecard “Report Cards” • Had: an organizational focus; regular means of data collection; external as well as internal assessments • Were: standardized; capable of transformation; aligned with the mission of Stanford Graduate Medical Education (GME)

  26. Stanford Hospital & Clinics Use of the “Balanced Report Card” • The Stanford Report Card is built on the Balanced Scorecard conceptual framework for translating an organization’s vision into a set of performance indicators distributed among four perspectives adapted for GME: • Resident Perception Measurements • Program Processes • Learning Outcomes • Financial/Growth

  27. Stanford Background Stanford University Medical Center currently sponsors 85 ACGME-accredited training programs with over 1100 enrolled residents and fellows.

  28. Stanford University Medical Center Mission • Dedication to pursuing the highest quality of patient care and graduate medical education, recognizing that one of its major responsibilities is the provision of organized educational programs. • Support of quality graduate medical education programs and excellence in residency training and research. • Guidance and supervision of the resident while facilitating the resident’s professional and personal development and ensuring safe and appropriate care for patients. • Provision of adequate funding of graduate medical education to ensure support of its faculty, residents, ancillary staff, facilities, and educational resources to achieve this important mission. • Ensuring that all of its graduate medical education programs meet or exceed the Institutional and Program Requirements promulgated by the Accreditation Council for Graduate Medical Education.

  29. Translating the Stanford Hospital & Clinics’ GME Mission to the Balanced Report Card • 1. Resident Perception Measurements: “Guidance and supervision of the resident while facilitating the resident’s professional and personal development and ensuring safe and appropriate care for patients.” • 2. Program Processes: “Ensuring that all of its graduate medical education programs meet or exceed the Institutional and Program Requirements promulgated by the Accreditation Council for Graduate Medical Education.”

  30. Stanford Hospital & Clinics Report Card • 3. Learning Outcomes: “Support of quality graduate medical education programs and excellence in residency training and research.” • 4. Financial/Growth: “Provision of adequate funding of graduate medical education to ensure support of its faculty, residents, ancillary staff, facilities, and educational resources to achieve this important mission.”

  31. Stanford Hospital & Clinics Balanced Report Card • Indicators are designed to measure SHC’s progress toward achieving its vision; other indicators are designed to measure the long-term drivers of success. • Through the balanced report card, SHC: • Monitors its current performance (finances, resident satisfaction, learning outcomes, and program process results) • Monitors its efforts to improve processes and educate residents • Enhances its ability to grow, learn, and improve the quality of its fellowship and residency educational programs.

  32. Why GME Thinks We Need This • ACGME and Institutions are increasingly holding DIOs and GME Committees accountable for their utilization of institutional resources. • Actions / decisions must be based on documented real-time analyses of needs.

  33. Selection of Balanced Report Card Measures • GME Internal HS Survey: Recommendation of Program; Overall Satisfaction • Resident Evaluation of Program • Match Depth • ACGME Survey Responses: Teaching; Supervision; Scholarship; Non-Hostile/Intimidating Environment; Address Concerns Confidentially • ACGME Cycle Length • # ACGME Citations • Board Pass Rates • Career Placement • Training Exams • Grants

  34. Findings • After several years of data collection… • Some data is more valuable • Some data drives change

  35. VOICE OF THE RESIDENT • Internal Measures: Resident Overall Evaluation of the Program; Duty Hour Violations; GME/HS Survey – Overall Satisfaction; GME/HS Survey – Recommend Program • External Measures: Match Depth (% Top Medical Schools); ACGME Survey – % Compliant Responses (Teaching; Supervision; Scholarship; Non-Hostile/Intimidating Environment; Address Concerns Confidentially)

  36. PROGRAM PROCESSES • Internal Measures: Faculty Overall Evaluation of the Program; # Internal Citations • External Measures: ACGME Cycle Length; # ACGME Citations

  37. Case Study - Stanford • How the DIO uses the Balanced Report Card: • A Tale of Three Programs… • Starting from the bottom… • Assuming that being at the “bottom” may mean “needs more help”

  38. How Do We Use this Data? • Look at indicators that are resident-driven – the “Voice of the Resident” • Is there a discrepancy between the voice of the resident and the other indicators? • For example, does a majority of residents say they would not choose the program again, even though the program received a five-year accreditation cycle with one or no citations?
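A minimal sketch of that discrepancy check, assuming hypothetical field names, thresholds, and values (the slide does not specify a schema):

```python
# Flag programs where the "voice of the resident" conflicts with the
# ACGME accreditation result. All field names, thresholds, and values
# below are illustrative assumptions, not actual Stanford data.
def voice_vs_acgme_discrepancy(program: dict) -> bool:
    residents_unhappy = program["pct_would_choose_again"] < 50  # majority would not
    acgme_looks_fine = (program["acgme_cycle_years"] >= 5
                        and program["acgme_citations"] <= 1)
    return residents_unhappy and acgme_looks_fine

programs = [
    {"name": "A", "pct_would_choose_again": 38, "acgme_cycle_years": 5, "acgme_citations": 1},
    {"name": "B", "pct_would_choose_again": 85, "acgme_cycle_years": 4, "acgme_citations": 2},
]

for p in programs:
    if voice_vs_acgme_discrepancy(p):
        print(f"Program {p['name']}: resident voice conflicts with ACGME result")
```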

  39. How Do We Use this Data? • How Do the Programs Compare Against Each Other? • How do they compare against their ACGME Cycles?

  40. Stanford’s Report Card – Three Programs

  41. What does this tell the DIO? (Program A) • Setting the stage: • 2008: ACGME gave a 5-year cycle with 1 citation • 2009: ACGME resident survey shows a “tanking program” • 2009: GME House Staff survey shows a “program in trouble” • Duty hour violations are more than 15 times the institutional average over a three-year period
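The duty-hour red flag is simple arithmetic: divide the program's violation rate by the institutional average and compare against the 15x figure from the slide. A sketch with placeholder numbers:

```python
# Ratio of a program's duty-hour violation rate to the institutional
# average. The 15x threshold is from the slide; the rates are placeholders.
program_rate = 4.6        # violations per resident-year (illustrative)
institutional_avg = 0.3   # violations per resident-year (illustrative)

ratio = program_rate / institutional_avg
if ratio > 15:
    print(f"Early warning: duty-hour violations are {ratio:.1f}x the institutional average")
```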

  42. Stanford’s Program A

  43. Stanford’s Program A

  44. What does this tell the DIO? • Educational Milieu • Intimidation and fear of retaliation – increasing • Lack of confidence in the program to confidentially deal with problems or concerns – increasing • Overall resident satisfaction with the program – decreasing • Would recommend: extremely low in 2008/09 (lowest of any SHC program) • Definitely an early warning…

  45. How Did We Use the Data? • Validated the data with resident interviews (individual and group) • You know you have a problem when the residents call and ask to meet you at Starbucks…

  46. Setting up an “Action Plan” • Met with program leadership • Shared data • Discussed “their” interpretation of the data • “Brain-stormed” with program leadership • “How can GME help you?”

  47. Program View on Situation • Need more help… more residents… more MD extenders… • Residents are the problem… they’re “not tough” and they “whine”…

  48. GME Evaluation of the Situation • Qualitative analysis of every comment on the internal house staff survey • Trend analysis of surveys • DIO used her training in conflict resolution • Report developed to define the problems • Shared with C-Suite
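The trend-analysis step can be as simple as a least-squares slope over the annual survey scores; a minimal sketch with illustrative numbers (not actual Stanford survey results):

```python
# Least-squares slope of an annual survey item: a sustained negative
# slope flags a declining program before any single year looks alarming.
def trend_slope(years, scores):
    n = len(years)
    mean_x, mean_y = sum(years) / n, sum(scores) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, scores))
    var = sum((x - mean_x) ** 2 for x in years)
    return cov / var

years = [2006, 2007, 2008, 2009]
overall_satisfaction = [4.1, 3.8, 3.2, 2.7]  # 5-point scale, illustrative
slope = trend_slope(years, overall_satisfaction)
if slope < -0.2:  # alert threshold is an assumption for this sketch
    print(f"Downward trend: {slope:.2f} points/year in overall satisfaction")
```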

  49. Action Plan • Used data regarding duty hours with the C-Suite to: • justify hiring additional physician extenders • reconfigure rotations • Used resident program satisfaction data with the Dean to: • leverage program change and “motivate” program leadership

  50. Results - Success? • Duty hours have improved • Waiting on resident satisfaction data from the 2012 survey (May)
