
Presentation Transcript


  1. Conflicts of Interest • Kim Walker – No conflicts of interest to disclose • Ann Dohn – No conflicts of interest to disclose

  2. SES032 Meaning beyond numbers: The power of qualitative inquiry for program assessment • Kim Walker, PhD, Education Specialist • Ann Dohn, MA, DIO & GME Director

  3. Session Outcomes • Describe the use and strengths of qualitative inquiry in program assessment. • Describe and perform the basic steps of qualitative inquiry including: deciding on a data collection tool, coding data for emergent themes, member checking and interviewer corroboration to establish validity. • Use emergent themes and supporting qualitative and quantitative data to provide constructive and actionable feedback to program leadership. • Compare and contrast the pros and cons of three software tools designed for qualitative inquiry.

  4. Yes, Numbers Matter! Hypothetical Institutional Report Card

  5. Qualitative Inquiry: Pros and Cons • Qualitative data = the “what,” “why,” and “how” of the resident and faculty perceptions and experiences that drive these ratings • Explores topics in more depth and detail than quantitative research • Qualitative data, while meaning-FULL, can also be unwieldy, making it difficult to reach any conclusion beyond “these residents nowadays are sure an entitled bunch of needy trophy kids!” • Findings cannot be generalized to a broader population

  6. Qualitative Data Mining

  7. Data Source: Surveys • Suited to larger populations where opinions matter and research subjects are likely to respond to closed and open questions • Important to keep the survey short and limit questions to the latent constructs of interest

  8. Data Source: Observations

  9. Data Source: Interviews • Interactive collection of participant perspectives • Types of interviews: structured (only pre-set questions), semi-structured, in-depth and unstructured • Uses: to clarify meaning, discuss progress/results, support exploratory work

  10. Data Source: Focus Groups • Combines interviewing and participant observation • Uses group interaction to generate data and gain first-hand insights

  11. Focus Groups vs. In-depth Interviews • Use focus groups when: group interaction may stimulate responses; the topic is amenable to less input from individual respondents; it is logistically feasible to assemble target respondents in one location; a quick turnaround is needed and funds are limited • Use in-depth interviews when: group/peer pressure would inhibit responses; the subject matter is too sensitive for group discussion; the topic warrants greater depth from individual respondents

  12. Recap: Sources of Qualitative Data • Surveys (GME or Program Specific) • Program Assessments (faculty and residents) • Observations (resident shadowing; rounding) • Interviews (individual or small groups) • Focus groups (larger consensus groups) • Document reviews (Internal reviews) • Case studies (holistic, multiple data sources) • Internet? Message boards? Residency chat rooms? What is the word on the “streets” about our programs?

  13. Data Collected: Now What?

  14. Data Coding: Approaching your data • Deductive: general → specific • Inductive: specific observation → general • Abductive: observation → plausible explanation and recommendations

  15. Coding Qualitative Data • Codes are: labels that assign symbolic meaning to the descriptive or inferential information compiled during the study; prompts or triggers for deeper reflection • Codes are developed using multiple approaches: • Deductive – provisional “start list” from the conceptual framework; preliminary codebook • Inductive – codes emerge progressively; no preliminary codebook • Abductive – based on early plausible explanations; a combination of deductive and inductive
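To make the deductive/inductive distinction concrete, here is a minimal Python sketch (not from the session; the codebook entries and excerpts are hypothetical): a provisional “start list” of codes is applied deductively, and any excerpt that matches no code is flagged so an emergent code can be added inductively.

# Minimal sketch of deductive coding with a provisional "start list".
# All codes, keywords, and excerpts are hypothetical illustrations.
START_LIST = {
    "mentoring": ["mentor", "career advice", "research guidance"],
    "faculty_engagement": ["faculty", "attending", "teaching"],
    "service_vs_education": ["scut", "paperwork", "service"],
}

def code_excerpt(excerpt, codebook):
    """Return every code whose keywords appear in the excerpt."""
    text = excerpt.lower()
    return [code for code, keywords in codebook.items()
            if any(k in text for k in keywords)]

excerpts = [
    "I never got career advice or a research mentor.",
    "Too much scut work and paperwork, not enough teaching.",
    "Night float scheduling is chaotic.",  # matches no start-list code
]

for e in excerpts:
    codes = code_excerpt(e, START_LIST)
    if codes:
        print(e, "->", codes)
    else:
        # Inductive step: an uncoded excerpt prompts a new, emergent code.
        print(e, "-> uncoded: consider adding an emergent code")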

  16. Tools for coding: Manual

  17. Data Coding: software supported

  18. From Text to Themes • Thorough, well-documented analysis • Increases opportunities for replication • Enhances credibility • Clarify whether broad categories (or domains) are categorized theoretically or logically • Show how findings relate to your theoretical framework

  19. Mining the Data Activity: Hypothetical data set

  20. Mining the Data Activity: Step 1 – Read each excerpt and highlight key words that capture the essence of the response.

  21. Mining the Data Activity: Step 2 – Summarize each response in a few descriptive terms in the “Individual Margin Coding Notes,” e.g., interns disconnected; needs: structure, organization, attention to detail; higher standards.

  22. Mining the Data Activity: Step 3 – Refine coding notes into themes identified across all responses/excerpts, e.g., notes such as “interns disconnected,” “needs: structure, organization, attention to detail,” and “higher standards” refine into the theme “structured, detailed curriculum.”

  23. Mining the Data Activity: Step 4 – Focus on key overall themes from the entire qualitative assessment.
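As a rough analogue of Steps 2–4, the following Python sketch rolls individual margin-coding notes up into cross-response themes and tallies them; the notes, the note-to-theme mapping, and the counts are invented for illustration.

from collections import Counter

# Hypothetical margin notes recorded per response (Step 2 output).
margin_notes = [
    ["interns disconnected", "needs structure"],
    ["needs structure", "higher standards"],
    ["higher standards", "attention to detail"],
]

# Hypothetical refinement of notes into themes (Step 3).
note_to_theme = {
    "interns disconnected": "intern integration",
    "needs structure": "structured, detailed curriculum",
    "attention to detail": "structured, detailed curriculum",
    "higher standards": "structured, detailed curriculum",
}

# Step 4: tally themes across all responses to surface the key ones.
theme_counts = Counter(
    note_to_theme[note] for notes in margin_notes for note in notes
)
for theme, count in theme_counts.most_common():
    print(f"{theme}: appears in {count} coded notes")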

  24. Promoting Qualitative Research Validity

  25. Promoting Qualitative Research Validity

  26. What the numbers say: Not Good!

  27. GME Plan • Qualitative analysis of: resident focus group; three years of Program and House Staff Evaluations; Internal Review Report • Summarize findings – focus on and prioritize the top emergent issues • Meet with program leadership to present findings • Follow-up: monitor subsequent-year surveys; follow-up focus group?

  28. From Analysis to Recommendations • Program Case Study • Areas of Concern • “Background Noise”

  29. From Analysis to Recommendations

  30. From Analysis to Recommendations • Program Case Study: focused on the top three emergent themes/categories of resident dissatisfaction with the learning experience, supported by quantitative and qualitative data • Lack of mentoring (research and career) • Minimal faculty engagement/camaraderie • Service over education (lack of support staff / need for additional residents)

  31. Delivering the Golden Nuggets • Provide concise, data-driven findings and recommendations • Who is receiving the information? Are they in a position to take action? • Define action plans, set goals and timelines • Check back in with leadership • Follow up by monitoring subsequent surveys and the focus group/interviewees

  32. From Analysis to Recommendations • Program Case Study: Recommendations • Lack of mentoring (research and career): implement a mentoring program • Minimal faculty engagement/camaraderie: focus on increasing faculty attendance at educational sessions (attendance tracked); plan an offsite retreat for faculty and residents • Service over education (lack of support staff / need for additional residents): hire additional support staff to handle “scut” work; expand the program to add residents • Next step: follow-up focus group review and 2014 survey analysis

  33. Concise, Data-Driven Summary: Applying quantitative metrics to qualitative data
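One common way to apply quantitative metrics to qualitative data is to report how many respondents were coded to each theme. The Python sketch below (respondent IDs and themes are hypothetical, not the program’s actual data) computes the share of respondents per theme so findings can be summarized as concise, data-driven counts.

from collections import defaultdict

# Hypothetical coded data: respondent ID -> themes coded in their comments.
coded_responses = {
    "R01": {"mentoring", "service over education"},
    "R02": {"mentoring"},
    "R03": {"faculty engagement", "service over education"},
    "R04": {"mentoring", "faculty engagement"},
}

respondents_per_theme = defaultdict(int)
for themes in coded_responses.values():
    for theme in themes:
        respondents_per_theme[theme] += 1

total = len(coded_responses)
for theme, n in sorted(respondents_per_theme.items(), key=lambda x: -x[1]):
    print(f"{theme}: {n}/{total} respondents ({n / total:.0%})")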

  34. Qualitative Software Tools

  35. Resources

  36. Questions Please feel free to contact us with any questions at 650-723-5948, kwalker5@stanford.edu, or adohn1@stanford.edu
