
Practical Strategies for Evaluating Change Resulting From CME






Presentation Transcript


  1. Practical Strategies for Evaluating Change Resulting From CME
     Van Harrison, PhD, University of Michigan
     Michael Fordis, MD, Baylor College of Medicine
     Jack Kues, PhD, University of Cincinnati
     Barbara Barnes, MD, University of Pittsburgh

  2. Overview
     1. Introduction to issues (Van Harrison)
     2. Multiple topic live courses (Michael Fordis)
     3. Enduring materials (Jack Kues)
     4. Regularly scheduled conferences (Barbara Barnes)
     5. Discussion
     Purpose: Help clarify questions

  3. Introduction to the Issues
     • General considerations
       – Planning evaluation studies
       – Outcome categories
       – Sources of data
       – Design, unit of analysis, & sampling
     • New ACCME criteria, esp. #11
       – Essential areas and new criteria
       – Interpretations
     • Future activities for clarification
       – Time frame
       – SACME role

  4. Planning Evaluation Studies
     • Clarify what you are trying to accomplish
       – Audience(s) for the evaluation report
       – Most important questions they want answered
       – How they will use the information
       – What information is needed for their purpose
     • Clarify specifically what you want to measure
       – Attitudes, knowledge, skills, competency, behavior, cost
       – From whose perspective (self, peer, independent source)
     • Assess your resources
       – Available data sources
       – Other resources: funding, skilled personnel, etc.
     • Choose your evaluation approach
       – Possibilities and trade-offs
       – Feasibility and practical constraints

  5. Outcome Categories
     • General categories
       – Participation
       – Satisfaction
       – Knowledge, attitudes, skills
       – “Competence” – ability to apply
       – Plan to apply
       – Performance
       – Effects (patient outcomes)
     • Cost and cost-effectiveness
       – Cost of measuring the outcome
       – Value or benefit of having the measure

  6. Sources of Data
     • Self-report (including reported “change”)
       – Attendance, satisfaction, knowledge, competence, behavior, patient outcomes
     • Externally referenced
       – Attendance records
       – Knowledge test
       – Simulation (vignette, device, patient)
       – Records
       – Peers
       – Etc.

  7. Design, Unit of Analysis, & Sampling
     • Design – note re “analyzing change”
       – Post intervention
       – Pre-post comparison (reported at post)
       – Randomized control
       – Time series
     • Unit of analysis
       – Program
       – CME activity
       – Participant
     • Sampling
       – Activities in program: all or some
       – Participants in activity: all or some
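The pre-post comparison design can be made concrete with a short worked example. The slides specify no data or tooling, so the scores and the choice of Python here are purely illustrative: a paired analysis of hypothetical knowledge-test scores for the same participants before and after a CME activity, with the participant as the unit of analysis.

```python
# Minimal sketch of a pre-post comparison for one CME activity.
# Scores are hypothetical; the slides do not supply data or code.
from math import sqrt
from statistics import mean, stdev

# Hypothetical knowledge-test scores for the same eight participants,
# measured before and after the activity (unit of analysis: participant).
pre = [62, 70, 55, 68, 74, 60, 66, 71]
post = [75, 78, 64, 70, 85, 72, 74, 80]

# Per-participant change is the quantity of interest in a paired design.
diffs = [b - a for a, b in zip(pre, post)]
mean_change = mean(diffs)

# Paired t statistic: mean change divided by its standard error.
t = mean_change / (stdev(diffs) / sqrt(len(diffs)))

print(f"mean change: {mean_change:.1f} points, paired t = {t:.2f}")
```

Because each participant serves as their own control, a paired analysis removes between-participant variability; as slide 9 notes, evaluation data like this need not meet research-grade rigor, only be useful for administrative decision-making.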

  8. New ACCME Criteria, Esp. #11
     New ACCME Criteria & Medical Schools
     1. The provider has a CME mission statement that includes all of the basic components (CME purpose, content areas, target audience, types of activities, expected results), with expected results articulated in terms of changes in competence, performance, or patient outcomes that will result from the program.
     11. The provider analyzes changes in learners (competence, performance, or patient outcomes) achieved as a result of the overall program’s activities/educational interventions.
     • Essential Areas
       – All still apply. E.g., Essential Area 2.4 – evaluate each activity; no new specification regarding evaluating each.

  9. Interpretations Relevant to #11
     1. ACCME distinguishes “evaluation” from “research.” Data need not be scientifically rigorous (e.g., high response rates), but should be reasonably useful for administrative decision-making.
     2. Self-report is an acceptable data source.
     3. ACCME has no view regarding the level of specificity needed in documenting change, e.g., for a three-day update course or a year-long weekly grand rounds series.
     4. Evaluate each activity, but analyze change in competence, performance, or patient outcomes across the overall program’s activities.
     5. Sampling can be used: providers within activities, activities within the program. (But all activities must be evaluated.)

  10. Interpretations Relevant to #11
     6. What is “competence”? (Ability to perform: a combination of knowledge, skills, and attitudes.) “‘Strategy’ – what [participants] would do if given the opportunity.” “Knowledge in action.”
        Example of a measure of “competence” from the ACCME Toolkit:
        “I will now incorporate the following new elements of [topic] into my practice:
        a. Practice 1
        b. Practice 2
        c. Practice 3
        d. All
        e. None”
        (“Assessing change in knowledge does not assess change in competence.”)
     7. A statement of an intent to change (adopt a new practice) is not a measure of performance.

  11. Future Activities for Clarification
     • Time frame – ACCME
       – Still developing operational interpretations
       – Begins operational instructions in July
       – Will make ongoing revisions based on experience
     • SACME role – medical schools
       – Examine operational issues and how to achieve them
       – Share suggestions and concerns with ACCME
       – Coordinate with other groups providing feedback
     • Examples of “big” questions
       – Measuring change in “competence” cheaply
       – Monitoring program for RSCs
       – Percent of activities to classify status (commendation)

  12. Now Consider Implications for Some Types of CME Activities
     2. Multiple topic live courses (Michael Fordis)
     3. Enduring materials (Jack Kues)
     4. Regularly scheduled conferences (Barbara Barnes)
     5. Discussion
