
Evidence-based Professional Development

Presentation Transcript


  1. Evidence-based Professional Development Chris Borgmeier, Ph.D. Michelle Duda, Ph.D. Melissa Van Dyke, LCSW 2011 SPDG Regional Meeting

  2. A Few Questions: • What do we mean when we use the term, “professional development?” • What outcomes are we attempting to achieve? • What do we know about the professional development strategies that are likely to achieve particular outcomes? • How can we monitor and improve the quality of professional development over time?

  3. Common PD Challenges. Overcoming Barriers with: • Common language • Common frameworks • Defined approach (best practices) • Common measures (commitment to continuous quality improvement)

  4. Overcoming Barriers: Common Language

  5. Training • “the process of bringing a person to an agreed standard of proficiency, by practice and instruction” http://www.thefreedictionary.com/training • “the acquisition of knowledge, skills, and competencies as a result of the teaching of vocational or practical skills and knowledge that relate to specific useful competencies.” http://en.wikipedia.org/wiki/Training

  6. Adult Learning Defined • “a collection of theories and methods for describing the conditions under which the processes of learning are optimized (Merriam, 2001; Trotter, 2006; Yang, 2003).” Trivette, Dunst, Hamby, O’Herin, 2009, p. 1

  7. Professional Development "Professional development ... goes beyond the term 'training' with its implications of learning skills, and encompasses a definition that includes formal and informal means of helping teachers not only learn new skills but also develop new insights into pedagogy and their own practice, and explore new or advanced understandings of content and resources.” Modified from http://www.ncrel.org/sdrs/areas/issues/educatrs/profdevl/pd2prof.htm

  8. Professional Development (cont.) “. . . [This] definition of professional development includes support for teachers as they encounter the challenges that come with putting into practice their evolving understandings about the use of new skills and technology.” Modified from http://www.ncrel.org/sdrs/areas/issues/educatrs/profdevl/pd2prof.htm

  9. Overcoming Barriers: Common Frameworks

  10. Implementation Science: Longitudinal Studies of a Variety of Comprehensive School Reforms (Aladjem & Borman, 2006; Vernez, Karam, Mariano, & DeMartini, 2006)

  11. What Works: Implementation Drivers, the common features of successful supports that help make full and effective use of a wide variety of innovations

  12. Why: Improved Outcomes for . . . What: Program/Initiative (set of practices). How: Core Implementation Components, consisting of Competency Drivers (staff capacity to support students/families with the selected practices), Organization Drivers (institutional capacity to support staff in implementing practices with fidelity), and Leadership (capacity to provide direction/vision of the process). © Fixsen & Blase, 2008

  13. Why: Improved Outcomes for . . . What: Program/Initiative (set of practices). How: Core Implementation Components, consisting of Competency Drivers (Professional Development), Organization Drivers (institutional capacity to support staff in implementing practices with fidelity), and Leadership (capacity to provide direction/vision of the process). © Fixsen & Blase, 2008

  14. Improved Outcomes for . . . Program/Initiative (set of practices). Competency Drivers: Selection, Training, Coaching, Performance Assessment (Fidelity). Organization Drivers: Decision Support Data System, Facilitative Administration, Systems Intervention. Leadership: Technical, Adaptive. The drivers are Integrated & Compensatory. © Fixsen & Blase, 2008

  15. Why Focus First on Teachers’ Behavior? • In education, the Teacher IS THE INTERVENTION • Build supports in relation to what the Teacher needs in order to be competent • Create the conditions under which teachers can do the right thing for the right reason at the right time to maximize positive results • Wide-ranging inputs (individuals with all their past history and current realities) • Align implementation practices and outcomes with what needs to happen at the point of the “learning exchange”

  16. Participants in Professional Development • Think about those who are the participants in (receivers of) your professional development activities… • What features would you suggest be in place around the Competency, Organization and Leadership drivers so that professional development is provided in a “host environment” to support sustainability and promote fidelity? (Diagram: Participants with improved knowledge and skills around identified practice(s) → Improved outcomes for students)

  17. Providers of Professional Development • Think about those who are the providers of your professional development activities… • What features would you suggest be in place around the Competency, Organization and Leadership drivers so that your providers conduct quality professional development? (Diagram: Professional development providers → Participants with improved knowledge and skills around identified practice(s))

  18. Overcoming Barriers: Defined Approach “Best Practices”

  19. The BIG QUESTION: What outcomes are you hoping to achieve?

  20. Use training “alone” strategies if… • You want to disseminate new information • You want to create “buy in” • You want to clarify “truths” and dispel “myths” • You are talking with a broad audience

  21. Use training “alone” strategies … • To increase knowledge about the effective program or practice related to. . . • underlying theory of change • intervention components • rationales of key practices • To increase familiarity with the use of new skills • To increase awareness of what it feels like to begin to use new skills and to receive expert feedback

  22. Training “Best Practices” • Theory grounded (adult learning) • Skill-based • Data-based (pre and post testing) • Feedback to Selection and Feed Forward to Supervision • Trainers have been trained and coached

  23. Training “Best Practices” • Theory grounded (adult learning) • Skill-based • Feedback to Selection and Feed Forward to Supervision • Data-based (pre and post testing) • Trainers have been trained and coached What do we know about each of these “best practices?”

  24. Adult Learning “Best Practices” • The most effective training includes learner experiences related to planning, application, and deep understanding • Use a diverse array of Adult Learning methods (“where 5 or 6 adult learning method characteristics were used, the average effect size was almost 1.25”; see the effect-size note below) • Learners need to be engaged actively in the learning process • Multiple learning experiences, large doses of learner self-assessment and reflection, instructor-facilitated learner assessment • Small numbers of participants, multiple occasions (Trivette, Dunst, Hamby, O’Herin, 2009, pp. 10-11)
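
For readers less familiar with the metric, the “effect size” quoted above is a standardized mean difference. A common form is Cohen’s d; the formula below is a general reminder of what the number expresses, not a calculation reproduced from Trivette et al.:

```latex
% Standardized mean-difference effect size (Cohen's d):
% the difference between group means divided by the pooled standard deviation.
d = \frac{\bar{X}_{\text{more methods}} - \bar{X}_{\text{fewer methods}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
```

On this scale, an average effect size of about 1.25 means that learners who experienced five or six of the adult learning method characteristics scored, on average, roughly one and a quarter pooled standard deviations above the comparison mean.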

  25. Adult Learning “Best Practices” Trivette, Dunst, Hamby, O’Herin found best practices in each of the following categories: • Introducing Information • Illustrate/Demonstrate • Practicing • Evaluation • Reflection • Mastery (Trivette, Dunst, Hamby, O’Herin, 2009, pp. 6-8)

  26. Skill-based “Best Practices” • Behavior Rehearsals (vs. Role Plays) • Knowledgeable Feedback Providers • Practice to Criteria

  27. Data-Based “Best Practices” • Develop and use pre/post tests to determine to what extent knowledge and skill levels are being improved (a minimal scoring sketch follows below) • Outcome data collected and analyzed • Fidelity measures collected and analyzed
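
As a concrete illustration of the first bullet, here is a minimal Python sketch that scores matched pre- and post-training assessments and reports the average gain. The function name, field names, and example scores are hypothetical, not part of any SPDG instrument:

```python
from statistics import mean, stdev

def summarize_pre_post(pre_scores, post_scores):
    """Summarize matched pre/post training assessment scores.

    pre_scores and post_scores hold one number per participant,
    in the same order (illustrative data, not an official measure).
    """
    if len(pre_scores) != len(post_scores):
        raise ValueError("Each participant needs both a pre and a post score.")

    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return {
        "participants": len(gains),
        "mean_pre": mean(pre_scores),
        "mean_post": mean(post_scores),
        "mean_gain": mean(gains),
        # Spread of individual gains; needs at least two participants.
        "gain_sd": stdev(gains) if len(gains) > 1 else 0.0,
        "share_improved": sum(g > 0 for g in gains) / len(gains),
    }

# Example with made-up knowledge-test scores (0-20 scale):
print(summarize_pre_post(pre_scores=[8, 11, 9, 12], post_scores=[14, 15, 13, 17]))
```

A summary like this, produced for every training event, is what makes the “feed forward to coaching/supervision” and “feedback to selection” loops described on the surrounding slides concrete.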

  28. Staff Training Collins, S. R., Brooks, L.E., Daly, D.L., Fixsen, D.L., Maloney, D.M., & Blase, K. A. (1976)

  29. Feedback “Best Practices” • Feed Forward of pre/post data to Coaches/Supervisors • Feedback of pre/post data to Selection and Recruitment

  30. Feedback “Best Practices” Of course, there are even more opportunities to benefit from feedback…

  31. Improved Outcomes for training participants. Effective Training Practices. Competency Drivers: Selection, Training, Coaching, Performance Assessment (Fidelity). Organization Drivers: Decision Support Data System, Facilitative Administration, Systems Intervention. Leadership: Technical, Adaptive. The drivers are Integrated & Compensatory. © Fixsen & Blase, 2008

  32. Implementation-informed PD • If Professional Development is in support of a well-defined, effective intervention . . . then Professional Development includes: • Training • Coaching • Performance assessment • Selection

  33. Training, plus coaching • Some well-defined interventions (EBPs) will require additional support for teachers after training: • Joyce and Showers (2002) data • With coaching = 95% use in classrooms • Without = 5% use in classrooms • Rogers, Wellens, & Conner (2002) data • About 10% of what is taught in business workshops is actually put into practice • Significant data across domains show that feedback improves performance

  34. Coaching “Best Practices” • Design a Coaching Service Delivery Plan • Use multiple sources of information for feedback – Direct observation is critical • Provide regular feedback to all “Drivers” • Develop accountability structures for Coaching – Coach the Coaches! • Regular review of adherence to Coaching Service Delivery Plan • Multiple sources of information for supervisor feedback

  35. Performance Assessment Best Practices • Transparent Processes – Orientation • What, When, How, Why • Use of Multiple Data Sources • Context • Content • Competency • Tied to positive recognition – not used ‘punitively’

  36. Selection “Best Practices” • Job description clarity about accountability and expectations • Prerequisites are related to the “new practices” and expectations (e.g., basic group management skills) • Interactive Interview Process: • Behavioral vignettes and Behavior Rehearsals • Assessment of ability to accept feedback • Assessment of ability to change own behavior

  37. Enhancing Professional Development • Promotes adult learning and changes adult behavior • Empowers individuals to improve their craft • Ensures that new skills are used and/or current skills are refined • Integrates selection, training, coaching, and performance assessment to promote the development of a highly effective workforce • Supports the creation of a “hospitable environment” to allow for these new behaviors to be used and improved

  38. Why: Improved Outcomes for . . . What: Program/Initiative (set of practices). How: Core Implementation Components, consisting of Competency Drivers (Professional Development), Organization Drivers (institutional capacity to support staff in implementing practices with fidelity), and Leadership (capacity to provide direction/vision of the process). © Fixsen & Blase, 2008

  39. Discussion Question To what extent are current approaches to professional development designed to develop the necessary competence in participants to skillfully use the most effective educational approaches to accomplish the goal of improved outcomes for students?

  40. Overcoming Barriers: Common Measures “Commitment to Continuous Quality Improvement”

  41. Integrating All We Know • A well-defined and effective intervention • Clarity about the “active ingredients” • Clear criteria to assess performance (that correlates with the desired outcomes) • An effective approach to training • An effective coaching system • An effective performance assessment process • Use of selection “best practices”

  42. Improvement Cycles: Plan (Operationalize) → Do (Trial) → Study (Assess/Review) → Act (Adjust). (A sketch of how one such cycle might be recorded follows below.)
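
The following is a minimal sketch of how a team might record one Plan-Do-Study-Act turn for a professional development activity. The class name, fields, and example details are illustrative assumptions, not drawn from a published SPDG instrument:

```python
from dataclasses import dataclass

@dataclass
class ImprovementCycle:
    """One Plan-Do-Study-Act turn for a professional development activity."""
    plan: str           # Operationalize: what will be tried and how it will be measured
    do: str = ""        # Trial: what was actually done
    study: str = ""     # Assess/Review: what the pre/post and fidelity data showed
    act: str = ""       # Adjust: what changes in the next cycle
    completed: bool = False

# Example: one cycle focused on adding behavior rehearsals to a training
# (all details below are made up for illustration).
cycle = ImprovementCycle(
    plan="Add two behavior rehearsals per module; track skill-check pass rates."
)
cycle.do = "Delivered the revised training to 18 participants."
cycle.study = "Skill-check pass rate rose from 60% to 85%."
cycle.act = "Keep the rehearsals; shorten the lecture segment next cycle."
cycle.completed = True
```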

  43. Improving PD Effectiveness • Collect and use data (pre/post tests to determine to what extent knowledge and skill levels are being improved) • Develop the capacity of the trainers (establish clear performance criteria, based on “what works,” then select, train, coach, and assess trainers to criteria) • Improve the ways in which training participants are “prepared” for training (create readiness)

  44. What did we learn? • What do we mean when we use the term, “professional development?” • What outcomes are we attempting to achieve? • What do we know about the professional development strategies that are likely to achieve particular outcomes? • How can we monitor and improve the quality of professional development over time?

  45. For More Information Melissa Van Dyke melissa.vandyke@unc.edu Michelle Duda duda@unc.edu Chris Borgmeier cborgmei@pdx.edu

  46. For More Information State Implementation and Scaling up of Evidence-based Practices (SISEP) Dean Fixsen, Karen Blase, Rob Horner, George Sugai www.scalingup.org • Concept paper • Annotated bibliography • Data on scaling up • Scaling up Briefs

  47. www.implementationconference.org

  48. Evidence-based: Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231). Download all or part of the monograph at: http://www.fpg.unc.edu/~nirn/resources/detail.cfm?resourceID=31

  49. Thank You for your Support • Annie E. Casey Foundation (EBPs and cultural competence) • William T. Grant Foundation (implementation literature review) • Substance Abuse and Mental Health Services Administration (implementation strategies grants; national implementation awards) • Centers for Disease Control & Prevention (implementation research) • National Institute of Mental Health (research and training grants) • Juvenile Justice and Delinquency Prevention (program development and evaluation grants) • Office of Special Education Programs (scaling up capacity development center) • Administration for Children and Families (child welfare reform; capacity development) • Duke Endowment (child welfare reform)
