
Evidence-based Practice for Applied Behavior Analysts: Necessary or Redundant


Presentation Transcript


  1. Evidence-based Practice for Applied Behavior Analysts: Necessary or Redundant Ronnie Detrich, Wing Institute Tim Slocum, Utah State University Teri Lewis, Oregon State University Trina Spencer, Northern Arizona University Susan Wilczynski, Ball State University

  2. Goals for Today • Describe the basic concepts of evidence-based practice. • Describe an integrated decision-making framework.

  3. Two Ways of Thinking about EBP • An intervention found to have strong research support. (Cook, Tankersley, & Landrum, 2009) • A decision-making process that informs all professional decisions. (Sackett, Straus, Richardson, Rosenberg, & Haynes, 2000)

  4. Two Ways of Thinking about EBP • Using the same term (EBP) to describe two different constructs creates confusion: • Empirically supported treatments (ESTs): interventions that meet defined standards of evidence quality and quantity. • Evidence-based practice: a decision-making process.

  5. Evidence-based Practice and Applied Behavior Analysis • EBP is a framework for guiding practitioners' decisions. • Decisions are based on the integration of: • Best available evidence • Client values and context • Professional judgment • Consistent with the foundational principles of applied behavior analysis: • Data-based decision making • Consideration of client values • Consideration of contextual fit • Commitment to research-based treatment

  6. Evidence-based Practice and the Research-Practice Gap • Across disciplines, there is great concern about the discrepancy between what research shows about effective treatments and the interventions practitioners routinely employ. • EBP is the basis for closing the gap.

  7. How Does EBP Narrow the Research-to-Practice Gap? • Practitioners must make decisions “now,” even when research evidence is absent or incomplete. • What is to be the basis for decisions? • Decisions informed by the best available evidence allow practitioners to: • Select interventions • Adapt them to fit local circumstances • Modify them on the basis of progress-monitoring data • These decisions require professional judgment.

  8. Dilemma for Practitioners • Practitioners must make many decisions daily. • EBP assumes there is evidence for all decisions and that the relevant evidence is accessible. • How do practitioners incorporate evidence into all decisions?

  9. The Challenge of Best Available Evidence

  10. Goal • The best available evidence should pervade the practice of Applied Behavior Analysis. • What is the “Best Available Evidence” for ABA practice? • How do we systematically identify it?

  11. Best Available Evidence • What do we mean by “best”? • Quality: research methods and outcomes • Relevance: close match with our applied question in terms of: • Participants • Treatment • Outcomes • Context • Amount: number of participants, studies, investigators

  12. Best Available Evidence [Figure: evidence plotted on two axes, Quality (low to high) against Relevance to Participants, Treatment, Outcomes, and Context (low to high); empirically supported treatments occupy the high-quality, high-relevance corner, and evidence gets better toward that corner.]

  13. Best Available Evidence • What do we mean by “best available”? • We should use the best of what is available. • This may mean using extremely high-quality evidence, • or it may mean using evidence that is less certain. • “Unlimited skepticism is equally the child of imbecility as implicit credulity.” (Dugald Stewart)

  14. Best Available Evidence [Figure: the same Quality-by-Relevance (P, T, O, C) grid, shown without annotations.]

  15. Best Available Evidence [Figure: the Quality-by-Relevance grid with regions marked: empirically supported treatments in the high-quality, high-relevance corner; high-quality but low-relevance evidence labeled “need to generalize (uncertainty)”; high-relevance but lower-quality evidence also marked with uncertainty.]
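To make the Quality-by-Relevance figures above concrete, here is a minimal Python sketch of ranking candidate evidence. It is not taken from the presentation: the 0-1 ratings, the weights, and the composite score are all invented assumptions, but the score mirrors the figures in that evidence must rate well on both quality and relevance (to P, T, O, C) to land in the “better evidence” corner.

```python
from dataclasses import dataclass

@dataclass
class Evidence:
    """One study or review, rated 0.0-1.0 on each dimension (hypothetical ratings)."""
    source: str
    quality: float      # rigor of research methods and outcomes
    relevance_p: float  # match of Participants to the applied question
    relevance_t: float  # match of Treatment
    relevance_o: float  # match of Outcomes
    relevance_c: float  # match of Context
    amount: float       # number of participants, studies, investigators

    def relevance(self) -> float:
        # Averaging the four facets equally is a simplifying assumption.
        return (self.relevance_p + self.relevance_t +
                self.relevance_o + self.relevance_c) / 4

def rank_best_available(candidates: list[Evidence]) -> list[Evidence]:
    """Rank by a hypothetical composite: quality * relevance, so evidence must
    be strong on BOTH axes to score well; amount is a small tiebreaker."""
    return sorted(candidates,
                  key=lambda e: e.quality * e.relevance() + 0.2 * e.amount,
                  reverse=True)

# Example: a rigorous study of a different population vs. a less rigorous
# study that closely matches the current client and setting.
for e in rank_best_available([
    Evidence("Large RCT, different population", 0.9, 0.3, 0.8, 0.6, 0.4, 0.9),
    Evidence("Single-case study, close match", 0.6, 0.9, 0.9, 0.8, 0.9, 0.3),
]):
    print(e.source)
```

Either study can come out on top depending on the weights chosen; the point is only that “best available” is a comparison along both axes, not a fixed quality bar.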

  16. The practical problem • Practitioners must often make decisions with insufficient empirical support. • What are they to do? • Make the best possible inferences from imperfect evidence? • Make decisions without using systematic evidence?

  17. The practical problem • If Evidence-Based Practice of ABA is to be a pervasive model for professional decision-making… then we need ways to identify the best available evidence when the evidence is imperfect.

  18. Accessing the Best Available Evidence • Systematic reviews – the foundation • Identify empirically supported treatments • Alternative types of review • Improve our ability to glean recommendations from an imperfect literature • Other units of practice – beyond the package • Using what we know about intervention elements and principles • Progress monitoring • The best evidence about what works for this particular case

  19. 1. Systematic reviews

  20. Systematic Reviews • Systematic EBP review (e.g., WWC, BEE, NAC) • Establish standards for: • Identifying research base • Participants • Interventions • Comparisons • Outcomes • Settings • Quality of evidence • Quantity of evidence • Unit typically limited to “programs” (treatment packages)
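As one way to picture what “establishing standards” means operationally, the sketch below encodes hypothetical inclusion criteria as a screening filter. The field names and criterion values are invented for illustration; they are not the actual standards of WWC, BEE, or NAC.

```python
from dataclasses import dataclass

@dataclass
class StudyRecord:
    participants: str
    intervention: str
    comparison: str
    outcome: str
    setting: str
    design_rating: str  # quality-of-evidence rating assigned by reviewers

# Hypothetical standards, fixed before the literature search begins:
STANDARDS = {
    "participants": {"elementary students"},
    "intervention": {"self-monitoring"},
    "comparison": {"business as usual"},
    "outcome": {"on-task behavior"},
    "setting": {"general education classroom"},
    "design_rating": {"meets standards", "meets standards with reservations"},
}

def include(study: StudyRecord) -> bool:
    """A record enters the review only if every pre-specified standard is met;
    quantity of evidence is then judged from how many records survive screening."""
    return all(getattr(study, field) in allowed
               for field, allowed in STANDARDS.items())

print(include(StudyRecord("elementary students", "self-monitoring",
                          "business as usual", "on-task behavior",
                          "general education classroom", "meets standards")))  # True
```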

  21. Systematic Reviews • Strengths: reduced bias; transparency; objectivity; rigorous methods; reduced risk of false positives. • Limitations: exclusive reliance on high-quality evidence; often fail to identify sufficient evidence; not informed by lower-quality evidence; higher risk of false negatives.

  22. 2. Alternative types of reviews

  23. a. What Works Practice Guides • Practice guides provide: • expert recommendations and • a specific level of supporting evidence. • Allow broader generalization from specific studies • Allow for expert recommendations on topics where evidence is sparse

  24. a. What Works Practice Guides • Strengths: expert interpretation of research; includes “level of evidence” ratings based on systematic review. • Limitations: expert interpretation may introduce bias and increase uncertainty.

  25. b. Best Practice Panels • Best practice panel • Panel selection • Organization selects “expert” panelists • Panel can be broadly or narrowly constructed • Key to validity • Face validity • Functional aspects of validity • Recommendations based on group consensus

  26. b. Best Practice Panels • Strengths: may include diverse perspectives (researcher, practitioner, consumer); allows for interpretation of research; tends to include factors other than research (social validity? contextual fit?). • Limitations: may lack transparency in the selection of the panel and in the criteria for “best practice” (politics or science?); the tendency to include factors other than research may introduce bias.

  27. c. Narrative Reviews • Narrative review (e.g., APA Handbook of Behavior Analysis; NASP Best Practices) • Experts review the research base to establish recommendations • Allows experts to draw on a wide range of evidence • Allows for expert interpretation of patterns of relevant findings

  28. c. Narrative Reviews • Strengths: allows for broad generalization from specific studies to implications for practice; can incorporate many sources of evidence and logic/theory. • Limitations: no methodology to reduce bias in selecting experts, defining the relevant research base, setting criteria for “best practice,” or rating the strength of evidence.

  29. 3. Other units of practice

  30. What is a “treatment”? • The best available evidence can validate: • Comprehensive schoolwide systems • Multi-component instructional or behavior packages • Specific components/elements/kernels • Principles of behavior and learning

  31. a. Practice Elements and Kernels • Examples • Differential praise, self-monitoring, frequent student responses with corrections • How they are validated • Kernels: kernels are implemented and outcomes are measured. • Practice elements: effective treatment packages are analyzed; practice elements are the components included in most effective packages.
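A minimal sketch of the practice-element validation step just described, assuming a hypothetical set of coded packages: components are tallied across packages flagged as effective, and the most frequent components are the candidate practice elements.

```python
from collections import Counter

# Hypothetical coded literature: each package lists its components
# and whether the review judged it effective.
packages = [
    {"effective": True,  "components": {"differential praise", "self-monitoring"}},
    {"effective": True,  "components": {"self-monitoring", "frequent responses"}},
    {"effective": True,  "components": {"self-monitoring", "differential praise"}},
    {"effective": False, "components": {"frequent responses"}},
]

# Count how often each component appears in the effective packages only.
counts = Counter(component
                 for package in packages if package["effective"]
                 for component in package["components"])
print(counts.most_common())  # self-monitoring recurs most: a candidate practice element
```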

  32. a. Practice Elements & Kernels • Strengths: more flexible than multi-component programs; may be used to assemble custom interventions directed at specific problems; can supplement information on multi-component packages. • Limitations: a set of effective components may not produce an effective package; assembling a package is time-consuming and requires a high level of skill.

  33. b. Principles of Learning and Behavior • Examples • Differential reinforcement, extinction • Principles of using examples and non-examples • How they are validated • Numerous studies across wide variety of populations and contexts.

  34. b. Principles of Learning and Behavior • Strengths: the most basic and flexible guides to intervention; apply to a huge range of educational problems, including modifications and adaptations; can supplement evidence on packages and components. • Limitations: do not provide specific interventions; principles must be applied, and this process is uncertain; application requires an extremely high level of expertise.

  35. 4. Progress monitoring

  36. 4. Progress Monitoring • Progress monitoring can validate: • The specific treatment (as modified) • with the specific client • on the specific outcomes • in the specific context • No other evidence can substitute.

  37. 4. Progress Monitoring • Strengths: the best evidence on whether this particular program is working; provides the basis for data-based decision making. • Limitations: does not help with the initial selection of treatments or with selecting modifications.
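The data-based decision making that progress monitoring supports can be written down as a simple rule. The aim-line construction and the three-consecutive-points convention below are common progress-monitoring heuristics, used here as assumptions rather than anything prescribed in the presentation; the scores and goal are invented.

```python
def aim_line(baseline: float, goal: float, total_weeks: int, week: int) -> float:
    """Expected score at a given week, on a straight line from baseline to goal."""
    return baseline + (goal - baseline) * week / total_weeks

def decision(scores: list[float], baseline: float, goal: float,
             total_weeks: int, rule_n: int = 3) -> str:
    """If the last rule_n weekly data points all fall below the aim line,
    the data suggest modifying the intervention; if all fall above,
    consider raising the goal; otherwise continue and keep monitoring."""
    if len(scores) < rule_n:
        return "not enough data yet"
    recent = scores[-rule_n:]
    expected = [aim_line(baseline, goal, total_weeks, w)
                for w in range(len(scores) - rule_n + 1, len(scores) + 1)]
    if all(s < e for s, e in zip(recent, expected)):
        return "modify the intervention"
    if all(s > e for s, e in zip(recent, expected)):
        return "consider raising the goal"
    return "continue and keep monitoring"

# Example: weeks 1-6 of a 10-week plan; growth is flat relative to the aim line.
print(decision([22, 24, 23, 25, 24, 26], baseline=20, goal=50, total_weeks=10))
# -> "modify the intervention"
```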

  38. Conclusion • The best available evidence to support pervasive evidence-based practice can be derived from: • Empirically supported treatments • Evidence from alternative types of reviews • Evidence on alternative units of practice • And should always include progress monitoring.

  39. Client Values and Context Teri Lewis Oregon State University

  40. Beyond ESTs • Even when ESTs exist, practitioners may not choose these interventions and/or implement them when they are recommended. • In education, innovations come and go within 18-48 months (Latham, 1988).

  41. So, if we have the right answers, why aren't ESTs adopted and sustained? • Face validity • Client values and context need to be included and respected

  42. Historical Perspective • Baer, Wolf, & Risley (1968) • Immediate and important change in behavior that has practical value, as determined “…by the interest which society shows in the problems” (p. 92) • Wolf (1978) • Social significance of the goals (societal value) • Social appropriateness of the procedures (treatment acceptability) • Social importance of the effects (consumer satisfaction)

  43. “Client” • Individual(s) who are invested in the outcomes and/or are critical to the behavior change process (e.g., Baer et al., 1968; Heward et al., 2005) • The individual who is the focus of the behavior change • Parents and family members, teachers, mentors, colleagues, employers • An organization, society

  44. In contrast to EBP, the BACB (2010) relies on a narrow definition • The individual receiving services from a behavior analyst • The decision about who is a client is based on: • Ethics • Acceptability & fidelity • Effectiveness • Maintenance and sustainability

  45. Context • Just as we focus on behavior within an environmental context, we need to consider the context for the selection and implementation of interventions • Contextual fit (e.g., Albin et al., 1996) • Values, skills, goals, and stressors of the implementers and those impacted by the target behavior

  46. Implementation and Acceptability • Motivating operations such as values, goals, and stressors • Compatibility with other aspects of the context (e.g., routines, resources, policies) • Reinforcement and punishment associated with implementation • Clients ultimately decide whether the intervention's effects are judged successful (Wolf, 1978)

  47. Strain, Barton, & Dunlap (2012) assert that successfully incorporating client values informs decision-making about: • The design of service delivery • Technical support to key implementers • When to fade intervention components • Identification of unanticipated events • The focus of future research needs

  48. Ethical Perspective • Three basic and fundamental questions (Cooper, Heron & Heward, 2007) • What does it mean to be a behavior analyst? • What is the right thing to do? • What is worth doing? • Social Validity • Client Values

  49. Summary • Incorporating client values into assessment, intervention selection, and implementation: • Respects all individuals' perspectives • Increases acceptability • Improves decision-making • Increases fidelity and sustainability

  50. Specifically, client values inform the selection of a treatment and the methods of treatment • The art of behavioral interventions
