
Theme 1. The practitioner/scientist dilemma


Presentation Transcript


  1. Theme 1. The practitioner/scientist dilemma
  • Determining the intervention and the research design is an iterative process. Each choice affects experimental control, the nature of the intervention, its feasibility, and the costs of both the intervention and its evaluation.
  • The resulting decision influences the probability of funding, the treatment effect, and replication of the study by other researchers.

  2. The practitioner/scientist dilemma: a growing problem
  • The scientist
    • Attempts to get agreement about what constitutes evidence
    • Levels of evidence are increasingly determined by design
  • The public health practitioner
    • Finds it difficult to achieve sustainable, worthwhile change in relevant behaviours
    • Has consequently shifted from individual, single-intervention strategies to a systems approach: schools, school districts, police commands, emergency departments, hospitals, census districts, towns, states, SEC, indigenous groups, countries

  3. Theme 2. Limitations of RCTs in evaluating system interventions
  • Difficult to recruit sufficient systems, depending on their size and degree of collaboration (8-17)
  • High cost, driven by the number of units needed for statistical power; such trials often spend more on evaluation than on the intervention itself (see the sketch after this list)
  • Cost limits the total number of studies that can be funded, and funders are reluctant to support similar studies, so there is no replication
  • The length of time/funding may not allow a large enough effect to be observed
  • Inability to change the intervention based on interim results
  • Encourages, and is appropriate for, an individual intervention approach
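
The cost bullet turns on simple arithmetic: once whole systems rather than individuals are randomised, the design effect inflates the required sample size. A minimal sketch, using hypothetical numbers (the effect size, cluster size, and ICC below are assumptions, not figures from the talk):

```python
# Minimal sketch: why system-level trials need so many units.
# Standard two-sample normal-approximation power formula, inflated by the
# design effect DEFF = 1 + (m - 1) * ICC for cluster randomisation.
from math import ceil
from scipy.stats import norm

def clusters_per_arm(effect_size, m, icc, alpha=0.05, power=0.8):
    """Clusters (schools, towns, ...) per arm for a two-arm cluster trial."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    n_individual = 2 * (z / effect_size) ** 2   # per arm, if randomising people
    deff = 1 + (m - 1) * icc                    # variance inflation from clustering
    return ceil(n_individual * deff / m)

# Hypothetical: a modest effect (d = 0.3), 100 people per town, ICC = 0.05
print(clusters_per_arm(0.3, m=100, icc=0.05))   # -> 11 towns per arm
```

Under the same assumptions, individual randomisation would need about 175 people per arm; clustering multiplies that to roughly 1,100 per arm, which is exactly the recruitment and cost problem the slide describes.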

  4. Theme 3. Trade-off
  • Do we fit the intervention to the perceived gold-standard research design, thereby changing the nature of the intervention and decreasing the potential size of the outcome effect, or do we find alternative acceptable research designs?
  • One design does not fit all intervention questions. Different research designs are appropriate for answering different questions, determined in part by the unit of study, the feasibility and severity of the proposed change, and the available funds.

  5. Theme 4. Does the design allow decisions about whether:
  • Change occurred
  • The change is a result of the intervention, not of other factors
  • The change is significant. To whom? (a toy illustration follows this list)
    • Statistically; clinically/in public health terms; in cost-effectiveness; to consumer groups; to policy makers; politically
    • Answers may vary with perceptions of the burden imposed by the behaviour and the ease of changing it
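
A toy illustration of the "significant to whom?" point (all numbers invented): with a large enough sample, a difference far too small to matter clinically still yields a tiny p-value, so statistical significance alone cannot settle the question.

```python
# Illustrative only: statistical vs. clinical/public health significance.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
control = rng.normal(100.0, 15.0, 50_000)   # e.g. a score with SD = 15
treated = rng.normal(100.5, 15.0, 50_000)   # half a point: clinically trivial

stat, p = ttest_ind(treated, control)
print(f"p = {p:.1e}, mean difference = {treated.mean() - control.mean():.2f}")
# Highly "significant" p-value despite a negligible effect.
```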

  6. [Figure: Effect of random breath testing in N.S.W. Chart annotations: seatbelts introduced Oct 1970; RBT introduced Dec 1982.]
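
The RBT chart is the classic natural-experiment picture, and the standard analysis behind it is segmented regression on an interrupted time series: fit a pre-intervention level and trend, then test for a level change and a trend change at the policy date. A hedged sketch on synthetic data (the real NSW fatality series is not reproduced here):

```python
# Interrupted time series via segmented regression (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
t = np.arange(120)                 # months
post = (t >= 60).astype(float)     # policy begins at month 60
t_post = post * (t - 60)           # time elapsed since the policy
# Simulated series: slow pre-trend, sharp drop at the policy, steeper decline after.
y = 50 - 0.05 * t - 8 * post - 0.1 * t_post + rng.normal(0, 2, t.size)

X = sm.add_constant(np.column_stack([t, post, t_post]))
fit = sm.OLS(y, X).fit()
print(fit.params)   # [baseline level, pre-trend, level change, trend change]
```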

  7. Comparisons of designs

  8. Theme 5. Research methodology
  • Generalisability
    • Selection of the study population
    • Potential bias
    • Consent
    • Attrition
    • Social desirability / group effects
  • Measures
    • Which measures are used, when, and how robust they are
  • Treatment issues
    • Adherence by providers and the consumer group
    • Contamination across groups *
    • Whether ethical replication of the intervention is possible
    • Cost-effectiveness? *
  • Analysis
    • Mission vs. component analysis *
    • Sample size *
    • Analytical technique * (a cluster-aware sketch follows this list)
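
On the sample-size and analytical-technique points: system interventions yield correlated observations within each randomised unit, so the analysis must respect the clustering rather than treat individuals as independent. One common choice, sketched on simulated data (the variable names and effect sizes are assumptions), is GEE with an exchangeable working correlation:

```python
# Cluster-aware analysis sketch: GEE with exchangeable working correlation.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_units, per_unit = 20, 50
df = pd.DataFrame({
    "unit": np.repeat(np.arange(n_units), per_unit),
    "treated": np.repeat(rng.integers(0, 2, n_units), per_unit),
})
unit_effect = np.repeat(rng.normal(0, 0.4, n_units), per_unit)  # shared within unit
p = 1 / (1 + np.exp(-(-1.0 + 0.5 * df["treated"] + unit_effect)))
df["outcome"] = rng.binomial(1, np.asarray(p))

model = smf.gee("outcome ~ treated", groups="unit", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())   # standard errors account for within-unit correlation
```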

  9. Conclusions
  • Changing intervention strategies require design innovation by methodologists.
  • The design should allow a credible, accepted answer to whether change occurred, whether it resulted from the intervention, and whether it is significant.
  • Different designs are needed for different questions/interventions; some will occur naturally.
  • Funding agencies, reviewers, and journals need to accept the practitioner/scientist evaluation model.
