
Presentation Transcript


  1. Public involvement in research: assessing impact through a realist evaluation. invoNET, 21 February 2012. David Evans, Vito Laterza & Rosie Davies on behalf of the UWE/Coventry team

  2. Acknowledgements This project was funded by the National Institute for Health Research (NIHR) Health Services & Delivery Research programme (project number 10/2001/41). The views expressed are those of the authors and not necessarily those of the NHS, the NIHR or the Department of Health.

  3. UWE/Coventry team • Professor David Evans (principal investigator), UWE • Professor Jane Coad, Coventry • Dr Jane Dalrymple, UWE • Ms Rosie Davies, research partner • Ms Chris Donald, research partner • Professor Sarah Hewlett, UWE • Mr Vito Laterza, UWE • Dr Amanda Longley, UWE • Professor Pam Moule, UWE • Dr Katherine Pollard, UWE • Dr Jane Powell, UWE • Ms Ruth Sayers, research partner • Ms Cathy Rice, research partner

  4. Focus of this session • Initial thoughts on reflexivity • Background to our project, design and conceptual framework • Reflections on early learning • Theory development and testing • Objectivity and contamination • Working with research partners

  5. Reflexivity and reflective practice • “Reflexivity” – not the same as reflective practice: • Thinking about ourselves (i.e. reflective practice) and learning from that (i.e. doing things better as a result) • But also linking ourselves to our research participants (not just what we can do better, but thinking about others through our own experience) • Thinking about thinking (i.e. epistemology): reflecting on the basic structures and processes behind the taken-for-granted everyday reality we live in and study in our research contexts

  6. Background • Team based at the University of the West of England (UWE) and Coventry University • Grew out of the existing UWE Service User and Carer Involvement in Research initiative • Nine academic researchers and four research partners (service users) as co-applicants/co-researchers

  7. Design • Realist evaluation framework • 18 month project • Eight case studies • Mainly qualitative methods • Semi-structured interviews (c. 5 participants per case study x 3 interviews over one year) • Observation • Documentary analysis • Consensus workshops • Economic analysis

  8. Realist evaluation • Policy driven by an underlying theory of how an initiative is supposed to work • Role of the evaluator to compare theory and practice • “What works for whom in what circumstances and in what respects?” • Look for regularities of context, mechanism and outcome (CMO) (Pawson 2006; Pawson & Tilley 1997; 2008)

  9. Levels of public involvement in research theory • Policy level – what do DH, NIHR and other senior R&D stakeholders think are effective mechanisms leading to desired policy outcomes? • Programme/project level – what do stakeholders (e.g. PIs, research teams, research partners) think involvement contributes to their desired outcomes? • Academic level – what are the dominant academic theories in the literature about public involvement mechanisms in research and whether/how they work? • Our research team – what do we think are the context-specific and generalisable mechanisms leading to desired policy outcomes?

  10. Our CMO theory – to date

  11. Our task – articulate and test public involvement in research theory • Articulate policy-level programme theory from policy documents, actions, etc. – it’s about research quality, not empowerment • Synthesise what we know about context, mechanisms and outcomes in practice from the literature – recognising complexity and uncertainty • Express the theory in a simplified, testable form for the case studies • Collect and analyse case study data • Revise the theory and repeat the process

  12. Reflections on theory development and testing • Difficulty of categorising factors as context or mechanism • Multiple contextual factors and mechanisms making causal attribution difficult • Time period for data collection shorter than project timescales • Impact may be diffuse not specific • Outcomes may be quite limited

  13. ‘Objectivity’ and ‘contamination’ At this stage, three related reflexive effects emerged: • Leading members of the team are also experts who provide advice and training on public involvement in research to researchers and PIs (i.e. “now that you study us, can we still ask for advice?”) • The questions we are asking are triggering processes of reflection that are likely to have an impact on the ongoing processes of shaping public involvement structures and mechanisms (i.e. “Mmm that’s a good question, I haven’t thought about that” or “I will certainly consider these issues further”) • Varying responses on potential issues of overlap (or “contamination”) from the participants themselves (i.e. some do not seem to be particularly concerned, while others are more concerned about keeping our study process from influencing the ongoing process of public involvement under study)

  14. Reflections on ‘objectivity’ and ‘contamination’ Implications for questions of ‘objectivity’ in qualitative health research: • Reflexive effects need to be taken into account and productively explored as data, rather than discarded as “unwanted bias”: this is why we decided to keep reflection going, rather than adopt a unilateral policy (i.e. we will continue to give advice, and we are aware that participants might change their behaviour in response to the questions and reflections that emerge from data collection; these effects will be followed up where possible) • We will respect research participants’ wishes on the matter: if they want us to put specific measures in place to reduce any possible influence, we aim to accommodate that • We will protect research participants’ confidentiality and anonymity in all cases, and this will always take precedence over questions of reflexivity and objectivity.

  15. Involving research partners • Contributed from design stage onwards • Each research partner works with one academic researcher and Vito on two of the eight case studies • Research partners meet together as a group in addition to attending full team meetings • Involvement in all aspects of the project, including theory building, conducting interviews and analysis • Reflection on the team’s process of involvement, issues and outcomes included as data • Support from Vito and named researcher on team

  16. Reflections on involving research partners • Role development within the institution at UWE • Extending formal arrangements because of our need for research passports • Difficult to keep track of impact! • Differences in levels of research experience, life experiences, kinds of contributions, and between case study teams ... • Complex roles and relationships

  17. Contact details • David Evans, Professor in Health Services Research (Public Involvement) David9.Evans@uwe.ac.uk • Vito Laterza, Research Fellow Vito.Laterza@uwe.ac.uk • Rosemary Davies, Research Partner Rosemary3.Davies@uwe.ac.uk
