
Policy, Evaluation and Practice



Presentation Transcript


  1. Policy, Evaluation and Practice Professor Roger Ellis University of Chester University of Ulster

  2. Professor Roger Ellis OBE • Professor Roger Ellis OBE, DSc, DPhil, MSc, BA(Hons), CPsychol, AFBPsS, FHEA, TCert • Professor Emeritus in Applied Psychology, University of Ulster • Emeritus Professor of Psychology, University of Chester • Visiting Professor, University of Bedfordshire • Visiting Professor, Napier University • Visiting Professor, Buckinghamshire New University • Visiting Professor, Kent State University • Visiting Professor, Hokkaido Imperial University, Japan • Visiting Professor, NOSM, Canada • Director of SHEU International

  3. SHEU International • Social and Health Evaluation Unit (SHEU) • United Kingdom, Ireland, Canada, New Zealand, Norway, Sweden, Netherlands • Programme Evaluation • Evaluation of Programmes in • Education • Health Care • Social Care • Community Safety • Community Development • Regional Development

  4. SHEU International • You innovate : we evaluate

  5. Partnership • Partnership • Programme Evaluation • Knowledge Transfer

  6. Partnership • University : Community • University of Pecs : Baranya Council • UofP : BC; SHEU

  7. Overview • Knowledge Transfer and Programme Evaluation • Programme Evaluation : Nature and Scope • Programme Evaluation: Health Warnings • Partnership

  8. Knowledge Transfer and Programme Evaluation • Knowledge Transfer from University of Pecs to Baranya Council and vice versa • Programme Evaluation as Knowledge Transfer • Partnership : Pareto Net Gain

  9. Knowledge Transfer Partnership • KTP : Euro speak • Transfer from Higher Education to Community • Knowledge • Needs Analysis • Problem Identification • Intelligence and Innovation • Expertise for Projects • Programme Evaluation • Continuous Improvement

  10. Policy, Evaluation, and Practice • National and Regional Social Policy • Economic, Social, Educational, Health Programmes • Delivery of Programmes : Programme Practice • Impact of Programmes • Nature of Programmes • Programme Evaluation • Feedback for Practice and Policy

  11. Evaluating Evaluation: A Public Health Warning • Everyday vs Professional Evaluation • The Field of Programme Evaluation Research • Public Health Warnings for Programme Evaluation • Programme Evaluation : Baranya County Council and the University of Pecs • SHEU International

  12. Don’t Confuse Everyday and Professional Evaluation • Characteristics of Everyday Evaluation • Characteristics of Professional Evaluation • Main differences

  13. Everyday vs Professional • Everyday: places value on something/anything (thing, person, activity, etc.); based on implicit/explicit standards; based on liking/disliking, wanting/not wanting, suitable/unsuitable; intuitive; individual • Professional: places value on a programme; based on specified standards for outcomes & process; based on reliable and valid data gathering; evidence based; takes account of everyone’s views

  14. Everyday vs Professional • Everyday: subjective and internal; an opinion; partial; one perspective; basis for individual action • Professional: objective and external; a considered, explicit & transparent judgement; comprehensive; triangulated perspectives; leading to recommendations for policy, programme and practice

  15. Evaluation Model [diagram] • Policy • Social Science • Professional Evaluation • Programmes • Practice

  16. The Field of Evaluation Research • Professional Associations/Societies • Departments and Chairs • Journals • Fields of Application • Regional Development • Health • Social Services • Education • Community Safety

  17. Programme Evaluation • Programme evaluation is a systematic method for collecting, analysing and using information to answer questions about projects, policies and programmes • Did it work? • What happened? • Did it do what it said it would do? • Did it achieve its predicted outcomes? • What happened that wasn’t expected? • How did it work? • What caused what? • What did it cost? • Was it cost effective? • What did people think of it? • Would it have happened anyway? • What is it being compared with?

  18. Why Programme Evaluation? • Accountability • Feedback for Practice • Feedback for Policy • Dissemination • Justification for Funding • Publication and Publicity

  19. For Academics • DON’T DO IT !!

  20. Unless……. • You want to apply social science to the real world • You are prepared to work close to the market • You don’t mind selling your skills • You’re prepared to seek grants with about a 10% success rate • You’re prepared to be accountable for your product • You have strong social commitment • You are concerned for Regional Development Policy, Programmes and Practice

  21. Trident • What do people think of the programme? • Is the programme meeting its objectives? • How is the programme working? • Outcomes • Process • Multiple Stakeholder Perspectives

  22. Trident • Outcomes : Logic Model • Process : Reconstitutive Ethnography • Multiple Stakeholder Perspectives : Realistic Evaluation
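The Trident mapping above pairs each prong of the evaluation with a question and a method. As a purely illustrative sketch (the data structure and function names are assumptions for illustration, not part of the deck), the framework could be captured like this:

```python
# Illustrative sketch of the Trident evaluation framework as a data structure.
# The dict layout and evaluation_plan() helper are assumptions for illustration,
# not an implementation described in the presentation.

TRIDENT = {
    "Outcomes": {
        "question": "Is the programme meeting its objectives?",
        "method": "Logic Model",
    },
    "Process": {
        "question": "How is the programme working?",
        "method": "Reconstitutive Ethnography",
    },
    "Multiple Stakeholder Perspectives": {
        "question": "What do people think of the programme?",
        "method": "Realistic Evaluation",
    },
}

def evaluation_plan(programme: str) -> list[str]:
    """Return one planning line per Trident prong for a named programme."""
    return [
        f"{programme} / {prong}: {spec['question']} (via {spec['method']})"
        for prong, spec in TRIDENT.items()
    ]
```

Printing `evaluation_plan("Sharing Education Programme")` would yield three lines, one per prong, making explicit that every evaluation covers outcomes, process, and stakeholder views.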

  23. 21 Public Health Warnings

  24. Heed the Iron Law of Evaluation • “The expected value of any net impact assessment of any large scale social programme is zero.” (Rossi) • The more you evaluate a programme, the more you will find it doesn’t work. • Overly pessimistic: use the knowledge to improve • What isn’t working, what has gone wrong / what is working, how to put it right, how to improve

  25. Bring Order to Chaos • Enthusiasm, commitment and a flurry of activity • Structure for evaluation and data gathering • Trident • Outcomes • Process • Stakeholder Views • ‘Shaping the Future’ Developing Health Service Staff: unclear outcomes; badly managed process; no clear customers

  26. Beware of Going Native • Identification – Involvement • Programme delivery loses objectivity • Evaluator becomes provider • Speech and Language Therapy Evaluation

  27. Don’t Expect to be Welcomed • The unwanted guest at the table – forced on programme • Low priority • Evaluation never as interesting as the programme (nor should it be) • Bridesmaid never the bride • Partnership for Improvement (SSBC)

  28. Don’t Offer More Than You Can Afford • Infinite demand for a free good • Convinced contractors are insatiable • Watch the data gathering (money goes) • Harmonize with monitoring • Sharing Education Programme

  29. Know the Programme • Where does the programme start and finish? • Construct a Working Model of the Programme • Define the boundaries for evaluation • Specification may be incomplete • Self Harm Intervention Teenagers

  30. Be There at the Start • Formative as well as summative • Baseline and progress measures • Continuous improvement • Sharing Education Programme

  31. Be Clear on the Outcomes • Sometimes just not there – invisible • Outcomes but no performance measures • Process as outcome • Personality Disorder Networks

  32. Beware Solo Flying • Not just one person’s view • Three heads are better than one • Triangulated evidenced judgments • More expensive but vital

  33. Don’t let the Piper call the Tune • Tell me what I want to know • Confirmation not disconfirmation • Turkeys and Christmas • Say what’s right but positively and improvement oriented • Don’t just find fault: find a solution

  34. Unearth the Theory • What do providers think is happening? • What do they think causes what? • Test hypotheses from theory made explicit • Hungarian Social Work placement

  35. Watch the Time • Time scale for outcomes • Immediate, short term, medium term and long term • Shared Education Programme

  36. No Replication without Recollection • Capture the process • Recipe for replication • Avoid the vaguely significant and the specifically irrelevant (Bannister) • Reflections of practitioners – reconstitutive ethnography • Clinical Facilitator process report: a bestseller, unfortunately it was free

  37. Don’t Believe All They Promise You • The final report sweetener • Funding streams dry up • Fashion and fluctuations

  38. Beware Specialists Bearing Gifts • Evaluator and specialist • Face validity • Axes to grind

  39. Don’t be tempted to manage the programme • Evaluation Steering Group becomes a meeting place for providers • Evaluation recommendations involved in action may become management • Hungarian Placement – only place • Evaluators become providers – more ideas than the providers themselves

  40. If you offend, do it with goodwill • Rigorous but sympathetic • Positive and diplomatic • Shared understanding, improvement oriented • Evaluator as counsellor and support • Chester Community Safety Centre

  41. Look around you • 360 degree perspectives • Recipients, providers, associated providers, commissioners, funders and managers • Complementary to statistical outcomes • Phenomenology of programme • Anti-Social Behaviour Order

  42. Recommend, Recommend, Recommend • Evaluation lives on through recommendations • Policy • Programme • Practice • Recommendations to those who can implement • Village Hall Utilisation

  43. Expect Indifference But Hope for Action • Reports easily marginalised • Original purpose forgotten • “We don’t believe you” or “We knew already” • Evidenced recommendations

  44. Evaluation May Not be Research • No one cares what happens in Pecs on a Thursday • Findings should have general theoretical, empirical and methodological applicability, provided the evidence is valid and reliable • Genuine innovation • Greatest interest in approach & method rather than findings.

  45. Programme Evaluation UofP : BC • Needs Analysis • Programme Development • Programme Evaluation • Continuous Improvement • Pareto Net Gain

  46. Programme Evaluation : Baranya County Council and University of Pecs • Programme Evaluation as Knowledge Transfer • Partnership between Council and University • Project based • Programme Evaluations

  47. Programme Evaluation and University of Pecs • Applied Social Science • Internal programme evaluation • Institutional Research • Programme Evaluation in Local Community • PE in Region • National PE • International PE : Comparative Studies

  48. SHEU International • Social and Health Evaluation Unit • SHEU International : United Kingdom, Ireland, Canada, New Zealand, Norway, Sweden, Netherlands • Hungary ???

  49. Slogans • No Innovation without Evaluation! • You innovate : we evaluate • No programme too small for evaluation

  50. Working with SHEU • Free consultation on possible Programme Evaluation • Programme Evaluation contracts • Proven Trident Method • Wide range of Programme Applicability • Evaluation support • Evaluation training
