
Our experience in monitoring and evaluating drug abuse prevention

Our experience in monitoring and evaluating drug abuse prevention. Giovanna Campello UNODC Prevention Treatment & Rehabilitation Unit CICAD VII Meeting of the Expert Group on Demand Reduction, 13-15 September 2005, Ottawa, Canada.





Presentation Transcript


  1. Our experience in monitoring and evaluating drug abuse prevention

  2. Giovanna Campello UNODC Prevention Treatment & Rehabilitation Unit CICAD VII Meeting of the Expert Group on Demand Reduction, 13-15 September 2005, Ottawa, Canada

  3. UNODC has carried out two kinds of work with regard to monitoring and evaluation (“M&E”) of drug abuse prevention: 1 -- Assessing the progress of Member States (“MS”) in meeting the commitments they made in the Political Declaration of 1998 (including on drug abuse prevention, treatment and rehabilitation). 2 -- Identifying and disseminating good practices in monitoring and evaluating drug abuse prevention activities and programmes implemented by youth- and community-based organisations.

  4. 1 -- Assessing Member States’ Drug Abuse Prevention Programmes and Activities

  5. UN Member States report on prevention activities through the Biennial Reports Questionnaire (BRQ) • With regard to prevention, the Questionnaire asks: • Whether MS have implemented drug abuse prevention activities in different settings (yes/no) • If yes, whether the coverage of the activities is low/medium/high • Whether the activities are gender-sensitive (yes/no) • Whether they have been evaluated (yes/no)

  6. Limitations of Questionnaires • Provide the perception of Member States • Provide limited information • Only on implementation, not on impact (the Questionnaire only asks whether the activities have been evaluated; it does not ask about the results of the evaluation) • Yes/no and low/medium/high kinds of answers

  7. Still, the Questionnaire provides some useful indications, for example about the evaluation of prevention activities

  8. How does the UNODC Questionnaire relate to other existing regional instruments for measuring the extent of prevention activities? • The Questionnaire is to be reviewed in October/November 2005 in Vienna • CICAD and EMCDDA will be represented • The review will also consider how the monitoring work can continue after 2008

  9. 2 -- Monitoring and evaluation of drug abuse prevention by youth- and community-based organisations

  10. How we identify good practice • A review of the (academic) literature identifies principles and issues • Principles/issues are discussed and enriched in meetings including youth/prevention workers and youth from all regions • Results are also circulated and discussed with focal points in national and international agencies • Next publication: MONITORING & EVALUATION! • Next piece of work: Prevention of Amphetamine-Type Stimulants

  11. Our Publications • All available on our website! www.unodc.org/youthnet

  12. Monitoring & Evaluation Definitions  Note: These are the definitions we find useful; we are aware that there are grey areas and that terminology is used differently. • Monitoring is about the implementation of activities. It takes place during and feeds into implementation. • Evaluation is about the impact of activities. It takes place ‘after’ implementation and assesses changes in the situation of the target group, including, but not limited to, what was done (implementation).

  13. What (should be evaluated)? • Preventing use?  Assessing impact in terms of drug abuse prevention might be counterproductive • The activities of most organisations are too limited in the number of risk/protective factors they address, in coverage, in intensity and in duration • To be valid, the kind of statistical analysis required is complex and/or requires too large a sample • Change in protective factors?  Assessing impact in terms of whether the risk/protective factor situation has changed (on the basis of evidence of its link to drug abuse prevention) is a more feasible alternative
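The "too large a sample" point on slide 13 can be made concrete with a standard sample-size calculation for comparing two proportions. This is a minimal sketch; the prevalence figures and the change to be detected are invented for illustration, not UNODC data.

```python
from math import ceil

def sample_size_two_proportions(p1, p2, z_alpha=1.96, z_beta=0.8416):
    """Approximate sample size per group to detect a change from
    proportion p1 to p2, using the normal approximation at a 5%
    significance level (two-sided) and 80% power."""
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Hypothetical example: past-year substance use among youth falls
# from 10% to 8% -- a plausible effect for a small programme.
n = sample_size_two_proportions(0.10, 0.08)
print(n)  # several thousand youth per group
```

Even a modest two-percentage-point drop requires surveying thousands of youth in each group, which is exactly why small organisations are advised to measure change in risk/protective factors instead.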

  14. Example of a small youth group with the (long-term) goal of decreasing the number of youth starting to use substances in their community • IDENTIFIED RISK FACTOR 1 -- Poor communication between parents and youth • (IMMEDIATE) OBJECTIVE 1 -- By the end of our project, communication between parents and youth in our community will have improved. • INDICATORS OF ACHIEVEMENT OF OBJECTIVE 1 -- Number of meals taken together by families has increased -- Youth report better communication with their parents, including on drug abuse issues • ACTIVITIES PLANNED IN ORDER TO ACHIEVE OBJECTIVE 1 -- Parenting skills sessions after school once a week for two months -- Free family meals once a week -- Family picnics once a month

  15. Example (continued) • IDENTIFIED RISK FACTOR 2 -- Youth have too much time on their hands with not much to do • (IMMEDIATE) OBJECTIVE 2 -- By the end of our project, the youth of our community will be more involved in constructive activities in their free time • INDICATORS OF ACHIEVEMENT OF OBJECTIVE 2 -- No. of youth involved in a constructive activity at least twice a week in their free time has increased -- No. of youth spending their time chatting in the street has diminished • ACTIVITIES PLANNED IN ORDER TO ACHIEVE OBJECTIVE 2 -- Organise sports training including a health promotion component & participate in competitions -- Assist youth in organising or finding other activities including a health promotion component
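The chain in slides 14-15 (risk factor → objective → indicators → activities) can be written down as a simple data structure when drawing up a monitoring plan. The sketch below uses plain Python dictionaries; the field names are illustrative assumptions, not a UNODC schema.

```python
# A minimal sketch of the slides' logic model as plain Python data.
# Field names ("risk_factor", "objective", ...) are illustrative only.
logic_model = [
    {
        "risk_factor": "Poor communication between parents and youth",
        "objective": "Communication between parents and youth improves",
        "indicators": [
            "Number of meals taken together by families has increased",
            "Youth report better communication with their parents",
        ],
        "activities": [
            "Weekly parenting skills sessions for two months",
            "Weekly free family meals",
            "Monthly family picnics",
        ],
    },
    {
        "risk_factor": "Youth have too much time with not much to do",
        "objective": "Youth more involved in constructive free-time activities",
        "indicators": [
            "No. of youth in a constructive activity twice a week increased",
            "No. of youth chatting in the street diminished",
        ],
        "activities": [
            "Sports training with a health promotion component",
            "Help youth organise or find other constructive activities",
        ],
    },
]

# Sanity check: every objective has at least one measurable indicator,
# which is the core requirement the slides illustrate.
for entry in logic_model:
    assert entry["indicators"], entry["objective"]
```

Writing the plan in this form makes the gap obvious when an objective has activities but no indicator, before any data collection starts.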

  16. How? A couple of basic principles • (At least) collect baseline data, or collect data as time goes by, to show how the situation changes • Use a variety of methods to collect your information in order to validate it (triangulation) • To evaluate, you also need good monitoring. How can you say that what you did is effective if you do not know what you did in the first place?
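The baseline principle on slide 16 comes down to simple arithmetic: without a value measured before the activities start, there is nothing to compare the follow-up measurement against. A toy calculation, with invented numbers:

```python
def percent_change(baseline, follow_up):
    """Relative change in an indicator between baseline and follow-up,
    as a percentage of the baseline value."""
    return 100.0 * (follow_up - baseline) / baseline

# Hypothetical indicator from the earlier example: number of families
# reporting shared meals at least three times a week.
baseline_count = 40   # measured before the activities begin
follow_up_count = 58  # measured after implementation
print(round(percent_change(baseline_count, follow_up_count), 1))  # 45.0
```

The follow-up figure of 58 on its own says nothing; only against the baseline of 40 does it become a 45% improvement, and triangulating it with interviews or group discussions strengthens the claim further.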

  17. How? The methods • Surveys through (self-administered) questionnaires • Not easy! Especially getting the sampling right and creating a simple but effective questionnaire • Labour-intensive! Testing the questionnaire, ensuring anonymity and confidentiality, analysing the replies • Provide numbers, which people (and donors) like so much!

  18. How? The methods • Key informant interviews • Provide a series of very specific points of view (‘biased’ information) • Can give very useful insights, if the information is triangulated rigorously • Group discussions (including focus group discussions; visual techniques, e.g. mapping; drama-based techniques, e.g. role playing) • Quickly provide the point of view of a group of similar people. Extrapolation is not easy, but they still yield VERY useful insights • Need experienced facilitation and a setting that engenders trust (e.g. not in a place where adults can listen to what the youth are saying)

  19. Who (should be involved)? • Staff, (young) volunteers and youth participants • To maximise the relevance of the evaluation to the organisation, they can and should be involved in the planning, analysis and reporting. However, they will need support and/or training. • Important stakeholders (administrators in schools and in the community, health and social workers, religious leaders, donors, etc.) • Not everyone needs to be involved in everything, but stakeholders should be kept informed at crucial points so that they can facilitate the evaluation (permission to access information/youth/stakeholders; statistical advice; etc.) • External evaluator • Evaluators lend credibility to results, but are expensive and need follow-up. Hiring an evaluator should be a conscious ‘investment’ decision on the part of an organisation that wants to undertake a more complex evaluation (more for advocacy than for learning?)

  20. Your decision will depend on why you are evaluating! • Because your donor told you to? • Many decisions will have been taken for you. • To improve your programme? • An organisation-wide reflection on which activities were implemented, the feedback of participants and some indication of impact in terms of risk and protective factors will be very useful. • To advocate among donors and the community? • The results of a self-evaluation (see above), including simple data, a few interviews and focus group discussions, can go a long way! • To show that your programme has a drug abuse prevention effect? • Your programme might have run long enough, with enough coverage and intensity, that you might think: yes, this is the time to invest time and money to show that we are preventing drug abuse! You will need a good external evaluator and possibly a control group.
