
MODEL ACADEMIC CURRICULUM MODULE 13




Presentation Transcript


  1. MODEL ACADEMIC CURRICULUM MODULE 13 Assessing and Evaluating Responses

  2. Module 13 Components • Assessment and Evaluation • Conducting Community Surveys

  3. Introduction The purpose of assessing a problem-solving effort is to help you make better decisions by answering two specific questions: • Did the problem decline? • If so, did the response cause the decline?

  4. Assessment and Evaluation

  5. Reviewing the SARA Model • SCANNING • ANALYSIS • RESPONSE • ASSESSMENT

  6. The Role of Evaluation in Problem-Solving

  7. Types of Evaluations • Process Evaluations • Impact Evaluations

  8. Interpretation of Process and Impact Evaluations

  9. Conducting Process Evaluations • Observe response implementation • Interview relevant parties • Conduct focus groups • Conduct surveys

  10. Conducting Process Evaluations • Record decisions and progress in meeting minutes • Develop a timeline of important achievements • Document everything!!

  11. Process Evaluation Timeline

  12. Problem-Solving Project - Process Evaluation California Highway Patrol Problem - Traffic Collisions • Environmental surveys - Personal inspection of 20 miles of roadway. • Key recommendation - 24-hour headlights-on policy.

  13. Conducting Impact Evaluations Measures • Quantitative Measures • Qualitative Measures • Measurement Validity • Selecting Valid Measures

  14. Criteria for Claiming Causality • There Is a Plausible Explanation of How the Response Reduces the Problem • The Response and the Level of the Problem Are Related • The Response Occurs Before the Problem Declines • There Are No Plausible Alternative Explanations

  15. Types of Evaluation Designs

  16. Pre-Post Designs

  17. Pre-Post Design with a Control Group
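The logic of a pre-post design with a control group can be sketched as a difference-in-differences calculation: compare the change in the target area with the change in a similar area that received no response. All counts below are hypothetical.

```python
# Minimal sketch of a pre-post design with a control group
# (difference-in-differences); all incident counts are invented.

treatment_pre, treatment_post = 120, 80   # target area, before/after response
control_pre, control_post = 100, 95       # comparison area, no response

# Change within each group
treatment_change = treatment_post - treatment_pre   # -40
control_change = control_post - control_pre         # -5

# The decline beyond what the control group experienced is the
# part of the change attributable to the response.
effect = treatment_change - control_change
print(f"Treatment change: {treatment_change}")
print(f"Control change:   {control_change}")
print(f"Estimated effect of response: {effect}")    # -35 incidents
```

The control group absorbs trends that would have happened anyway (seasonality, city-wide declines), which is why its change is subtracted out.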

  18. Interrupted Time Series Designs

  19. Multiple Time Series Design

  20. Practical Limitations to Interrupted Time Series Designs • Measurement is expensive or difficult. • Data are unavailable for many periods before the response. • Decision-makers cannot wait for sufficient time to elapse after the response. • Data recording practices have changed, making inter-period comparisons invalid. • Problem events are rare for short time intervals, forcing you to use fewer, longer intervals.

  21. Combining and Selecting Designs

  22. Spatial Displacement of Crime or Disorder, and Spatial Diffusion of Crime Prevention Benefits

  23. Problem-Solving Evaluation Checklist • Early Considerations • Process Evaluation • Impact Evaluation • Evaluation Conclusions • Overall Impact Evaluation Conclusions

  24. I. Early Considerations • What will the evaluation help you decide? • Do you know the problem? • Do you know how the response works?

  25. II. Process Evaluation • Did you implement the response? • Did you implement enough of the response?

  26. III. Impact Evaluation • Do you need a control group? • How often can you measure the problem? • What type of evaluation design should you use? • What type of control group do you need?

  27. IV. Evaluation Conclusions • What are your findings from the process evaluation? • What are your findings from the impact evaluation?

  28. V. Overall Impact Evaluation Conclusions • Did the problem decline after the response? • If the problem did decline, did it do so at a faster rate after the response than before the response? • If the problem did decline, can you rule out all other plausible explanations for the decline, other than the response? Use your list of differences between the response and control groups to help answer this question.
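The checklist's second question, whether the problem declined at a faster rate after the response than before, can be answered numerically by fitting a trend line to each period and comparing slopes. A minimal sketch, with hypothetical monthly counts:

```python
# Minimal sketch: compare the rate of decline before and after a
# response using least-squares slopes (monthly counts are invented).

def slope(values):
    """Least-squares slope of values against time periods 0..n-1."""
    n = len(values)
    mean_t = (n - 1) / 2
    mean_v = sum(values) / n
    num = sum((t - mean_t) * (v - mean_v) for t, v in enumerate(values))
    den = sum((t - mean_t) ** 2 for t in range(n))
    return num / den

before = [100, 98, 97, 95, 94, 92]   # slow pre-existing decline
after = [90, 82, 75, 68, 61, 55]     # steeper decline after response

print(f"Decline per month before: {slope(before):.2f}")
print(f"Decline per month after:  {slope(after):.2f}")
if slope(after) < slope(before):
    print("The problem declined faster after the response.")
```

A steeper post-response slope is necessary but not sufficient: the checklist still requires ruling out other plausible explanations, which is where the control-group comparison comes in.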

  29. Conducting Community Surveys

  30. General Use of Surveys CJ Researchers • to better understand crime and public fear of crime Social Scientists and Political Pollsters • to learn about social relations and predict future events Government Agencies • to predict economic trends and how people react to policy Police • to measure public opinion and operational effectiveness

  31. Specific Uses for Community-based Surveys • Gather information about public attitudes regarding police or crime • Detect and analyze problems in neighborhoods • Evaluate problem solving efforts and other interventions by taking baseline measures • Control crime and fear of crime

  32. Survey Process Basic Sampling Designs • Simple random sampling – ensures that everyone in the population has a known chance of being included in the sample, minimizing sampling bias • Nonrandom sampling – sometimes the best option available given time and other limits • How many will be sampled? – Sample size, sampling error
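The two sampling ideas on this slide can be illustrated in a few lines: draw a simple random sample, then estimate the sampling error (margin of error) for a proportion at a given sample size. The population list and counts here are stand-ins, not from the curriculum.

```python
# Minimal sketch of simple random sampling and sampling error,
# assuming a 95% confidence level (z = 1.96); data are invented.
import math
import random

population = list(range(10_000))          # stand-in for a resident roster
sample = random.sample(population, 400)   # simple random sample: each
                                          # resident has an equal chance

def margin_of_error(p, n, z=1.96):
    """Normal-approximation margin of error for a proportion p, sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) for n = 400: about +/- 4.9 percentage points
print(f"n=400: +/- {margin_of_error(0.5, 400):.3f}")
```

This is why community surveys of a few hundred respondents can still be informative: the margin of error shrinks with the square root of the sample size, not the population size.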

  33. Methods for Contacting Respondents • Mail surveys • Telephone surveys • In-person interviews • Internet-based surveys • Officer/Deputy or volunteer delivery • Surveys are filled out at a designated location

  34. Asking Questions • General considerations • How to ask questions • Open ended vs. close ended questions • Designing the questionnaire

  35. Analysis of the Data • Characteristics of the sample • Does the sample represent the population? • Making inferences about the population • Estimating relationships • Significance testing
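Significance testing for survey data often amounts to comparing a proportion across two survey waves. A minimal sketch using a two-proportion z-test (the counts below are hypothetical, not from the curriculum):

```python
# Minimal sketch of a significance test across two survey waves:
# did the share of residents reporting fear of crime drop after the
# response? Two-proportion z-test with invented counts.
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2, using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical: 180/400 fearful before the response, 140/400 after
z = two_proportion_z(180, 400, 140, 400)
print(f"z = {z:.2f}")    # |z| > 1.96 -> significant at the 5% level
```

Here z comes out near 2.9, so the drop would be statistically significant at the 5% level, though, as the impact-evaluation checklist stresses, significance alone does not establish that the response caused the change.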
