
Evaluating CCTV and Recording of Child Interviews and Testimony


Presentation Transcript


  1. Evaluating CCTV and Recording of Child Interviews and Testimony. M. Christine Kenty, PhD, and Sharon Elstein, MS, ABA Center on Children and the Law

  2. KENTY'S CLUES FOR EVALUATION • #1. Base any evaluation on your own goal pathway and logic model. • #2. Organizations are social systems with their own cultures. • #3. Whatever people don't want you to study, that's the really important thing.

  3. And those touchy things are likely to be: • Collaboration • Quality of Forensic Interviewing • Decision-making about arrest, prosecution and child protection • Relating to victims and families • Children’s experiences before and in the grand jury or courtroom

  4. Kenty’s Clues • #4. It's trouble if only one person is doing all the thinking about evaluation. • #5. You can't keep partners and stakeholders too well informed about the evaluation process.

  5. Kenty’s Clues • #6. Any evaluation (or program report) requires DATA, so decide early what you need to record and then keep it up. • #7. For any evaluation, there are many good designs, but no perfect ones. • #8. Don’t kill the messenger if you don't like the news!

  6. Evaluation… is a way to improve a program by systematically examining and analyzing what the program is doing and what it has accomplished.

  7. The “I’ll know it when I see it” Rule: I’ll know that our program is working when I see…

  8. What evidence can convince us and others that our program is on target? What would tell us that something has happened? How can we count it or track it?

  9. What evaluation can do: • Help improve the program from the beginning • Provide staff and stakeholders with a much-needed sense of accomplishment • Guide protocol, policy and law reform • Assist in developing future funding

  10. Don’t put the evaluation in the hands of just one individual, whether that is an internal or external evaluator.

  11. A healthy organization needs to know the program mission, plan the work, develop enthusiasm, and bring things to fruition.

  12. An organization also needs to look at what it’s doing, keep what's good and try to jettison what isn't working.

  13. Evaluation isn't a separate topic… it's just one more piece of the work.

  14. Forces will try to marginalize and minimize an evaluation. Don’t let that happen – establish a strong committee!

  15. Base the evaluation on your agency pathway. “Logic model” and “pathway map” are popular phrases with funders.

  16. Logic Model or Pathway Map: Each part should logically follow from the last.
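
A pathway map can be written down as a simple ordered structure, which makes the “each part follows from the last” check explicit. Here is a minimal sketch in Python; the stage names are standard logic-model stages, and the entries are illustrative examples drawn from the outcome slides later in this presentation, not a required format:

```python
# A pathway map as an ordered structure: each stage should lead
# logically into the next. The entries are illustrative examples
# taken from the outcome slides later in this presentation.
pathway = {
    "inputs":      ["funding", "recording/CCTV equipment", "trained staff"],
    "activities":  ["install equipment", "train interviewers", "record interviews"],
    "outputs":     ["# of interviews recorded", "# of professionals trained"],
    "short_term":  ["stakeholders aware and committed", "equipment installed"],
    "medium_term": ["equipment used regularly", "baseline data collected"],
    "long_term":   ["improved disposition rates", "laws and policies changed"],
}

# Walking the stages in order is the logic check: for each stage,
# can you say how it produces the next one?
for stage, items in pathway.items():
    print(f"{stage}: {', '.join(items)}")
```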

  17. We all work on underlying assumptions, which might be: • Better technology will improve dispositions • Technology will make things easier for kids – fewer interviews, less testifying… • Our forensic interviewing will stand up to scrutiny • The defense bar will not limit the potential of recorded testimony • We will know if this is working well. TALK ABOUT AND CLARIFY THESE!

  18. Logic Model or Pathway Map

  19. You may do a needs assessment to describe your context. A needs assessment is a systematic way to: • discover what you “need” in order to accomplish a goal, and then • make decisions based on that assessment

  20. Typical needs assessments • Estimate how many clients/professionals will participate in a new program • Determine what resources are already in place and what has to be put into place • Decide what an agency or community needs to provide to get a particular result • Envision how technology and products will be used so that the equipment will be right • Decide what training people need

  21. You may have already done one kind of needs assessment, but you may still want to do another piece as you begin to implement your program.

  22. 1. Estimate how many clients/professionals will be involved • Count # of allegations, investigations, interviews, arrests, prosecutions, hearings, dispositions in the last year • List all the professionals who will need to be trained or familiarized
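
If case events are already being logged, even in a spreadsheet exported to CSV, the counts above can be tallied mechanically rather than by hand. A minimal Python sketch, assuming a hypothetical cases.csv with an event_type column; the file name and column are invented for illustration:

```python
# Tally last year's case events from a simple log. The file
# "cases.csv" and its "event_type" column are hypothetical;
# adapt the names to however your agency keeps records.
import csv
from collections import Counter

counts = Counter()
with open("cases.csv", newline="") as f:
    for row in csv.DictReader(f):
        # event_type might be: allegation, investigation, interview,
        # arrest, prosecution, hearing, disposition
        counts[row["event_type"]] += 1

for event, n in counts.most_common():
    print(f"{event}: {n}")
```

The same tally, run on the same log each year, also yields the baseline for later comparison that the medium-term results slide calls for.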

  23. 2. Determine what resources are in place and needed • All professionals and what they do • Adequacy of infrastructure: rooms, technology, wiring, lighting • Relevant state statutes re victims and CCTV and/or recorded testimony • Policies and procedures for interagency work • Other available funding

  24. 3. Decide what is needed for a specific result (these are suggestions, not requirements) • What equipment, facilities, personnel, time, training, systems, policies, or statutes do we need, for example: • To videotape all child interviews for children aged 3-13? • To decrease the number of child victims who testify in person? • To improve successful disposition rates? • To make the system more child-friendly?

  25. 4. Envision how equipment/product will be used • Stationary or portable • Professional technician or multiple users • Video all children or selected cases or ages • CCTV according to statutes • Show videos in what rooms to what audiences • Who needs a cut-off switch • Maintenance, upkeep costs, back-ups

  26. 5. Determine what training people will need • Technical skill in equipment use and maintenance • Scheduling and informing children/caregivers • Interviewing skills • Judiciary, Prosecutors and Bar • Permissibility/use of recording and CCTV according to all statutes • Awareness of capacity • Forensic use of recordings at multiple points • Quality assessment and record-keeping for recorders and prosecutors

  27. Design methods of data collection

  30. Logic Model or Pathway Map

  34. Ideas for short-term outcomes • Plans are complete; we know who contributed and what went into planning • Stakeholders are aware and committed • Equipment is acquired and installed

  35. Logic Model or Pathway Map

  36. Ideas for medium-term results • Equipment being used regularly • People skilled and knowledgeable • Stakeholders have assessed use • Data being collected on interviewing and on use of recordings and CCTV, and can be summed • Data establishing a baseline for later comparison

  37. Logic Model or Pathway Map

  38. Ideas for long-term results (impact) • Quality of forensic interviewing and recording is consistently high • Prosecutors use recording and CCTV regularly and effectively in known ways • Prosecution rates and/or case dispositions have improved • Stakeholders, clients and families are satisfied with the process and use of recordings • Laws and policies have been changed

  39. TYPES OF EVALUATIONS: NEEDS ASSESSMENT. A systematic way to discover what you need in order to accomplish a goal; it helps you make informed planning decisions.

  40. TYPES OF EVALUATIONS: FORMATIVE EVALUATION. Short-term, initial feedback on how the program is working; it helps you quickly readjust planned activities to be more effective.

  41. TYPES OF EVALUATIONS: PROCESS EVALUATION. Describes how something happened rather than what outcomes occurred; used to understand the internal dynamics of organizations and relationships and to capture what activities are actually happening.

  42. TYPES OF EVALUATIONS: IMPACT or OUTCOME EVALUATION. Determines whether a program produced the desired results; requires articulated outcomes and targets, and they must be measurable.

  43. RESEARCH METHODS • QUANTITATIVE METHODS • QUALITATIVE METHODS

  44. RESEARCH METHODS • QUANTITATIVE METHODS: numerical research that collects data about pre-selected variables and studies cause and effect • QUALITATIVE METHODS: naturalistic research that studies participants' perceptions and experiences in context and the way they make sense of them

  45. QUANTITATIVE METHODS • surveys with pre-determined categories and rating scales • document review – e.g. counting up numbers of arrests or prosecutions and comparing them to other groups or time periods • the evaluator attempts to keep an objective distance from the people
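
Document review of this kind often comes down to comparing counts or rates across groups or time periods, for example prosecutions before and after the equipment went in. A minimal sketch of the arithmetic, with invented placeholder figures:

```python
# Compare prosecution rates before and after program start.
# All figures below are invented placeholders, not real data.
before = {"prosecutions": 42, "cases": 120}  # year before the program
after = {"prosecutions": 61, "cases": 130}   # first program year

rate_before = before["prosecutions"] / before["cases"]
rate_after = after["prosecutions"] / after["cases"]

print(f"before: {rate_before:.1%}  after: {rate_after:.1%}")
print(f"change: {rate_after - rate_before:+.1%}")
# A raw before/after difference is only a starting point: whether it
# means anything depends on caseload, case mix, and whatever else
# changed over the same period.
```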

  46. QUALITATIVE METHODS • surveys with open-ended questions • semi-structured interviews • observation • document review – e.g. process, attitudes • case studies • focus groups • the evaluator gets close to the people to capture what is actually happening
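
Qualitative material still gets organized before it is reported, commonly by tagging interview or focus-group notes with theme codes and seeing which themes recur. A minimal sketch; the notes and theme codes are invented for illustration:

```python
# Tag open-ended notes with theme codes, then see which themes recur.
# The notes and theme codes below are invented for illustration.
from collections import Counter

coded_notes = [
    ("Scheduling the CCTV room is confusing", {"logistics"}),
    ("The child seemed calmer testifying remotely", {"child_experience"}),
    ("Defense objected to admitting the recording", {"legal_barriers"}),
    ("We need more training on the equipment", {"training", "logistics"}),
]

theme_counts = Counter(theme for _, themes in coded_notes for theme in themes)
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n}")
```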

  47. Collect both quantitative and qualitative data – they are equally valid. Qualitative data help capture changes in processes and relationships, and some things just aren’t countable.

  48. Sampling: how many records to review or which people to ask. It is more important to have a representative sample than a large sample.
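
One common way to get a representative rather than merely large sample is stratified sampling: draw from each group (prosecutors, interviewers, and so on) in proportion to its size, so no group is missed. A minimal sketch using only the Python standard library; the group names and sizes are invented:

```python
# Proportional stratified sampling: draw from each group (stratum)
# in proportion to its size, so every group is represented rather
# than just whoever was easiest to reach. Names and sizes invented.
import random

strata = {
    "prosecutors": ["P1", "P2", "P3", "P4"],
    "interviewers": ["I1", "I2", "I3", "I4", "I5", "I6"],
    "law_enforcement": [f"L{i}" for i in range(1, 11)],
}

sample_fraction = 0.5
sample = []
for group, members in strata.items():
    k = max(1, round(len(members) * sample_fraction))  # at least one per group
    sample.extend(random.sample(members, k))

print(sample)
```

As the next slide notes, when every professional needs to feel heard, you may need to hear from everyone and skip sampling entirely.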

  49. But every professional might need to be heard, so that no one feels left out and there is no suspicion of bias; in that case, sampling may not be acceptable.

  50. Shaping evaluation questions
