
Data Collection Methodologies


Presentation Transcript


  1. Data Collection Methodologies Module 7

  2. Overview • Definition of data collection • Data sources • Characteristics of data collection strategy • Questionnaires, focus groups, interviews: pros and cons

  3. Data Collection • Observable, measurable units of analysis • Relevant to the purpose of the evaluation • Specific to the unit of analysis • Appropriate procedures and tools • Collected, recorded, and analyzed to draw conclusions

  4. Data Sources • People: program managers, administrators, beneficiaries, donors, program staff, government officials, etc. • Documents: program strategy, annual plans, progress reports, financial reports • Infrastructure observation: buildings and grounds, laboratories, program sites • Observation of group dynamics: nature of meetings, Board meetings, etc.

  5. Tools for Data Collection • Questionnaires • Interviews • Focus groups • Observation • Performance tests

  6. Experimentation • To ensure optimal validity of the data collected, an ideal model would be the experimentation approach. • Given resource constraints in many evaluations, experimentation is not always feasible or accepted • More sophisticated techniques exist to ensure validity and reliability of data

  7. Data Collection Strategy • Consider the following: • Size, geographical dispersion, literacy level of target population • Resources available for data collection (time, people, money, technology) • Sensitivity of issues to evaluate • Reliability and validity sought • Needs of the evaluation • Developing an adequate strategy is key!

  8. Questionnaires and Surveys • Printed or electronic list of questions • Distributed to a predetermined group • Completed and returned to evaluator

  9. When to use a questionnaire • Target population is literate, and large (over 200) or dispersed • Need for categorical data • Need for quantitative data and statistical analysis • You have access to people who can process and analyze this type of data • Need to examine responses by sub-groups
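
  Where the analysis calls for quantitative summaries by sub-group, responses are typically coded and tabulated once the questionnaires are returned. The following is a minimal sketch in Python, assuming responses have already been entered (the field names and values are illustrative only):

    from collections import defaultdict
    from statistics import mean

    # Each completed questionnaire entered as one record (hypothetical fields)
    responses = [
        {"region": "North", "satisfaction": 4},
        {"region": "North", "satisfaction": 5},
        {"region": "South", "satisfaction": 3},
        {"region": "South", "satisfaction": 2},
    ]

    # Group responses by sub-group and summarize each group
    by_region = defaultdict(list)
    for r in responses:
        by_region[r["region"]].append(r["satisfaction"])

    for region, scores in sorted(by_region.items()):
        print(f"{region}: n={len(scores)}, mean={mean(scores):.2f}")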

  10. 6 Steps To Effective Questionnaires 1. Develop the purpose 2. Draft the items 3. Sequence the items 4. Design the questionnaire 5. Pilot the questionnaire 6. Develop a strategy for data collection and analysis

  11. 1. Develop the Purpose • Specific to the information you need • Add value • Relate to the evaluation • Refined to relevant questions only

  12. 2. Draft the Items Formulate into the following item types: • Fill in the blank • Question requiring one word answer • Multiple Choice • Question with 4-8 defined answers • Comment-On • Open ended question, written response

  13. 2. Draft the Items (cont'd) • List • Requesting a list of 3 to 5 short responses • Can be in order of importance • Likert Scales • Statement to be rated for level of agreement • Rank • List of items to be ranked by importance
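
  Likert items are normally pre-coded as numbers so they can be summarized statistically later. A minimal sketch of such coding, assuming a standard five-point agreement scale (the labels and answers are illustrative only):

    # Map each agreement label to a numeric code (hypothetical scale)
    LIKERT = {
        "Strongly disagree": 1,
        "Disagree": 2,
        "Neutral": 3,
        "Agree": 4,
        "Strongly agree": 5,
    }

    answers = ["Agree", "Strongly agree", "Neutral", "Agree"]
    codes = [LIKERT[a] for a in answers]
    print(sum(codes) / len(codes))  # mean agreement score: 4.0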

  14. 3. Sequence the Items • Group into themes, or by question type • Rewrite questions, eliminate redundancies • Where necessary, use filter questions • guide respondents to different branches • Sequence with logical progression
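
  A filter question routes each respondent to the branch that applies to them. The sketch below is a minimal, hypothetical example of such skip logic; the questions themselves are illustrative only:

    # A filter question decides which branch of the questionnaire follows
    def administer():
        used_service = input("Did you use the service in the last year? (y/n) ")
        if used_service.strip().lower().startswith("y"):
            # Branch for users: experience questions
            input("How satisfied were you with the service? (1-5) ")
        else:
            # Branch for non-users: skip to barrier questions
            input("What prevented you from using the service? ")

    administer()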

  15. 4. Design the Questionnaire • Group similar themes together • Organize into titled sections • Format into booklet • Use space effectively • Decide on the need for pre-coding • Include a title and introductory explanation

  16. 5. Pilot the Questionnaire • Test questionnaire with small group • Clarify wording • Identify ambiguity • Revise • Retest if necessary

  17. 6. Develop the Strategy • Select the sample • Determine distribution plan • Develop a cover letter • Prepare distribution package, and send • Monitor responses, follow-up • Enter data • Analyze data
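
  Monitoring responses and following up is easier when returns are checked against the distribution list. A minimal sketch, with hypothetical names:

    # Compare returned questionnaires against the distribution list
    distributed = {"A. Diallo", "B. Chen", "C. Okafor", "D. Rossi"}
    returned = {"A. Diallo", "C. Okafor"}

    response_rate = len(returned) / len(distributed)
    pending = distributed - returned

    print(f"Response rate: {response_rate:.0%}")  # 50%
    print("Follow up with:", sorted(pending))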

  18. Effective Questionnaires • Gather valid information • Provide logical and organized data • Facilitate process of data entry and analysis • Simple, timely and cost efficient

  19. Interviews • Dynamic process of gathering information • Personal interaction between two individuals • Follow a guide or protocol • Require recording of responses

  20. When to use interviews • Need to have the views of key informants • Target population is small • Need for depth of information rather than breadth • You have reason to believe that people will not respond to a questionnaire • Informants are willing and available to meet

  21. 7 Steps to an Effective Interview 1. Define interview purpose 2. Draft interview questions 3. Sequence the questions 4. Consider process needs 5. Prepare introduction and closing 6. Prepare to record responses 7. Pilot-test the interview

  22. 1. Define the Interview Purpose • Define objectives for the interview • Link objectives to the evaluation • Identify the data needed • State the purpose for each part of interview • Focus on content rather than process

  23. 2. Draft Interview Questions • Carefully drafted and worded • Minimize ambiguity • Combine open-ended with close-ended

  24. 3. Sequence the Questions • Vary question types • Open-ended (What do you think about…) • Close-ended (Did the program begin in 1998?) • Organize questions into themes or sections

  25. 4. Consider Process Needs • To manage the interview process, use: • transitions to move to each theme • reminders to paraphrase or summarize • lists of standard probes

  26. 5. Prepare Introduction and Closing • Introduction to set tone and rapport • state purpose of interview • explain who the interviewer is • confirm confidentiality and use of data • Closing to signify the end of interview • reinforce usefulness of interview • thank respondent • confirm arrangements for follow-up

  27. 6. Prepare to Record Responses • For key informant interviews: • tape record (with permission) • take brief notes in a notebook • expand notes in detail immediately after the interview • For normative interviews: • check boxes • space for efficient data recording

  28. 7. Pilot-test Interview • Validate the interview: • protocol • content • flow • arrangement to record responses • If more than one interviewer, validate: • ability to use protocol

  29. Sample Protocol - Program Evaluation Interview • Introduction: “We have been asked by … to conduct a … We would like to discuss the …” • Theme 1 - Program relevance • Theme 2 - Program success • Theme 3 - Program cost effectiveness • Conclusion: “In conclusion, what are the main strengths and weaknesses of this program?”

  30. Effective Interviews Are: • Planned sufficiently • clear objectives • Focused on the process • control over context and process • two-way communication • awareness of non-verbal communication • Conducive to establishing trust • Dependent on a cooperative interviewee

  31. Effective Interviewing Skills Include: • Active Listening • relaxed, eye contact, verbal feedback • Openness and Empathy • accepting what is said without judgement • Paraphrasing/Summarizing • rewording to clarify response • Controlling the Process • setting context, shaping responses

  32. Focus Groups • Group discussions of a predetermined issue • Members share common characteristics • Moderated by facilitator • Responses are recorded

  33. When to Conduct Focus Groups • You need rich description to understand client needs • Group synergy is necessary to uncover underlying feelings • You have access to a skilled facilitator • Informants are willing to speak in groups

  34. 5 Steps to Effective Focus Groups 1. Define the purpose 2. Select an appropriate sample 3. Determine date and location 4. Develop Focus Group Guide 5. Plan to record responses

  35. 1. Define the Purpose • Based on the objectives of the evaluation • Identify information needed and why • Develop the purpose of the focus group

  36. 2. Select an Appropriate Sample • Individuals who are: • informed about the topic of discussion • in the best position to share information • represent all target populations • Pre-screen participants to ensure they meet the required characteristics • Select 6 - 12 participants
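
  Once candidates are pre-screened against the required characteristics, a group of 6 to 12 can be drawn from those who qualify. A minimal sketch, with hypothetical screening criteria and candidate data:

    import random

    # Hypothetical candidate list with one screening characteristic
    candidates = [
        {"name": "P1", "is_beneficiary": True},
        {"name": "P2", "is_beneficiary": False},
        {"name": "P3", "is_beneficiary": True},
        {"name": "P4", "is_beneficiary": True},
    ]

    # Pre-screen, then draw up to 12 participants at random
    eligible = [c for c in candidates if c["is_beneficiary"]]
    selected = random.sample(eligible, min(len(eligible), 12))
    print([c["name"] for c in selected])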

  37. 3. Determine Date and Location • Make clear arrangements regarding • purpose • starting time • how long it will last • neutral convenient location • participants feeling at ease

  38. 4. Develop Focus Group Guide • Construct effective questions • content related to evaluation purpose • open-ended to facilitate discussion • limit to five or six • Sequence questions • ensure natural flow • facilitate transition from one to the next • lead logically through variety of topics

  39. 5. Plan to Record Responses • Taping gives a full record of the discussion • inform participants and obtain consent in advance • Detailed notes are indispensable • taken by an assistant to the moderator • organize data in predefined categories • document relevant observations • avoid cues about the value of responses • underline or highlight reference points

  40. Effective Moderation Includes • Active listening • Facilitation of discussion • Awareness of group dynamics • Suspension of personal bias • Alertness to time schedule • Probing for elaboration • Affirmation of responses • Respect for different perspectives

  41. Observation: when to use • When the setting is important for the evaluation • When the interaction amongst people is a key factor of the evaluation • Before developing a questionnaire, in order to get a feeling for some of the issues • To understand cross-cultural issues

  42. Steps in Observation • Define the population or setting to observe • Select representative places to visit/observe • Develop an observation schedule with the dimensions to observe • If there are many observers, train observers for consistency

  43. During Observation • Prepare for an entry phase so the observer does not impact the setting too much • Record data according to the observation dimensions selected: interaction, environment, light, smell, infrastructure, etc. • Record qualitative (how good, how bad) as well as quantitative (how many) data • Exit from the setting and record data immediately

  44. Performance Tests: When to use • When there is a need to define aptitudes or abilities • When the study requires selecting good performers (as opposed to people to train for good performance) • When there is a need to diagnose performance problems

  45. Categories of performance tests • Aptitude tests • Attitude scales • Psychological tests • Performance tests

  46. Choosing a performance test • The first criterion is to match the test to the purpose • The second is to match the resources (time, money, technology), context, and culture to the purpose • If the purpose is to select the best individuals, aptitude tests make sense • If the purpose is to assess progress, pre- and post-activity tests make sense

  47. Scoring performance tests • Criterion referenced (Can a person perform according to a given standard?) • Normative (Who is the best in a group?)
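
  The two scoring approaches answer different questions: comparison against a fixed standard versus ranking within the group. A minimal sketch, with a hypothetical cut-off score and test results:

    # Hypothetical test results
    scores = {"Ana": 82, "Ben": 64, "Cara": 91, "Dan": 70}

    # Criterion-referenced: does each person meet a fixed standard?
    STANDARD = 75
    passed = {name: score >= STANDARD for name, score in scores.items()}

    # Normative: who performs best relative to the group?
    ranking = sorted(scores, key=scores.get, reverse=True)

    print(passed)   # {'Ana': True, 'Ben': False, 'Cara': True, 'Dan': False}
    print(ranking)  # ['Cara', 'Ana', 'Dan', 'Ben']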

  48. Summary of Methodologies
