
DESIGNING EVALUATION INSTRUMENTS



  1. DESIGNING EVALUATION INSTRUMENTS Class X AEE 577

  2. Upon completion of this lesson, students should be able to: • List the step-by-step procedures for developing quality evaluation instruments; • Describe the errors that must be controlled in evaluation instruments; • Develop different forms of questions to record outcomes such as changes in knowledge, attitudes, skills, aspirations, and behaviors; • Write process evaluation questions; • Describe reliability and validity; • Identify double-barreled questions; and • Develop an evaluation instrument.

  3. How to Design Your Data Collection Instrument • Where to begin?

  4. Begin with the information needs of key stakeholders • Information needs for program improvement • Information needs for accountability

  5. Designing Instruments, Step 1: Identify the Type of Data and Information You Need to Collect • Focus on the information needs of key stakeholders. • Clearly identify what data and information need to be collected for this purpose. • Identify the major categories of information that you need to collect. • List subcategories of information under each major category.

  6. Designing Instruments, Step 2: Develop the “Sketch” of Your Instrument • List the major items in your instrument to structure it. • Organize the structure of your instrument to collect the needed data. • Organize subcategories under each major topic. • Include the demographic data collection section at the very end of the instrument.

  7. Designing Instruments, Step 3: Identify Necessary Scales and Questions • Determine the types of scales you need to include in your instrument. • Determine the types of questions you need to ask.

  8. Designing Instruments, Step 4: Be Consistent in Numbering Answer Choices and Scales • It is a good idea to use low numbers for lower levels of the variable being measured. • Example: • 1. High school diploma • 2. Bachelor’s degree • 3. Master’s degree • 4. Doctorate • By using a consistent pattern throughout the instrument, you can easily interpret results.

  9. Designing Instruments, Step 5: Writing Questions • As a general rule, when writing a question, ask yourself, “Why am I asking this question?” • Always keep your evaluation information needs in mind. • Think about the answer before you write any question. • There are two ways to write a question: • Open-ended • Example: What methods do you use to educate farmers on sustainable agriculture? • Closed-ended • Example: What methods do you use to educate farmers on sustainable agriculture? • Field days • Workshops • Seminars • Printed materials • Electronic materials • Other (please specify)___________________

  10. Designing Instruments: Writing Questions • Things to remember when writing questions: • Write questions clearly and concisely. • Start with the least sensitive or non-threatening questions. • Write questions with the reading level of the target population in mind. • Avoid double negatives. • Avoid double-barreled questions. • Example of a double-barreled question: Are you satisfied with the place and time of the program?

  11. Designing Instruments: Writing Open-Ended Questions • Open-ended questions are useful for exploring a topic in depth. • However, open-ended questions are difficult to: • Answer • Analyze • Therefore, limit the number of open-ended questions to the minimum needed. • When you need to ask a sensitive question, it is appropriate to use a closed-ended question with response categories for the sensitive information. • Example: When asking about income or age, ask “What is your age group?” and provide age categories instead of asking “How old are you?”

  12. Designing Instruments: Writing Closed-Ended Questions • When writing closed-ended questions: • Make sure to include all possible response categories. • If you cannot include all possible answer categories, it is a good idea to include a category called “Other” and instruct the respondent to specify what they mean under this category. • Make sure that your answer categories are mutually exclusive. • Example: What is your age group? • Less than 20 years • 20-30 years • 31-40 years • 41-50 years • Above 50 years
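Mutually exclusive categories also keep analysis unambiguous: every raw value falls into exactly one bin. A minimal sketch using the slide's age groups (the function name is hypothetical):

```python
def age_group(age):
    """Assign an age to exactly one mutually exclusive category."""
    if age < 20:
        return "Less than 20 years"
    if age <= 30:
        return "20-30 years"
    if age <= 40:
        return "31-40 years"
    if age <= 50:
        return "41-50 years"
    return "Above 50 years"

# The boundaries do not overlap: 30 and 31 land in different groups.
print(age_group(30), "|", age_group(31))
```

If the categories were written as 20-30 and 30-40 instead, an answer of 30 would fit two boxes, which is exactly what mutual exclusivity prevents.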

  13. Designing Instruments: Writing Closed-Ended Questions • Closed-ended questions are: • Easy to analyze. • Not exploratory; they do not uncover new information.

  14. Scale Development • Develop scales if you need to include them in your instrument.

  15. Guidelines For Scale Development • Scales are developed for measuring elusive phenomena that cannot be observed directly. Example: Attitudes, Aspirations. • Therefore, scale development should be based on the theories related to the phenomenon to be measured. • Thinking clearly about the content of a scale requires thinking clearly about the construct being measured.

  16. Guidelines for Scale Development: Generate an Item Pool • The properties of a scale are determined by the items that make it up. • At this stage, you need to develop more items than you plan to include in the final scale.

  17. Characteristics of Good Items • Unambiguous. • Avoid exceptionally lengthy items. • Consider reading levels of the target respondents. • Include positively and negatively worded items. The purpose of wording items both positively and negatively within the same scale is usually to avoid acquiescence, affirmation, or agreement bias.
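When a scale mixes positively and negatively worded items, the negatively worded items are typically reverse-coded before scoring so that high values always point in the same direction. A minimal sketch, assuming a 5-point response format (the helper name is hypothetical):

```python
def reverse_code(score, points=5):
    """Reverse-code a negatively worded item on a `points`-point
    scale so that agreement always points the same direction."""
    return points + 1 - score

# On a 5-point scale, 'Strongly agree' (5) with a negatively
# worded item becomes 1 after reverse-coding.
print(reverse_code(5))  # 1
```

The midpoint is unchanged (3 maps to 3 on a 5-point scale), which is what you want for a neutral response.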

  18. Guidelines for Scale Development: Determine the Format for Measurement • There are different formats. • Identify the format you would like to use with your items. • Determine how many response categories you need to include in your format.

  19. Guidelines for Scale Development: Determine the Format for Measurement • The number of response categories should be limited by the respondents’ ability to discriminate meaningfully. • Normally, 5-7 response categories are adequate for extension and education program evaluations. • Example: • Strongly disagree • Disagree • Neutral • Agree • Strongly agree

  20. Guidelines for Scale Development: Likert Scale • Named after Rensis Likert. • This is the most common format. • The response options should be worded so as to have roughly equal intervals with respect to agreement. That is, the difference in agreement between any adjacent pair of response options should be about the same as for any other adjacent pair. • Common choices for a midpoint include “Neither agree nor disagree” and “Neutral.”

  21. Guidelines for Scale Development: Likert Scale • Example of items in Likert format: • Strongly disagree • Disagree • Neutral • Agree • Strongly agree

  22. Guidelines for Scale Development: Semantic Differential Scaling • The response options are several numbers between a pair of opposite adjectives. • Example: The quality of the training session: Poor 1 2 3 4 5 6 7 Excellent

  23. Example of a Scale (Recording Anxiety) • How do you feel now about your current position? Rate your feelings on the scale below.

  24. Instrument Development, Step 6: Provide Necessary Instructions to Complete the Survey • Clear instructions are essential to facilitate the responding process. • Instructions should be clearly and politely stated. • Clear instructions increase your return rate as well as the accuracy of your data.

  25. When You Develop a Questionnaire: • Keep it short, simple, and clear. • Include only the questions needed for your indicators. • Make it compatible with the reading level of the respondents. • When you use closed-ended questions, make sure to include all possible answer choices.

  26. Instrument Development, Step 7: Format Your Instrument • The appearance and editing of your instrument are important determinants of response rate. • Therefore, format, structure, and edit your instrument professionally.

  27. Instrument Development, Step 8: Establish the Validity and Reliability of Your Instrument • Reliability refers to the extent to which a measuring instrument is consistent in measuring what it measures. • Test-retest method: Administer the instrument to a sample of subjects on two occasions and correlate the paired scores to establish reliability. • Validity refers to the extent to which an instrument measures what it intends to measure. • Use experts’ views to establish validity.

  28. APPLICATION OF STEPS

  29. Determine Your Evaluation Questions • Identify the precise questions that need to be answered. • Use the logic model to narrow the focus of the evaluation.

  30. LOGIC MODEL: Measuring Program Impact • INPUTS: What resources does your program need to achieve its objectives? (Staff, volunteers, time and money, materials, equipment, technology, partners) • OUTPUTS, Activities: What should you do in order to achieve program goals and objectives? (Workshops, meetings, camps, demonstrations, publications, media, web site, projects, field days) • OUTPUTS, Participation: Who should participate, be involved, or be reached? (Number of target clients, their characteristics, their reactions) • OUTCOMES, Learning: What do you expect the participants will know, feel, or be able to do immediately after the program? (Awareness, knowledge, attitudes, skills, aspirations) • OUTCOMES, Action: What do you expect participants will do differently after the program? (Behavior, practices, decisions, policies, social action) • OUTCOMES, Impact: What kind of impact can result if the participants behave or act differently? (Social, economic, environmental)

  31. Possible Question Categories • Process evaluation questions (these are mostly open-ended) • Questions on client characteristics • How do you describe your ethnicity? • Questions on program delivery • What are the strengths of this program? • What are the weaknesses? • Impact evaluation questions • Questions on client satisfaction • Did the target clients find the program useful? • Outcomes • Did the program participants change in KASA (knowledge, attitudes, skills, and aspirations)? • Did the program participants change their practices? • Impacts • Did the participants save money or improve their health?

  32. What Data Are Needed for Program Improvement? • Were participants satisfied with: • Information received • Instructors • Facilities • Quality of training • What did they like or dislike about the training? • Did the training meet their expectations? • If not, why? • Ideas for further improvement • Look for data that you can use to fix weaknesses and build on strengths.

  33. How to Collect Training Improvement Data? Please circle the appropriate number for your level of response.

  34. How to Collect Training Improvement Data? • Did the training session meet your expectations? • Yes • No • Would you recommend this training workshop to others? • Yes • No • If not, why: ____________________________________ • What did you like most about this training? • What did you like least about this training? • How could this training be further improved?

  35. Other Data: Demographics • What is your gender? ___ Male ___ Female • How do you identify yourself? ___ African American ___ American Indian/Alaskan Native ___ Asian ___ Hispanic/Latino ___ Native Hawaiian/Pacific Islander ___ White ___ Other

  36. What Data Are Needed for Program Accountability? • You need impact data • To prove that your program achieved its objectives

  37. How to Document Perceived Knowledge Change? Example for Agriculture

  38. How to Document Levels of Aspirations? • At the end of a successful training session, participants will have a heightened level of aspiration to apply what they learned. • They are ready to “take charge” of what they learned. • Participants are asked whether they intend to apply what they learned. • Example: As a result of this training, do you intend to drink reduced-fat milk? The answer choices for this question would be: • No • Maybe • Yes • I’m already doing this

  39. How to Document Aspirations? Example for FCS Please circle the number that best describes your answer.

  40. Retrospective Pre- and Post-Evaluations • Advantages: • Simple and easy to collect data • Disadvantages: • Not appropriate for collecting data from very young audiences or low-literacy adult audiences, because they will not be able to compare the before and after situations retrospectively.

  41. Pre- and Post-Evaluations • The pre-evaluation is administered before your training session. • The post-evaluation is administered at the end of your training session. • Pre- and post-evaluations must be matched for comparison. • Pre- and post-evaluations will document three impact indicators: • Change in knowledge • Change in skills • Levels of aspirations
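Matching pre- and post-evaluations is typically done with a participant ID recorded on both forms. A minimal sketch (the IDs and scores are hypothetical):

```python
def match_pre_post(pre, post):
    """Pair pre- and post-evaluation records by participant ID,
    keeping only participants who completed both forms."""
    return {pid: (pre[pid], post[pid]) for pid in pre if pid in post}

pre = {"p01": 3, "p02": 2, "p03": 4}   # pre-evaluation scores
post = {"p01": 5, "p03": 5}            # p02 skipped the post-evaluation
print(match_pre_post(pre, post))       # only p01 and p03 are matched
```

Unmatched participants are dropped from the comparison rather than paired with someone else's scores, which would distort the measured change.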

  42. How to Document Change in Knowledge? • Ask the same set of questions before and after your educational session and compare the answers to document the knowledge gained from the program.
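The comparison can be scored per matched participant. A minimal sketch, assuming a short true/false quiz (the answer key and responses are hypothetical):

```python
def knowledge_gain(pre_answers, post_answers, key):
    """Return one participant's gain in correct answers on the
    same quiz given before and after the session."""
    pre_correct = sum(a == k for a, k in zip(pre_answers, key))
    post_correct = sum(a == k for a, k in zip(post_answers, key))
    return post_correct - pre_correct

key = ["T", "F", "T", "T"]            # answer key
pre = ["T", "T", "F", "T"]            # 2 correct before the session
post = ["T", "F", "T", "T"]           # 4 correct after
print(knowledge_gain(pre, post, key))  # 2
```

Averaging these gains across all matched participants gives a simple summary of the knowledge change attributable to the program.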

  43. How to Document Change in Knowledge? Example for FCS

  44. How to Write Knowledge-Testing Questions • Don’t use general knowledge questions. • Don’t include attitudinal or perceptual statements. • Example of a statement to avoid: Growers should practice conservation tillage. __True __False __Don’t know

  45. True/False Questions vs. Multiple-Choice Questions • True/False questions save your time and your respondents’ time. • They are easy to analyze. • They help you keep your survey short.

  46. How to Document Change in Skills? • Skill changes are measured indirectly, using participants’ levels of confidence in carrying out the tasks learned from the program. • Example: Participants’ confidence in their ability to calibrate a sprayer.

  47. How to Document Change in Skills? • Record participants’ levels of confidence in carrying out specific tasks before and after the program on a Likert-type scale. • Compare the pre and post responses to document changes in skills.

  48. How to Document Change in Skills? Example for Agriculture:

  49. Pre- and Post-Evaluations • Advantages: • Appropriate for young and low-reading-level audiences. • Disadvantages: • To compare pre- and post-evaluations, you must match them for each participant, which is somewhat challenging.

  50. Changes in Attitudes • Difficult to measure • You need to be very careful in designing scales to measure attitudes • Not a practical indicator • Measured with pre/post tests
