
Using Cognitive Interviews to Improve Survey Instruments

Presented at the Association for Institutional Research Forum, June 2-6, 2012.



Presentation Transcript


  1. Using Cognitive Interviews to Improve Survey Instruments Presented at the Association for Institutional Research Forum, June 2-6, 2012 Heather Haeger, Indiana University; Amber Lambert, Indiana University-Bloomington; Jillian Kinzie, Indiana University-Bloomington; James Gieser, Indiana University

  2. Agenda • Introduction • Conceptual Framework • Methods • Findings • Conclusion and Discussion

  3. Current Context • Pressure to assess educational practices remains high • Crucial to ensure that instruments accurately measure educational practices and experiences • Questionnaires must measure what they intend; respondents must understand & correctly interpret items • NSSE’s widespread use (1,500 institutions) makes it particularly important to subject items to rigorous cognitive testing

  4. Context for NSSE’s Cognitive Interviews • Cognitive interviews (CIs) have been part of NSSE survey design from the outset • A 2005 effort focused on testing the survey among historically under-represented students • The planned NSSE update for 2013 provided the occasion for multiple rounds of CIs

  5. Purpose • Purpose of cognitive interviews: identify and analyze sources of response error • Focus: cognitive processes; accessing respondents’ interpretation & meaning of items

  6. Purpose (cont.) • In brief, cognitive interviews are meant to identify… • Whether subjects understand the question… • In a way consistent across subjects… • And in the way intended by researchers.

  7. Background • Four actions of the cognitive process: • Comprehend the question • Retrieve information • Make a judgment about relevance and accuracy • Formulate and provide a response

  8. Methods • “Think-aloud” • Explicit activity in which the subject verbalizes his/her thought processes as s/he answers survey questions. • Interviewer reads the question, then observes and records as the subject responds. • Interviewer is mainly passive in the process, aside from providing encouragement to “tell me what you’re thinking” if the subject hesitates or pauses. • Advantages: Freedom from bias imposed by frequent interviewer interjections; minimal interviewer training requirements; open-ended design. • Disadvantages: Subject usually requires training in the method, or may resist the technique; possibility for the subject to stray from the topic at hand; subject may bias his/her description of his/her decision process.

  9. Methods (cont.) • Verbal probing: concurrent and retrospective • Concurrent: after the interviewer asks a question and the subject answers, the interviewer asks more specific questions designed to elicit further information about the response. Probes can be scripted or spontaneous. • Retrospective: at the end of the interview, the subject is asked to verbalize his/her thoughts about questions answered earlier when taking the questionnaire. • Advantages: Interviewer maintains control of the interview; relative ease of training the subject. • Disadvantages: Artificiality (criticism that this technique is not reflective of a real survey interview, in which the interviewer simply asks questions and the respondent answers them); potential for bias through poor selection of probes.

  10. Analysis Three stages, with related sub-stages, that respondents worked through during cognitive interviews: 1) Understanding the survey question and response options: a) comprehending the survey question; b) comprehending the response options. 2) Performing the primary survey tasks: a) retrieving information; b) deduction, drawing conclusions from information; c) mental arithmetic/computation. 3) Formatting responses: a) mapping data yielded by the primary task processes to an explicit response option; b) handling cases where a matching response option is not available/offered.

  11. Types of Problems Coding within these stages can address any of the following problems (a minimal coding sketch follows below): • Language problems • Inclusion/exclusion problems • Temporal problems • Logical problems • Computational problems
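To make the coding workflow concrete, here is a minimal sketch in Python of how coded interview segments might be recorded across the stages and problem types above, then tallied by item. The item names, notes, and the CodedSegment structure are hypothetical illustrations of this framework, not part of NSSE’s actual tooling.

```python
from collections import Counter
from dataclasses import dataclass
from enum import Enum

class Stage(Enum):
    """The three analysis stages from the previous slide."""
    UNDERSTANDING = "understanding the question and response options"
    PRIMARY_TASKS = "performing the primary survey tasks"
    FORMATTING = "formatting responses"

class Problem(Enum):
    """The five problem types coded within those stages."""
    LANGUAGE = "language"
    INCLUSION_EXCLUSION = "inclusion/exclusion"
    TEMPORAL = "temporal"
    LOGICAL = "logical"
    COMPUTATIONAL = "computational"

@dataclass
class CodedSegment:
    item: str        # survey item identifier (hypothetical names below)
    stage: Stage
    problem: Problem
    note: str        # interviewer's note on what the respondent said

# Hypothetical coded segments, loosely based on examples later in this deck
segments = [
    CodedSegment("serious_conversations", Stage.UNDERSTANDING, Problem.LANGUAGE,
                 "re-read the stem; unsure what 'in the following ways' modifies"),
    CodedSegment("pages_read", Stage.PRIMARY_TASKS, Problem.COMPUTATIONAL,
                 "could not estimate page counts for online readings"),
    CodedSegment("pages_read", Stage.PRIMARY_TASKS, Problem.INCLUSION_EXCLUSION,
                 "left lab reports out of the count"),
]

# Tally (item, problem) pairs to surface which items most need revision
tally = Counter((s.item, s.problem.value) for s in segments)
for (item, problem), n in tally.most_common():
    print(f"{item}: {problem} x{n}")
```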

  12. Example 1 Question: In your experience at your institution during the current school year, about how often have you had serious conversations with people who differ from you in the following ways? (Never, Sometimes, Often, Very Often) a) Political views, b) Economic and social background, c) Religious beliefs or philosophy of life, d) Race, ethnic background, or country of origin, e) Sexual orientation

  13. Example 1 (cont.) • Language problem • Too wordy, too complex • Diversity of interpretation: • Talked with people who are different • Talked with people who are different, but only about that topic • Talked about that topic with anyone • Logic problem • Respondents must combine three conditions at once: serious conversations + people who are different from you + in the following ways

  14. Example 2 Question: Indicate the quality of your interactions with the following people at your institution: a) Student Affairs Professional

  15. Example 2 (cont.) • Language problem • Students did not know what a Student Affairs Professional was • Wording was changed to “Student services staff (campus activities, housing, career services, etc.)” • Inclusion/exclusion problem • Students were including everyone • Added the parenthetical to help narrow the focus • This raises a question for researchers: are we OK with students thinking broadly about this question (e.g., including dining hall staff and campus security staff)? • Formatting problem • Response options ranged from Poor=1 to Excellent=7 • An NA option needed to be added (see the sketch below)
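As an aside on why the NA option matters for analysis: without it, students with no such interactions must pick a scale point, distorting the item mean. A minimal sketch, assuming hypothetical response data, of treating NA as missing rather than as a scale value:

```python
# Hypothetical responses on the 1 (Poor) to 7 (Excellent) scale; "NA" marks
# respondents with no interactions to rate (illustrative data only)
responses = [5, 7, "NA", 3, 6, "NA", 4]

# Exclude NA from scoring instead of forcing it onto the scale
scored = [r for r in responses if r != "NA"]
mean_quality = sum(scored) / len(scored)
print(f"Mean quality: {mean_quality:.2f} (n = {len(scored)} of {len(responses)})")
```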

  16. Example 3 Question: During the current school year, in about how many of your courses did you do the following? Participated in a community-based project as part of a regular course (i.e., service-learning)

  17. Example 3 (cont.) • Language problem • This problem was site-specific • Inclusion/exclusion problem • Students wondered what activities to include • What counts as a “community-based” activity, and what did we mean by “community” (campus only? The surrounding neighborhood/town too? Elsewhere in the US/overseas?) • Some students thought of volunteering on their own, not in connection with a particular course

  18. Example 4 Question: Which of the following have you done or do you plan to do before you graduate from your institution? Participate in a formal program where groups of students take two or more classes together (sometimes called a learning community)

  19. Example 4 (cont.) • Language problem • As with service learning, students’ understanding of this question was site-specific • Formatting problem • The item’s location on the survey made students less likely to complete it if they didn’t understand it • Solution: We restructured where the item appeared in the instrument • We moved the item lower on the page (it originally was the first question at the top of the page) • We moved the item lower within the question itself (it originally was the first item in a series of 6 sub-questions)

  20. Example 5 Question: In a typical week this year, about how many total pages have you read for all of your courses?

  21. Example 5 (cont.) • Computation problem • Difficult for students to compute the number of pages, especially if sources didn’t have page numbers (e.g., web pages, e-readers) • Inclusion/exclusion problem • Many students didn’t count readings in non-traditional book formats (e.g., online readings, lab reports) • Some students counted what they had actually read, but others said they counted all the pages assigned in the course, even those they had not actually read

  22. Applications to Your Campus How might these methods help triangulate NSSE results on your campus? What questions are you most concerned about in terms of what your students mean by their responses? Are there item terms that may have less face validity with your student populations? How might it help to know more about students’ interpretations of response options when deciding what to do with findings? How might you initiate this activity on your campus? Who might be interested in this work? Who should conduct the interviews?

  23. Conclusion • Cognitive interviews and focus groups can provide students an opportunity to reflect on their behaviors in college • Institutions gain concrete information about THEIR students’ experiences and perceptions • Institutions can then be more explicit about the opportunities and academic services of which students can take advantage

  24. Contact Information Heather Haeger – email: hahaeger@indiana.edu Amber D. Lambert – email: adlamber@indiana.edu Jillian Kinzie – email: jikinzie@indiana.edu James Gieser – email: jgieser@indiana.edu

  25. Introducing the Updated NSSE! • Register now for NSSE 2013 (deadline Sept. 25) • Retains NSSE’s focus on diagnostic & actionable information • New Engagement Indicators: academic challenge, deep approaches to learning, collaborative learning, quantitative reasoning, experiences with faculty, campus environment, interactions with diversity • Modules • New & updated items • Comparisons to prior-year results • FSSE & BCSSE updates
