
Survey Research



  1. Survey Research A German political sociologist mailed out 25,000 questionnaires to study the exploitation of workers by their employers. Two of the questions: Does your employer or his representative resort to trickery in order to defraud you of a part of your earnings? If you are paid piece rates, is the quality of the article made a pretext for fraudulent deductions from your wages? The German political sociologist was Karl Marx, who sent out the 25,000 questionnaires but did not record how many of them were returned.

  2. Topics for Survey Research • Use an appropriate sampling technique to identify a collection of respondents who can represent an entire population, in order to describe, explore, or explain a social phenomenon. • Send your questionnaires to the identified respondents to obtain the relevant information. • Despite Shere Hite's numerous influential works on women's sexuality, Babbie criticized her for using unsound data collection methods.

  3. Women's Sexuality • Hite distributed 100,000 questionnaires and received a 4.5% response rate. Based on her findings, Hite wrote Women and Love, a best-selling book claiming that women are fed up with men. For example, 91% of the divorced women in the sample said that they had initiated the divorce, and 70% of the married women said that they had committed adultery. What is the problem with her data collection process?

  4. Phony “Surveys” • Telemarketers conduct surveys to sell goods. • Political parties conduct surveys to persuade rather than to measure people's political orientation. • Charitable organizations conduct surveys to solicit donations: a research institute calls to survey your satisfaction with police work, then goes on to ask for a donation to better protect police officers.

  5. Asking Good Questions • A questionnaire is a collection of questions. To measure organizational commitment, researchers have used the following items: • 1. I am willing to work harder than I have to in order to help this organization succeed; • 2. I feel very little loyalty to this organization; • 3. I would take almost any job to keep working for this organization; • 4. I find that my values and the organization’s values are quite similar; • 5. I am proud to be working for this organization; and • 6. I would turn down another job for more pay in order to stay with this organization.

  6. Org Commitment • Except for question 2, the coding for the questions is: strongly agree (4), agree (3), disagree (2), and strongly disagree (1). Question 2 is reverse coded: strongly agree (1), agree (2), disagree (3), and strongly disagree (4). The results are summed and divided by 6, producing a commitment scale from 1 to 4, ranging from low commitment to high commitment.
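
As a minimal sketch (Python, with hypothetical responses), the scoring and the reverse coding of item 2 might look like this:

    def commitment_score(responses, reverse_items=(2,)):
        """responses maps item number (1-6) to a code from 1 (strongly disagree) to 4 (strongly agree)."""
        total = 0
        for item, value in responses.items():
            if item in reverse_items:
                value = 5 - value          # reverse code item 2: 4 -> 1, 3 -> 2, 2 -> 3, 1 -> 4
            total += value
        return total / len(responses)      # mean score on the 1-to-4 commitment scale

    respondent = {1: 4, 2: 1, 3: 3, 4: 4, 5: 4, 6: 3}   # hypothetical answers
    print(round(commitment_score(respondent), 2))        # 3.67 -> relatively high commitment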

  7. Open-Ended and Closed-Ended Questions • Open-ended question: “What do you feel is the most important issue facing the United States today?” • Closed-ended question: “Choose the one item from the following list that indicates what you think is the most important issue facing America today: (1) fight terrorism; (2) deal with the economic recession; or (3) control nuclear weapons.” • What are the advantages and disadvantages of each approach?

  8. Issues in Wording • Response categories should be mutually exclusive and collectively exhaustive. • Items need to be crystal clear. • Before 1994, the CPS (Current Population Survey) asked respondents to report their employment during the last week, which is officially defined as Sunday through Saturday; many respondents believed it referred to Sunday through Friday. • Census surveys often over-represented American Indians when they used “Native American” to refer to American Indians, because many respondents understood “Native American” to mean “born in the United States.”

  9. More Issues • Double-barreled questions: what is the problem with the following questionnaire items, which ask respondents to reply “strongly agree, agree, disagree, or strongly disagree”? • “The United States should abandon its space program and spend the money on domestic programs.” • “If Iraq increases its terrorist acts against the U.S. and we keep inflicting more damage on Iraq, then inevitably it will end in the U.S. going to war and finally invading Iraq, which would be wrong.”

  10. Competent Respondents • Respondents need to be competent to provide reliable information. Researchers asking children the age at which they first talked back to their parents may get unreliable responses, because many children cannot remember it. Other researchers asking teenage drivers how many miles they drive have received responses of hundreds of thousands of miles.

  11. Willing Respondents • During the Cultural Revolution from 1966 to 1976, criticism of communism could result in the death penalty. If you had designed a questionnaire item in 1970 asking people to rate their liking of communism from strongly agree to strongly disagree, what results would you have gotten? (Also see Bian 1994: 19-20) • Another example is surveying respondents' criminal records by asking them to report their own criminal activities.

  12. Relevant Answers • Watch out for respondents making up their answers. Sadly, there is normally no way to distinguish good answers from bad ones; more research needs to be done on this methodological issue. • Make your questions short, concise, precise, clear, and SIMPLE. Respondents hate complex questions, will not study them, and will produce wrong answers to ambiguous ones.

  13. Avoid Negative Items • Try not to use “not.” • Avoid biased terms and items. • If you want to study racial profiling and interview 100 randomly selected police officers about whether they issue traffic tickets based on the driver's race, 100% of your respondents will say no. • Sometimes this issue is more subtle: a researcher asking about people's attitudes toward a decision made by the Supreme Court may over-represent respondents who agree with that decision.

  14. Biased Questions • In general, people want to be seen as nice, even when they answer questionnaire surveys. • A racist or sexist will still answer “yes” when asked whether they support equal opportunity for non-whites and women. • The solution is to frame an objective question that exerts no persuasion on respondents' choices. • Look at similar research in this area.

  15. Questionnaire Construction • Questionnaire format • Make your questionnaire spread out and uncluttered. • Do not abbreviate words, and do not squeeze several questions onto one line. • To facilitate responses, researchers commonly give each answer category its own labeled box, e.g., [ ] 1  [ ] 2  [ ] 3.

  16. Contingency Questions • Q1: Are you eligible to vote? 1: Yes (if yes, go to question 5) 2: No (if no, skip questions 5 and 6) • Q5: Whom did you vote for as state governor? • Q6: Whom did you vote for as U.S. president?
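
A minimal sketch of this skip logic as it might appear in a computer-assisted version of the survey (Python; the question wording follows the slide, everything else is illustrative):

    # Contingency (skip) logic: Q5 and Q6 are asked only of eligible voters.
    def administer():
        answers = {}
        answers["q1_eligible"] = input("Q1: Are you eligible to vote? (yes/no) ").strip().lower()
        if answers["q1_eligible"] == "yes":
            answers["q5_governor"] = input("Q5: Whom did you vote for as state governor? ")
            answers["q6_president"] = input("Q6: Whom did you vote for as U.S. president? ")
        # Respondents who answer "no" skip Q5 and Q6 entirely.
        return answers

    if __name__ == "__main__":
        print(administer())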

  17. Ordering Items • Earlier questions naturally affect responses to later questions. A classic example is a questionnaire about terrorism whose final question asks what the most important issue facing the American people is today. What kinds of answers are respondents likely to supply if the questionnaire is designed this way? • Sequence affects respondents to different extents. Research methodologists report that question ordering affects less educated respondents more than it does more educated respondents.

  18. Solution? • For a self-administered questionnaire, put the interesting, substantive questions first and the basic demographic questions such as gender, race, and date of birth at the end. • For interviews, interviewers should do the opposite: glance at the household, record the address, obtain the interviewee's basic biographical data, and then move on to substantive questions such as “How often do you go to church?”

  19. Some Advice • Start your questionnaire with a brief introduction. • Hello, I'm calling from the University of Maryland Survey Research Center. My name is _____. We are conducting a study with Princeton University. May I speak with _____? • A few days ago, a letter from Dr. _____ at Princeton University was sent to you to ask for your participation in a study of human resources policies. We would like to collect information on changes in your organization's human resources policies, including the years those changes were made. • Your responses are strictly confidential, and your answers will not be associated with your name or the name of your firm. The interview should take about 15 to 20 minutes (Kelly, Erin 2002).

  20. Advice • Add a brief introduction before each subsection. • Next I'll ask about various human resources policies. Please think of the policies covering [fill employees' title or position] when answering these questions. The first topic is leave for family situations. These leaves go by different names, such as family leave, maternity leave, pregnancy disability leave, and others. I'll ask how these types of leave have been handled by [fill name of organization], but the particular name you use for each leave is not important (Kelly, Erin 2002).

  21. Advice 2 • Make it very clear how many responses you want the informant to check. • “From the list below, please check the primary reason you attend college (please check ONE best answer).” • If questionnaire items ask for a rank order, make it very clear that the answers are weighted differently.

  22. Pretesting the Questionnaire • An old Chinese saying: “Generals who plan their battles on paper are always defeated.” • The same logic applies to questionnaire survey researchers. • No matter how wonderful a questionnaire appears to be, you can always find its problems by pretesting it on a limited number of real respondents.

  23. An Illustration • Self-administered questionnaires: respondents complete the questionnaires themselves • Interview survey: the survey is administered by interviewers in face-to-face encounters • Telephone survey: the survey is administered by interviewers over the telephone

  24. Other Methods • Self-administered questionnaire: mail survey • Home delivery: a research worker delivers the questionnaire to the home of a sampled respondent and explains the study. The questionnaire is left for the respondent to fill out and is later picked up by the research worker. • Some combinations: a mailed questionnaire, followed by research workers who collect the completed forms; or research workers deliver the questionnaire and respondents then mail it back to the survey office.

  25. More Techniques • There are many more methods of administering questionnaire surveys (Heether and Kendall can help us) • Mail distribution and returns • Send out a questionnaire accompanied by a letter of explanation and a self-addressed, stamped envelope for returning the questionnaire

  26. Mailed Questionnaire • You are not likely to receive many responses if you ask respondents to • Find an envelope • Write the address • Figure out the postage • Buy the stamps • Put them on the envelope • So, to maximize returns, you need to minimize all the trouble your respondents have to go through to complete this process

  27. Bulk-rate vs. First-class • Sending questionnaires by bulk rate costs less, but delivery is less certain; first class costs more, but delivery is more certain. • For return mail, postage stamps require you to pay for every letter you send out, regardless of whether the respondent returns it. • You can instead use business reply, which requires you to pay only for the letters that are returned, plus an additional surcharge of 10 cents per letter.
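
The trade-off can be worked out with simple arithmetic. In the sketch below (Python), the 10-cent surcharge comes from the slide, while the mailing volume, expected response rate, and first-class return postage are assumed figures for illustration only:

    mailed = 1000                 # questionnaires sent out (hypothetical)
    expected_return_rate = 0.50   # anticipated response rate (hypothetical)
    first_class = 0.37            # assumed return postage per letter (hypothetical)
    surcharge = 0.10              # business-reply surcharge per returned letter (from the slide)

    stamps_cost = mailed * first_class                                   # pay for every return envelope
    business_reply_cost = mailed * expected_return_rate * (first_class + surcharge)  # pay only for returns

    print(f"Prepaid stamps:  ${stamps_cost:.2f}")
    print(f"Business reply:  ${business_reply_cost:.2f}")
    # Business reply is cheaper whenever the response rate is below
    # first_class / (first_class + surcharge), about 79% with these figures.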

  28. Monitoring Returns • Record the returns by assigning a unique ID number to each returned questionnaire. • The ID number should reflect the order of return, so you can observe how responses accumulate over time. • Two graphs illustrate the returns: one is a daily log showing how many responses were received each day; the other shows how many responses have been received so far (the cumulative total). • Research methodologists report that race, education level, gender, and other vital variables may affect the likelihood that people respond.
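
A minimal sketch of the two return graphs, built from a hypothetical log of return dates (Python):

    from collections import Counter
    from datetime import date

    return_log = [                       # one entry per returned questionnaire (hypothetical dates)
        date(2024, 3, 1), date(2024, 3, 1), date(2024, 3, 2),
        date(2024, 3, 2), date(2024, 3, 2), date(2024, 3, 4),
    ]

    daily = Counter(return_log)          # responses received each day
    cumulative, running = {}, 0
    for day in sorted(daily):
        running += daily[day]
        cumulative[day] = running        # responses received so far

    for day in sorted(daily):
        print(day, "daily:", daily[day], "cumulative:", cumulative[day])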

  29. Follow-up Mailings • A certain number of days after you send out the original questionnaires, you may want to send follow-up mailings to the non-respondents, enclosing a letter encouraging them to participate along with another copy of the questionnaire. • Babbie argued that you can do two rounds of follow-up with an interval of about 2-3 weeks. • The University of Hawaii sent out its questionnaire and received responses from 40% of the sample within two weeks. The first follow-up produced another 20%. Two weeks later the second follow-up was sent, which produced an additional 10%.
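
A quick check of the arithmetic in the Hawaii example (Python):

    # Cumulative response rate across the initial mailing and two follow-ups,
    # using the percentages reported on the slide.
    rounds = [("initial mailing", 0.40), ("first follow-up", 0.20), ("second follow-up", 0.10)]

    cumulative = 0.0
    for name, gain in rounds:
        cumulative += gain
        print(f"{name}: +{gain:.0%}, cumulative {cumulative:.0%}")
    # Final cumulative response rate in this example: 70%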

  30. Acceptable Rate • Some rough estimates of an acceptable response rate, calculated as the number of returned questionnaires divided by the total number sent out: • 50% is adequate • 60% is good • 70% is very good • Below 50%, you need to worry about sample selection bias: those responding to your questions may differ significantly, in some respects, from those not responding.
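
A small sketch (Python) of the response-rate calculation and the rough labels above; the counts are hypothetical:

    def rate_quality(returned, sent):
        """Response rate = returned / sent, judged against the slide's rough thresholds."""
        rate = returned / sent
        if rate >= 0.70:
            label = "very good"
        elif rate >= 0.60:
            label = "good"
        elif rate >= 0.50:
            label = "adequate"
        else:
            label = "below 50% -- check for sample selection bias"
        return rate, label

    print(rate_quality(550, 1000))   # (0.55, 'adequate')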

  31. Response Rate • For example, suppose you are interested in workplace sexual harassment and send out 1,000 questionnaires to randomly selected respondents. You receive only 100 responses (a 10% response rate). All 100 respondents are women, and all of them report serious sexual harassment in their workplaces. But you suspect the remaining 900 non-respondents may not have experienced the problem the way those 100 respondents did. Non-random sample selection has occurred.
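
A small simulation (Python) of this kind of nonresponse bias; all of the rates are invented for illustration, with harassment victims assumed far more likely to return the questionnaire:

    import random
    random.seed(1)

    population = [random.random() < 0.15 for _ in range(1000)]   # assume 15% truly harassed

    respondents = []
    for harassed in population:
        respond_prob = 0.40 if harassed else 0.05                # victims assumed far likelier to reply
        if random.random() < respond_prob:
            respondents.append(harassed)

    print("true rate in population:", sum(population) / len(population))
    print("rate among respondents: ", sum(respondents) / len(respondents))   # much higher than the true rate
    print("overall response rate:  ", len(respondents) / len(population))    # roughly 10%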

  32. A Case Study • Babbie conducted his research on 773 randomly selected students at the University of Hawaii. • The researchers sent questionnaires, along with postcards addressed back to the research office, to those 773 respondents. • The introductory letter in each package asked students to complete the questionnaire and to return it along with the postcard. • Using the postcards, the researchers could identify who had returned the questionnaire and who had not.

  33. A Case Study 2 • A cover letter on top of the 32-page questionnaire explained the purpose of the study, how the students were selected, the importance of participation, the instructions for returning the questionnaire, a guarantee of anonymity, a phone number for further questions, and how to request more information. • The questionnaires were sent out by bulk rate. • Returned questionnaires were scanned, assigned IDs, and logged.

  34. A Case Study 3 • The researchers sent a second round of questionnaires to the non-respondents. • They repeated the same procedure of logging and assigning IDs to the returned questionnaires. • The final response rate was 62% after the two rounds of solicitation. • How many valid questionnaires did they obtain?
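
The slide's question can be approximated directly from the figures given (Python):

    sampled = 773          # students in the University of Hawaii sample
    response_rate = 0.62   # final rate after the second mailing
    returned = round(sampled * response_rate)
    print(returned)        # 479 -- roughly the number of questionnaires obtained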
