
Polls and surveys


Presentation Transcript


  1. Polls and surveys: Evaluate Statistically Based Reports

  2. Figure 7.4 Classification of Survey Methods
  • Telephone: Traditional Telephone; Computer-Assisted Telephone Interviewing
  • Personal: In-Home; Mall Intercept; Computer-Assisted Personal Interviewing
  • Mail: Mail/Fax Interview; Mail Panel
  • Electronic: E-Mail; Internet

  3. Other data collection methods
  • Experimental design: subjects are randomly assigned to treatments (experimental conditions) by the researcher; causal inferences are stronger; random sampling from the population is less important; usually done in a laboratory.
  • Observational design (e.g. surveys): subjects are not randomly assigned to variables; random sampling is important; selection bias is a risk; causal inferences are compromised.

  4.–8. Probability sampling methods (built up across slides 4–8)
  • Systematic random sample: pick a random case from the first k cases of the sampling frame, then select every kth case after that one.
  • Stratified random sample: divide the population into groups (strata), then select a simple random sample from each stratum.
  • Cluster sampling: divide the population into groups called clusters or primary sampling units (PSUs), then take a random sample of the clusters.
  • Multistage sampling: several levels of nested clusters, often including both stratified and cluster sampling techniques.
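
To make the first two methods concrete, here is a minimal sketch in Python using only the standard library; it assumes the sampling frame is a simple in-memory list, and the function names are illustrative rather than taken from any particular package.

```python
import random


def systematic_sample(frame, k):
    """Pick a random case from the first k cases, then every kth case after it."""
    start = random.randrange(k)
    return frame[start::k]


def stratified_sample(frame, stratum_of, n_per_stratum):
    """Divide the frame into strata, then take a simple random sample from each."""
    strata = {}
    for unit in frame:
        strata.setdefault(stratum_of(unit), []).append(unit)
    sample = []
    for units in strata.values():
        sample.extend(random.sample(units, min(n_per_stratum, len(units))))
    return sample


if __name__ == "__main__":
    people = [{"id": i, "region": random.choice(["North", "South"])} for i in range(100)]
    print(len(systematic_sample(people, k=10)))                       # 10 units here
    print(len(stratified_sample(people, lambda p: p["region"], 5)))   # 5 per region
```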

  9. Random Sample Size
  Sample size is a function of three things:
  • the size of the population of interest;
  • how important it is to be accurate (the confidence level);
  • how important it is to be precise (the sampling error, also called the margin of error, or confidence interval).
  In general, accuracy and precision are improved by increasing the sample size. (Dr. G. Johnson, www.ResearchDemsytified.org)
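
The relationship between confidence level, margin of error and sample size for a proportion can be sketched with the standard formulas; this is a hedged illustration (the z-values are the usual ones, p = 0.5 is the conservative worst case, and the example numbers are not from the slides).

```python
import math

# Common two-sided z-values for standard confidence levels.
Z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}


def sample_size(margin_of_error, confidence=0.95, p=0.5, population=None):
    """Required n for estimating a proportion to within +/- margin_of_error.

    p = 0.5 is the conservative (worst-case) assumption; if a population
    size is given, the finite-population correction is applied.
    """
    z = Z[confidence]
    n = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    if population is not None:
        n = n / (1 + (n - 1) / population)   # finite-population correction
    return math.ceil(n)


def margin_of_error(n, confidence=0.95, p=0.5):
    """Approximate margin of error for a proportion from a sample of size n."""
    z = Z[confidence]
    return z * math.sqrt(p * (1 - p) / n)


if __name__ == "__main__":
    print(sample_size(0.03))                      # about 1068 respondents
    print(sample_size(0.03, population=5_000))    # smaller n once the correction applies
    print(round(margin_of_error(1000), 3))        # roughly 0.031, i.e. +/- 3.1%
```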

  10. Questionnaire Design
  Over the years, a lot of thought has gone into the science of designing survey questions. Key design principles:
  • Keep the questionnaire as short as possible.
  • Ask short, simple, and clearly worded questions.
  • Start with demographic questions to help respondents get started comfortably.
  • Use dichotomous (yes/no) and multiple-choice questions.
  • Use open-ended questions cautiously.
  • Avoid leading questions.
  • Pretest the questionnaire on a small number of people.
  • Think about how you intend to use the collected data when preparing the questionnaire.

  11. What is a poll?
  A poll allows you to ask one (or a few) multiple-choice questions. Participants choose from among answers that you predefine. You can allow voters to select just one answer or allow them to choose multiple answers. You also have the option of adding an "Other" field to allow a voter to enter their own answer.

  12. Example

  13. What type of driving licence do you hold?
  • full
  • restricted
  • learner
  • none

  14. 15% of New Zealanders have a restricted licence. Do you have a problem with this statement?

  15. Did you consider:
  • the target population,
  • the sample,
  • random selection,
  • making an inference?

  16. Target population
  The target population is the complete group whose relevant characteristics are to be determined through sampling. The target group should be clearly delineated if possible: for example, does "all pre-college students" include only primary and secondary students, or also students in other specialized educational institutions?

  17. Sampling Frame
  The sampling frame is a list of all those population elements that will be used in the sample. Examples of sampling frames are a student telephone directory (for the student population), the list of companies on the stock exchange, the directory of medical doctors and specialists, the yellow pages (for businesses), and the electoral roll. Often, the list does not include the entire population; the discrepancy is a source of error associated with the selection of the sample (sampling frame error).

  18. Probability sampling – every element in the population under study has a known, non-zero probability of being selected for the sample (in a simple random sample, every member of the population has an equal probability of selection).
  Non-probability sampling – an arbitrary means of selecting sampling units based on subjective considerations, such as personal judgment or convenience. It is generally less preferred than probability sampling.

  19. Give some examples of where you see this form of questioning:
  • TV polls, e.g. The Vote, Campbell Live, X Factor
  • Radio polls
  • Internet polls
  • Phone polls

  20. What is a survey?
  A survey allows you to ask multiple questions across a wider range of question types. So you can ask for a comment, an email address, a name, an address, etc., as well as multiple-choice questions.

  21. Example

  22. Sampling
  • Sampling errors (random process)
  • Non-sampling errors: selection bias; non-response bias; self-selection bias; question effects; behavioural considerations; interviewer effects; survey-format effects; transfer of findings

  23. Selection bias is a statistical bias in which there is an error in choosing the individuals or groups to take part in a study. If selection bias is not taken into account, conclusions drawn from the study may be wrong.

  24. Selection bias: the population sampled is not (or should not be) exactly the population of interest.
  • Target population (e.g. adults in NZ)
  • Sampling frame (e.g. households with a landline phone)
  • Sampled population: excludes those not included in the sampling frame, those who cannot be contacted, those not eligible for the survey, those who refuse to respond, and those incapable of responding

  25. Sources of non-sampling errors: non-response bias, which arises when people who have been targeted to be surveyed do not respond.

  26. Non-response bias occurs in statistical surveys if the answers of respondents differ from the potential answers of those who did not answer, e.g. non-respondents in an employment survey are likely to be those who work long hours.
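
A tiny simulation can show how this plays out in the employment-survey example: if people who work long hours are less likely to respond, the respondents' average understates the population average. Every number below (hours distributions, response rates) is invented purely for illustration.

```python
import random

random.seed(1)

# Hypothetical population: weekly hours worked, with a tail of long-hours workers.
population = [random.gauss(38, 5) if random.random() < 0.8 else random.gauss(55, 5)
              for _ in range(100_000)]

def responds(hours):
    """Assumed response behaviour: the more you work, the less likely you respond."""
    return random.random() < (0.6 if hours < 45 else 0.2)

respondents = [h for h in population if responds(h)]

print(f"population mean hours : {sum(population) / len(population):.1f}")
print(f"respondent mean hours : {sum(respondents) / len(respondents):.1f}  (biased low)")
```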

  27. Sources of non-sampling errors: self-selection bias, where people decide themselves whether to be surveyed or not.

  28. A poll suffering from self-selection bias is termed a self-selecting opinion poll, or "SLOP". Most TV and radio polls are SLOPs.

  29. Participants' decision to participate may be correlated with traits that affect the study, making the participants a non-representative sample.

  30. Self-selection bias
  Online and phone-in polls, for example, are self-selected. Individuals who are highly motivated to respond, typically individuals who have strong opinions, are overrepresented, and individuals who are indifferent or apathetic are less likely to respond. This often leads to a polarization of responses, with extreme perspectives being given a disproportionate weight in the summary.
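
A similar simulation sketch shows the polarization effect: if people with extreme views are far more likely to phone in, extreme responses are overrepresented in the poll. All of the opinion shares and phone-in probabilities below are made up.

```python
import random
from collections import Counter

random.seed(2)

# Hypothetical opinions on a 1 (strongly against) .. 5 (strongly for) scale.
population = random.choices([1, 2, 3, 4, 5], weights=[10, 20, 40, 20, 10], k=100_000)

def phones_in(opinion):
    """Assumed behaviour: strong opinions are far more likely to phone in."""
    return random.random() < {1: 0.30, 2: 0.05, 3: 0.02, 4: 0.05, 5: 0.30}[opinion]

voters = [o for o in population if phones_in(o)]

def share_extreme(values):
    counts = Counter(values)
    return (counts[1] + counts[5]) / len(values)

print(f"extreme opinions in population : {share_extreme(population):.0%}")   # ~20%
print(f"extreme opinions in phone poll : {share_extreme(voters):.0%}")       # far higher
```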

  31. Behavioural considerations
  People tend to answer questions in a way they consider to be socially desirable, e.g. pregnant women being asked about their drinking habits may be reluctant to admit that they drink alcohol.

  32. Interviewer effects
  Different interviewers asking the same question can obtain different results, e.g. the sex, race, religion, or manner of the interviewer may influence how people respond to a particular question.

  33. Famous example: The Bradley Effect

  34. A theory proposed to explain observed discrepancies between voter opinion polls and election outcomes in some United States government elections where a white candidate and a non-white candidate run against each other.

  35. The theory proposes that some voters will tell pollsters they are undecided or likely to vote for a black candidate, while on election day they vote for the white candidate.

  36. It was named after Los Angeles Mayor Tom Bradley, an African-American who lost the 1982 California governor's race despite being ahead in voter polls going into the election.

  37. The Bradley effect theory posits that the inaccurate polls were skewed by the phenomenon of social desirability bias. Specifically, some white voters give inaccurate polling responses for fear that, by stating their true preference, they will open themselves to criticism of racial motivation.

  38. Members of the public may feel under pressure to provide an answer that is deemed to be more publicly acceptable, or 'politically correct'.

  39. Interviewer Effects in Racial Questions
  In 1968, one year after a major racial disturbance in Detroit, a sample of black residents were asked: “Do you personally feel that you trust most white people, some white people or none at all?”
  • White interviewer: 35% answered “most”
  • Black interviewer: 7% answered “most”
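
To gauge whether a 35% vs 7% gap could plausibly be ordinary sampling error, one can compute a standard two-proportion z statistic; the percentages are from the slide, but the group sizes of 300 used here are an assumption, not figures from the original study.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z statistic for the difference between two sample proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 35% "most" with white interviewers vs 7% with black interviewers (slide 39);
# the group sizes of 300 each are assumed purely for illustration.
z = two_proportion_z(0.35, 300, 0.07, 300)
print(f"z = {z:.1f}")   # a difference this large is far beyond ordinary sampling error
```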

  40. Interviewer error
  Interviewer errors arise when:
  • different interviewers administer a survey in different ways;
  • respondents react differently to different interviewers, e.g. to interviewers of their own sex or their own ethnic group;
  • interviewers are inadequately trained;
  • inadequate attention is paid to the selection of interviewers;
  • the interviewer's workload is too high.

  41. Behaviours
  • The respondent gives an incorrect answer, e.g. due to prestige or competence implications, or due to the sensitivity or social undesirability of the question.
  • The respondent misunderstands the requirements.
  • The respondent lacks the motivation to give an accurate answer.
  • A "lazy" respondent gives an "average" answer.
  • The question requires memory/recall.

  42. Instrument or question errors
  Instrument or question errors arise when:
  • the question is unclear, ambiguous or difficult to answer;
  • the list of possible answers suggested in the recording instrument is incomplete;
  • the requested information assumes a framework unfamiliar to the respondent;
  • the definitions used by the survey differ from those used by the respondent (e.g. "How many part-time employees do you have?" – see the next slide for an example).

  43. The following example is from Ruddock (1998). In the Short Term Employment Survey (STES) conducted by the Office for National Statistics (ONS) in the UK, data are collected on the numbers of full-time and part-time employees on a given reference date. Some firms ignored the reference date and gave figures for employees paid at the end of the month, thus including both those who joined and those who left during that month, leading to an over-estimate. Firms also found it difficult to give details of part-time employees because their definition of "part-time" did not agree with that used by the ONS.

  44. Survey effects
  • Question order, e.g. “To what extent do you think teenagers are affected by peer pressure when drinking alcohol?” followed by: “Name the top 5 peer pressures you think teenagers face today.”

  45. Survey effects (continued)
  • Survey layout
  • Whether respondents are interviewed by phone, in person, or by mail

  46. Transfer of findings: taking the data from one population and transferring the results to another, e.g. opinions from an Auckland sample may not be a good indication of New Zealand opinions.
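
One standard partial remedy, not discussed on the slides, is to reweight the sample so that each region counts in proportion to its share of the target population (post-stratification). The sketch below uses entirely made-up regional shares and responses.

```python
# Minimal post-stratification sketch: reweight an Auckland-heavy sample so that
# each region counts in proportion to its (assumed, illustrative) national share.
national_share = {"Auckland": 0.33, "Rest of NZ": 0.67}     # assumed shares
sample = [("Auckland", 1)] * 700 + [("Auckland", 0)] * 100 + \
         [("Rest of NZ", 1)] * 100 + [("Rest of NZ", 0)] * 100  # made-up responses

sample_share = {r: sum(1 for reg, _ in sample if reg == r) / len(sample)
                for r in national_share}
weight = {r: national_share[r] / sample_share[r] for r in national_share}

raw = sum(ans for _, ans in sample) / len(sample)
weighted = sum(weight[reg] * ans for reg, ans in sample) / sum(weight[reg] for reg, _ in sample)

print(f"raw support      : {raw:.0%}")        # dominated by the Auckland respondents
print(f"weighted support : {weighted:.0%}")   # closer to what the national mix implies
```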

  47. Non-sampling errors
  • can be much larger than sampling errors;
  • are always present;
  • can be virtually impossible to correct for after the completion of the survey;
  • it is virtually impossible to determine how badly they will affect the result;
  • good surveys try to minimize them in the design of the survey (e.g. by doing a pilot survey first).

  48. Questioning in polls

  49. Consider Wording
  Be aware that the wording of a question influences the answers. Examples:
  • "Is our government providing too much money for welfare programs?" – 44% said "yes"
  • "Is our government providing too much money for assistance to the poor?" – 13% said "yes"

  50. 18 August 1980 New York Times/CBS News Poll
  “Do you think there should be an amendment to the constitution prohibiting abortions?” – Yes 29%, No 62%.
  Later the same people were asked: “Do you think there should be an amendment to the constitution protecting the life of the unborn child?” – Yes 50%, No 39%.
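
For scale, the margin of error of a single proportion in a typical poll is only a few percentage points, far smaller than the 29% to 50% swing produced by rewording. The sample size below is assumed (the slide does not give one).

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a single proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# The poll's sample size is not given on the slide; n = 1000 is assumed here
# purely to show the scale of ordinary sampling error.
n = 1000
for p in (0.29, 0.50):
    print(f"p = {p:.0%}  ->  +/- {margin_of_error(p, n):.1%}")

# Either way the margin is around +/- 3 percentage points, while rewording the
# question moved "yes" from 29% to 50%: a wording effect, not sampling error.
```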
