
Quantitative Methods: Survey Design


Presentation Transcript


  1. Quantitative Methods: Survey Design
  Valerie Dao and Matthew Schwarz
  Fulbright Research Mentorship Program, Ho Chi Minh City, Vietnam

  2. The Basics of Survey Design
  • Surveys provide a numeric (quantitative) description of the trends, attitudes, or opinions within a given population
  • The researcher uses a subset of the population, or a sample, to generalize and draw conclusions about the entire population

  3. Components of a Survey Method Plan
  • 1. Clarify the purpose
  • 2. Assess resources
  • 3. Population and sample
  • 4. Variables in the study
  • 5. Instrumentation
  • 6. Collect data
  • 7. Process data
  • 8. Analyze results

  4. Step 1: Clarifying the Purpose
  • Why conduct a survey?
  • Who are the stakeholders?
  • Who is the population of interest?
  • What issues need to be explored?

  5. The Initial Proposal
  • The first section of the survey design should highlight:
  • 1. Basic purpose
    • Draw from a sample to generalize about a population
  • 2. Rationale for survey research
    • In terms of measuring the variables and the convenience of data collection with this method
  • Indicate the time frame
    • Cross-sectional vs. longitudinal
  • Form of data collection
    • Questionnaires, interviews, structured record reviews, structured observations

  6. Step 2: Assess the Resources
  • What are your internal resources?
  • What are your external resources?

  7. Assess the Resources
  • You need to explore the resources at your disposal to better plan and edit your survey
  • Internal
    • Your research institution
    • Budget
    • Facilities
    • Time
    • Staff
  • External
    • Outside funding
    • Fellowships, grants

  8. Step 3: Population and Sample
  • You will not be able to survey the entire population, so you need to define a sample from which to draw conclusions
  • How many people will be included? (a common sample-size calculation is sketched below)
  • What is the size of your target population?
  • What can the budget allow?
  • How will the sample size affect your results?
  • How will the respondents be selected?
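
The slides do not give a sample-size formula; one common starting point (an addition here, not part of the original deck) is Cochran's formula with a finite population correction. The Python sketch below assumes a simple random sample and the conservative choice of an estimated proportion of 0.5.

```python
import math

def cochran_sample_size(population_size, margin_of_error=0.05,
                        confidence_z=1.96, proportion=0.5):
    """Estimate the sample size needed to estimate a proportion.

    Cochran's formula n0 = z^2 * p * (1 - p) / e^2, followed by a
    finite population correction. proportion=0.5 gives the most
    conservative (largest) sample size.
    """
    n0 = (confidence_z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    n = n0 / (1 + (n0 - 1) / population_size)
    return math.ceil(n)

# Example: target population of 10,000, 95% confidence, +/-5% margin of error
print(cochran_sample_size(10_000))  # roughly 370 respondents
```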

  9. The Population and the Sample
  • Types of sampling
    • Single-stage: you have direct access to the names in the population and sample them directly
    • Multi-stage (clustering): sample organizations or groups first, obtain more information from within those clusters, and then sample individuals using that information
  • Identify the selection process
    • Random sampling vs. non-probability sampling
  • Stratification
    • Proportionality and representation of the true population
    • Take samples from each subgroup within the population (a proportional stratified sample is sketched below)
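
As a concrete illustration of proportional stratified sampling, the Python sketch below allocates a fixed total sample across subgroups in proportion to their share of the population. The district names, weights, and field names are hypothetical, invented for the example.

```python
import random
from collections import defaultdict

def stratified_sample(population, strata_key, total_n, seed=42):
    """Draw a proportionally allocated stratified random sample.

    `population` is a list of dicts, `strata_key` names the field that
    defines the subgroups, and `total_n` is the total sample size. Each
    stratum is sampled in proportion to its share of the population, so
    the sample mirrors the population's composition.
    """
    rng = random.Random(seed)
    strata = defaultdict(list)
    for person in population:
        strata[person[strata_key]].append(person)

    sample = []
    for members in strata.values():
        share = len(members) / len(population)
        k = max(1, round(total_n * share))          # at least one per stratum
        sample.extend(rng.sample(members, min(k, len(members))))
    return sample

# Hypothetical sampling frame: 1,000 people across three districts
frame = [
    {"id": i, "district": d}
    for i, d in enumerate(random.choices(
        ["District 1", "District 3", "District 5"], weights=[5, 3, 2], k=1000))
]
print(len(stratified_sample(frame, "district", total_n=100)))  # about 100
```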

  10. Step 4: Variables in the Study
  • Relate what is measured (the variables) directly to the questions in the instrument
  • Identify the independent and dependent variables

  11. Step 5: Instrumentation
  • The survey instrument is the actual questionnaire or data-collection document that will be used in the study
  • It can be an original document, a modified instrument, or an intact instrument that someone else has already implemented
  • When writing your own instrument, focus on what you need to know

  12. Designing Your Survey
  • Open vs. closed questions
  • Types of response formats (a minimal machine-readable representation is sketched below)
    • Ratings
    • Rankings
    • Multiple choice
    • Yes/no
  • Types of measurement
    • Attitudes
    • Knowledge
    • Beliefs
    • Behaviors
    • Evaluation
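
To make the response formats concrete, here is one minimal, hypothetical way to represent closed questions in Python. The field names and question texts are illustrative only; they are not from the slides or any standard.

```python
# Hypothetical representation of closed survey questions; adapt to your instrument.
questions = [
    {"id": "q1", "text": "How satisfied are you with public transport?",
     "format": "rating", "scale": [1, 2, 3, 4, 5]},
    {"id": "q2", "text": "Rank these issues by importance.",
     "format": "ranking", "options": ["traffic", "housing", "air quality"]},
    {"id": "q3", "text": "Which services have you used this year?",
     "format": "multiple_choice", "options": ["bus", "metro", "motorbike taxi"]},
    {"id": "q4", "text": "Do you own a motorbike?",
     "format": "yes_no"},
]

for q in questions:
    print(q["id"], q["format"])
```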

  13. How to Effectively Design Your Questions
  • Your questions need to have validity
    • Validity determines whether or not you can draw meaningful and useful inferences from your data
  • When designing your questions, make sure that you control for:
    • 1. Bias
    • 2. Precision

  14. Validity of Instrumentation
  • Accuracy is how close the estimator is to the true value of the parameter being measured
  • Precision refers to the repeatability of the measurement
  • If the instrument is both accurate and precise, then it is considered valid
  • Accuracy relates to the quality of the result, whereas precision relates to the quality of the operation by which the result is obtained (a small numeric illustration follows)
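
A small numeric illustration of the distinction, with made-up measurements: bias (the distance of the mean estimate from the true value) stands in for accuracy, and the standard deviation of repeated estimates stands in for precision.

```python
import statistics

true_value = 50.0  # hypothetical true value of the parameter being estimated

# Repeated estimates from two hypothetical instruments
accurate_but_imprecise = [44, 57, 48, 55, 46, 53]   # centered near 50, wide spread
precise_but_inaccurate = [58, 59, 58, 60, 59, 58]   # tight spread, offset from 50

for name, estimates in [("accurate but imprecise", accurate_but_imprecise),
                        ("precise but inaccurate", precise_but_inaccurate)]:
    bias = statistics.mean(estimates) - true_value   # accuracy: closeness to the truth
    spread = statistics.stdev(estimates)             # precision: repeatability
    print(f"{name}: bias = {bias:+.1f}, stdev = {spread:.1f}")
```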

  15. Step 6: Collect Data
  • What medium will you use to collect your data? Consider what you are asking and what will be most convenient and comfortable for your respondents
  • Main methods
    • In person
    • Mail
    • Electronic/online
    • Telephone

  16. Step 7: Processing Data
  • Coding
    • Open-ended questions (a toy coding example follows this slide)
  • Data entry
    • Set up your document collection
    • Avoid errors
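
As a toy illustration of coding open-ended answers into numeric categories, the sketch below matches keywords against a codebook. The keywords, codes, and answers are hypothetical.

```python
# Hypothetical codebook mapping keywords to numeric category codes
CODEBOOK = {
    "traffic": 1,
    "housing": 2,
    "pollution": 3,
    "education": 4,
}
OTHER = 99  # catch-all code for answers that match no category

def code_response(answer):
    """Assign the first matching codebook category to a free-text answer."""
    text = answer.lower()
    for keyword, code in CODEBOOK.items():
        if keyword in text:
            return code
    return OTHER

answers = [
    "The traffic near my school is terrible",
    "Rents and housing prices keep rising",
    "I am mostly worried about air pollution",
    "Nothing in particular",
]
print([code_response(a) for a in answers])  # [1, 2, 3, 99]
```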

  17. Step 8: Analyzing Your Results
  • How will you use the data you have collected?
  • 1. Report on the level of participation
  • 2. Check for response bias
    • How would the people who did not respond change the results of your survey?
  • 3. Plan to provide descriptive analysis for the dependent and independent variables (a minimal sketch follows)
    • Means, standard deviations, range, etc.
  • 4. Identify the statistics/program for testing the major questions or hypotheses in your study
    • Accompany each test with a rationale
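
A minimal sketch of the descriptive analysis in point 3, assuming the responses are already coded as numbers. The variable names and values are hypothetical.

```python
import statistics

# Hypothetical coded responses: each dict is one respondent
responses = [
    {"age": 21, "satisfaction": 4},
    {"age": 34, "satisfaction": 5},
    {"age": 29, "satisfaction": 3},
    {"age": 45, "satisfaction": 2},
    {"age": 38, "satisfaction": 4},
]

for variable in ("age", "satisfaction"):
    values = [r[variable] for r in responses]
    print(f"{variable}: mean = {statistics.mean(values):.2f}, "
          f"stdev = {statistics.stdev(values):.2f}, "
          f"range = {max(values) - min(values)}")
```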
