
Epistemology and Methods: Data Selection, Operationalization, and Measurement (May 5, 2009)

Presentation Transcript


  1. Epistemology and Methods: Data Selection, Operationalization, and Measurement (May 5, 2009)

  2. From Theory to Empirical Testing! • Building on: • Research question/puzzle-driven • Existing literature • Model building (concepts, key variables, arguments, hypotheses) • Underlying causality story • Now: operationalization for the purpose of testing • Formulating variables into measures • Engaging in observation/measuring

  3. From Theory to Empirical Testing! • Weaknesses of research design (see also KKV) • Indeterminate research designs: more inferences than observations (more explanatory variables than cases) • Multicollinearity • Selection bias • Measurement errors…
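
The slide's warning about multicollinearity lends itself to a quick numerical illustration. Below is a minimal simulation (my own sketch, not part of the lecture) showing how highly correlated explanatory variables make coefficient estimates unstable: the same data-generating process yields much noisier slope estimates when the regressors are collinear.

```python
# Sketch: multicollinearity inflates the variance of OLS coefficient estimates.
import numpy as np

rng = np.random.default_rng(0)
n = 100

def ols_slope_sd(corr, reps=500):
    """Std. dev. of the estimated coefficient on x1 across repeated samples."""
    slopes = []
    for _ in range(reps):
        x1 = rng.normal(size=n)
        # x2 is built to correlate with x1 at roughly `corr`
        x2 = corr * x1 + np.sqrt(1 - corr**2) * rng.normal(size=n)
        y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)
        X = np.column_stack([np.ones(n), x1, x2])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        slopes.append(beta[1])
    return np.std(slopes)

print("sd(beta1), r=0.0: ", ols_slope_sd(0.0))   # stable estimates
print("sd(beta1), r=0.95:", ols_slope_sd(0.95))  # much noisier estimates
```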

  4. Key decision: determining what to observe • Selection and the danger of bias • Violation of a key rule: conditional independence • Assumption that the observations/values assigned to the IV are independent of the values of the DV • Violated when the IV is correlated with the DV (endogeneity) • Violated when the DV causes the IV (reverse causality)
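
Reverse causality is easy to demonstrate by simulation. The sketch below (an invented example, not from the slides) generates the DV first and lets it cause the "explanatory" variable; a naive bivariate regression then reports a strong effect even though the true causal effect of the IV on the DV is zero.

```python
# Sketch: reverse causality makes a naive regression of DV on IV misleading.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

dv = rng.normal(size=n)             # outcome, generated first
iv = 0.8 * dv + rng.normal(size=n)  # "explanatory" variable caused by the DV

# Naive slope of DV on IV suggests a strong "effect" that is pure reverse causality
slope = np.cov(iv, dv)[0, 1] / np.var(iv, ddof=1)
print(f"estimated effect of IV on DV: {slope:.2f} (true causal effect: 0)")
```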

  5. Random Selection vs. Intentional Selection • In large-n research: universe of cases, random selection • In small-n research: intentional selection • Random selection and its limits • Powerful: a random selection is automatically uncorrelated with all other variables (cf. controlled experiments: random assignment of the treatment/explanatory variable) • But you need to know the universe of cases! • In qualitative research, selection bias is more often present!
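
Why is random selection "powerful"? A toy sketch (illustrative numbers assumed, not from the lecture): a random sample from a known universe of cases reproduces the distribution of any background trait, measured or not, so the selection is automatically uncorrelated with it.

```python
# Sketch: random sampling from a known universe balances unmeasured traits.
import numpy as np

rng = np.random.default_rng(2)
universe = rng.normal(loc=5.0, scale=2.0, size=100_000)  # some background trait

sample = rng.choice(universe, size=500, replace=False)
print("universe mean:     ", universe.mean().round(2))
print("random-sample mean:", sample.mean().round(2))  # close: no systematic bias
```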

  6. Selection Bias • A comparative study of wars: which wars? • The performance of IOs: which IOs? • The influence of civil servants / interviewing civil servants: which ones? (e.g. the snowball technique in elite interviews) • Example (KKV) • US investment in developing countries as the prime cause of internal violence • Selection: nations with major US investments and a great deal of internal violence; nations without major US investments and no internal violence.

  7. Selection on the Dependent Variable • Key rule: selection should allow for some variation on the DV • Variation can be truncated: only a limited part of the DV's real-world variance is observed • Example: effect of the number of accounting courses (IV) on salary (DV) • KKV, Figure 4.1 • Other examples: • Why do wars occur? Selecting only wars truncates the DV! • What explains trade disputes? Which disputes end up being litigated…
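
The accounting-courses example can be simulated directly. Below is a rough sketch in the spirit of KKV's Figure 4.1 (all numbers invented): keeping only high-salary observations truncates the variance of the DV and flattens the estimated effect of courses on salary.

```python
# Sketch: truncating the sample on the dependent variable attenuates the slope.
import numpy as np

rng = np.random.default_rng(3)
n = 5_000

courses = rng.integers(0, 10, size=n)                   # IV: accounting courses
salary = 30 + 2.0 * courses + rng.normal(0, 5, size=n)  # DV: true slope = 2.0

def slope(x, y):
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

high = salary > np.median(salary)  # "select on the DV": keep only high salaries
print("full sample slope:     ", round(slope(courses, salary), 2))               # ~2.0
print("truncated sample slope:", round(slope(courses[high], salary[high]), 2))   # attenuated
```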

  8. Various relationships (Geddes) [figure]

  9. Various relationships (Geddes) [figure]

  10. Various relationships (Geddes) [figure]

  11. Various relationships (Geddes) [figure]

  12. Selection on the Dependent Variable • Geddes's examples: • Labor repression significantly affects growth rates • Skocpol's States and Social Revolutions • French, Russian and Chinese revolutions (and contrasting cases: Prussia, Japan) • “…an assessment of the argument based on a few cases selected from the other end of the DV carries less weight than would a test based on more cases selected without reference to the DV…”

  13. Selection on the Explanatory Variable • No inference problem • We may limit the generality of our conclusions, or the explanatory variable may turn out to have no effect on the DV, but we introduce no bias… • Another research strategy: controlling for an important IV (to focus on other explanatory variables)
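
By contrast with selection on the DV, selecting on the explanatory variable does not bias the estimate. This companion sketch (again my own illustration, numbers invented) restricts the sample to high values of the IV and leaves the estimated slope essentially unchanged.

```python
# Sketch: selecting cases on the IV restricts generality but does not bias the slope.
import numpy as np

rng = np.random.default_rng(4)
n = 5_000
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)  # true effect = 2.0

def slope(xs, ys):
    return np.cov(xs, ys)[0, 1] / np.var(xs, ddof=1)

keep = x > 0  # select only cases with high values of the IV
print("full sample:   ", round(slope(x, y), 2))              # ~2.0
print("selected on IV:", round(slope(x[keep], y[keep]), 2))  # still ~2.0, unbiased
```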

  14. Measurement Issues • Case studies: the (historian's) method of causal imputation/interpretation (not causal inference) • Where bias is possibly introduced: • Weighting of explanations / IVs • Value of primary sources (the original purpose of the documents) • Bias of secondary sources • Overestimation of the rationality of decision-makers • Interpreting unobserved processes (e.g. which information from below was taken up at the top level?) • Double-check sources (triangulation)

  15. Watch out for! • Validity of measurements: do we measure what we think we are measuring? (e.g. survey questions) • Reliability: applying the same method will yield the same results (e.g. survey questions); can results be replicated?
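
Reliability can be checked empirically. A minimal sketch (assumed survey setting, not from the slides): administer the same instrument to the same respondents twice; a reliable measure yields highly correlated results. Note that this check says nothing about validity, since a measure can consistently capture the wrong concept.

```python
# Sketch: test-retest reliability as the correlation between two measurement waves.
import numpy as np

rng = np.random.default_rng(5)
true_attitude = rng.normal(size=200)  # the concept we want to measure

def measure(sd):
    """One wave of a noisy survey instrument."""
    return true_attitude + rng.normal(0, sd, size=200)

wave1, wave2 = measure(0.3), measure(0.3)  # same instrument, two waves

r = np.corrcoef(wave1, wave2)[0, 1]
print(f"test-retest reliability: r = {r:.2f}")  # high r: reliable measure
# A measure can be reliable yet invalid (consistently measuring the wrong thing).
```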

  16. Replication • Data and applied methods (e.g. regression analysis) • Sources, secondary literature, direct observation (more difficult: impressions and weighting of factors) • Ensure access to material for future researchers (data, unpublished/private records) • Use coding sheets / coding rules
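
Coding rules are most replicable when written down as explicit, mechanical procedures. The sketch below (variable names, categories, and rules are all hypothetical) shows the idea: a documented codebook plus a deterministic coding function that any future researcher can re-apply to the same sources.

```python
# Sketch: a codebook and a deterministic coding rule, written down for replication.
CODEBOOK = {
    "competitive_elections": {0: "none", 0.5: "partially free", 1: "free and fair"},
}

def code_case(raw_notes: str) -> float:
    """Toy coding rule for 'competitive_elections': keyword-based and documented,
    so another researcher applying it to the same notes gets the same score."""
    if "free and fair" in raw_notes:
        return 1
    if "irregularities" in raw_notes:
        return 0.5
    return 0

print(code_case("observers reported irregularities in rural districts"))  # 0.5
```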

  17. Measuring Political Democracy • Bowman et al. 2005 on “data-induced measurement errors” • Challenge of measuring • Conceptualization • Operationalization by construction of measures • Aggregation of measures

  18. Measuring Political Democracy • Inaccurate, partial or misleading secondary sources are a threat to validity! • Remedy: use of area experts in the coding • Example: coding of Central American countries in • Gasiorowski 1996 • Polity IV 2002 • Vanhanen 2000 • Weak correlation among the Central American cases! A validity issue: the coders measure different things!

  19. Measuring Political Democracy • New coding/index on five dimensions: • Broad political liberties, competitive elections, inclusive participation, civilian supremacy, national sovereignty • Coding (0; 0.5; 1) • All elements are necessary conditions, so no standard (additive) aggregation • Use of fuzzy-set rules: the “weakest score” determines the overall rating
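
The "weakest score" rule is simply the fuzzy-set logical AND, i.e. the minimum across dimensions. A minimal sketch (scores invented for a hypothetical country) shows how it differs from additive aggregation: one weak necessary condition caps the overall democracy score, while an average would mask it.

```python
# Sketch: fuzzy-set "weakest score" aggregation vs. a simple average.
scores = {  # hypothetical country, scored 0 / 0.5 / 1 per dimension
    "political_liberties": 1,
    "competitive_elections": 1,
    "inclusive_participation": 0.5,
    "civilian_supremacy": 1,
    "national_sovereignty": 1,
}

democracy = min(scores.values())  # weakest necessary condition caps the score
average = sum(scores.values()) / len(scores)
print(democracy, round(average, 2))  # 0.5 vs 0.9: averaging hides the weak link
```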

  20. Summary: Skepticism! • Conceptualization, operationalization and measurement • Case selection & samples • Explain your selection of the cases you analyze • Show awareness of the universe of data • Measuring • Explain what “proxies” you use for measuring the variable • Think about validity and reliability (how to “improve” or control for bias…)

  21. Next week • Carrying out the empirical testing (various methods available) • Choosing quantitative and/or qualitative methods
