Research Design Choices and Causal Inferences



  1. Research Design Choices and Causal Inferences Professor Alexander Settles

  2. Approaches and strategies of research design • Quantitative Strategies • Exploratory Studies • In-depth interviews • Focus Group studies • Descriptive Studies • Survey research • Relationship studies • Causal Studies • Experimental • Comparative

  3. Approaches and strategies of research design • Qualitative Strategies • Explanatory Studies • Case Studies • Ethnographic studies • Interpretive studies • Grounded theory • Critical Studies • Action research

  4. Gathering Qualitative Data • Observation Studies • Participation Studies • Interviewing Studies • Archival studies • Media Analysis

  5. What Tools Are Used in Designing Research?

  6. Language of Research • Clear conceptualization of concepts and a shared understanding of concepts determine the success of research

  7. What Is Research Design? • Blueprint • Plan • Guide • Framework

  8. What Tools Are Used in Designing Research? MindWriter Project Plan in Gantt chart format

  9. Design in the Research Process

  10. Descriptors of Research Design • Question Crystallization • Perceptual Awareness • Data Collection Method • Purpose of Study • Experimental Effects • Time Dimension • Research Environment • Topical Scope

  11. Degree of Question Crystallization • Exploratory study: loose structure; expands understanding; provides insight; develops hypotheses • Formal study: precise procedures; begins with hypotheses; answers research questions

  12. Approaches for Exploratory Investigations • Participant observation • Film, photographs • Psychological testing • Case studies • Ethnography • Expert interviews • Document analysis

  13. Desired Outcomes of Exploratory Studies • Established range and scope of possible management decisions • Established major dimensions of research task • Defined a set of subsidiary questions that can guide research design • Developed hypotheses about possible causes of management dilemma • Learned which hypotheses can be safely ignored • Concluded additional research is not needed or not feasible

  14. Commonly Used Exploratory Techniques • Secondary Data Analysis • Experience Surveys • Focus Groups

  15. Experience Surveys • What is being done? • What has been tried in the past with or without success? • How have things changed? • Who is involved in the decisions? • What problem areas can be seen? • Whom can we count on to assist or participate in the research?

  16. Descriptors of Research Design • Question Crystallization • Perceptual Awareness • Data Collection Method • Purpose of Study • Experimental Effects • Time Dimension • Research Environment • Topical Scope

  17. Data Collection Method • Monitoring • Communication

  18. Descriptors of Research Design • Question Crystallization • Perceptual Awareness • Data Collection Method • Purpose of Study • Experimental Effects • Time Dimension • Research Environment • Topical Scope

  19. The Time Dimension • Cross-sectional • Longitudinal

  20. Descriptors of Research Design • Question Crystallization • Perceptual Awareness • Data Collection Method • Purpose of Study • Experimental Effects • Time Dimension • Research Environment • Topical Scope

  21. The Topical Scope • Statistical study: breadth; population inferences; quantitative; generalizable findings • Case study: depth; detail; qualitative; multiple sources of information

  22. Descriptors of Research Design • Question Crystallization • Perceptual Awareness • Data Collection Method • Purpose of Study • Experimental Effects • Time Dimension • Research Environment • Topical Scope

  23. The Research Environment • Field conditions • Lab conditions • Simulations

  24. Descriptors of Research Design • Question Crystallization • Perceptual Awareness • Data Collection Method • Purpose of Study • Experimental Effects • Time Dimension • Research Environment • Topical Scope

  25. Purpose of the Study • Reporting • Descriptive • Causal-Explanatory • Causal-Predictive

  26. Descriptive Studies • Who? • What? • How much? • When? • Where?

  27. Descriptive Studies • Description of population characteristics • Estimates of frequency of characteristics • Discovery of associations among variables

  28. Descriptors of Research Design • Question Crystallization • Perceptual Awareness • Data Collection Method • Purpose of Study • Experimental Effects • Time Dimension • Research Environment • Topical Scope

  29. Experimental Effects • Experiment: a study involving the manipulation or control of one or more variables to determine the effect on another variable • Ex post facto study: an after-the-fact report on what happened to the measured variable

  30. Causation and Experimental Design • Control/Matching • Random Assignment
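
To make the second item concrete, here is a minimal Python sketch of random assignment (not from the slides; the participant IDs and group sizes are hypothetical). Shuffling before splitting gives every participant an equal chance of landing in either condition, which balances unmeasured confounders in expectation.

    # Minimal sketch of random assignment; participant IDs are hypothetical.
    import random

    participants = list(range(1, 41))  # hypothetical participant IDs 1..40
    random.shuffle(participants)       # every ordering is equally likely

    midpoint = len(participants) // 2
    treatment = participants[:midpoint]  # receives the manipulation
    control = participants[midpoint:]    # serves as the comparison group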

  31. Causal Studies • Symmetrical • Reciprocal • Asymmetrical

  32. Understanding Causal Relationships • Stimulus • Response • Property • Disposition • Behavior

  33. Asymmetrical Causal Relationships • Stimulus-Response • Property-Disposition • Property-Behavior • Disposition-Behavior

  34. Types of Asymmetrical Causal Relationships • Stimulus-response - An event or change results in a response from some object. • A change in work rules leads to a higher level of worker output. • A change in government economic policy restricts corporate financial decisions. • A price increase results in fewer unit sales. • Property-disposition - An existing property causes a disposition. • Age and attitudes about saving. • Gender and attitudes toward social issues. • Social class and opinions about taxation. • Disposition-behavior - A disposition causes a specific behavior. • Opinions about a brand and its purchase. • Job satisfaction and work output. • Moral values and tax cheating. • Property-behavior - An existing property causes a specific behavior. • Stage of the family life cycle and purchases of furniture. • Social class and family savings patterns. • Age and sports participation.

  35. Evidence of Causality • Covariance between A and B • Time order of events • No other possible causes of B
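
Of the three criteria, only covariation can be read directly off the data; time order and the exclusion of rival causes must be established by the research design itself. A minimal Python sketch of the covariation check, using hypothetical paired observations of A and B:

    # Covariation between A and B: necessary but not sufficient for causality.
    import numpy as np

    a = np.array([2.0, 4.0, 6.0, 8.0, 10.0])  # hypothetical measure of A
    b = np.array([1.1, 2.3, 2.9, 4.2, 5.1])   # hypothetical measure of B

    print(np.cov(a, b)[0, 1])       # sample covariance of A and B
    print(np.corrcoef(a, b)[0, 1])  # scale-free version: Pearson r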

  36. Descriptors of Research Design • Question Crystallization • Perceptual Awareness • Data Collection Method • Purpose of Study • Experimental Effects • Time Dimension • Research Environment • Topical Scope

  37. Participants’ Perceptual Awareness • No deviations perceived • Deviations perceived as unrelated • Deviations perceived as researcher-induced

  38. Descriptors of Research Design

  39. Readings Review • Buchanan, D. A. & Bryman, A. (2007). Contextualizing methods choice in organizational research. • Choice of methods does not depend exclusively on links to research aims. • A method is not merely a technique for snapping reality into focus; choices of method frame the data windows through which phenomena are observed, influencing interpretative schemas and theoretical development. • Research competence thus involves coherently addressing the organizational, historical, political, ethical, evidential, and personal factors relevant to an investigation.

  40. Organizational Properties • Negotiated objectives. • Layered permissions. • Partisan conclusions. • Politics of publishing.

  41. Conclusions • Method choice is shaped by: • attributes of the organizational research setting or context, • the research tradition or history relevant to a particular study, • the inevitable politicization of the organizational researcher’s role, • constraints imposed by a growing concern with research ethics, • theoretical and audience-related issues in translating evidence into practice, and • personal preferences and biases with regard to choice of method.

  42. Bergh, D. D. & Fairbank, J. F. (2002). Measuring and testing change in strategic management research. • Examines reliability assumptions of change variables, correlations between the change variable and its initial measure, and selection of unbiased measurement alternatives. • Found that the typical approach used to measure and test change (as a simple difference between two measures of the same variable) is usually inappropriate and can lead to inaccurate findings and flawed conclusions.

  43. Recommendations • Researchers should screen their data in terms of: • reliability of the component measures, • correlation between the component variables, • equality of the component variable variances, and • correlation with the initial component variable.
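
These screens matter because, under classical test theory, the reliability of a simple difference score drops sharply as the two component measures become more highly correlated. A minimal sketch, assuming hypothetical component reliabilities, standard deviations, and inter-measure correlation:

    # Reliability of a difference score D = X2 - X1 under classical test theory.
    def difference_score_reliability(r11, r22, r12, sd1, sd2):
        num = sd1**2 * r11 + sd2**2 * r22 - 2 * r12 * sd1 * sd2
        den = sd1**2 + sd2**2 - 2 * r12 * sd1 * sd2
        return num / den

    # Two measures, each reliable at 0.80, correlated at 0.70:
    print(difference_score_reliability(0.80, 0.80, 0.70, 1.0, 1.0))  # ~0.33

Even though each component is individually reliable, the difference score here is far less so, which is the Bergh & Fairbank warning expressed in a single number.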

  44. Validity and Reliability Part 2

  45. Validity • Validity determines whether the research truly measures what it was intended to measure; in other words, how truthful the research results are.

  46. Reliability • Reliability is the extent to which results are consistent over time and accurately represent the total population under study. If the results of a study can be reproduced under a similar methodology, the research instrument is considered reliable.

  47. Methods to Ensure Validity • Face validity - An assessment of whether a measure appears, on the face of it, to measure the concept it is intended to measure. • Content validity - The extent to which a measure adequately represents all facets of a concept. • Criterion-related validity - Applies to instruments that have been developed to be useful as indicators of a specific trait or behavior, either now or in the future. • Construct validity - The extent to which a measure is related to other measures as specified by theory or previous research. Does a measure stack up with other variables the way we expect it to?

  48. Reliability • Test-retest reliability - The extent to which the same measure, administered to the same respondents at two points in time, yields consistent results. • Inter-item reliability - The use of multiple items to measure a single concept. • Inter-observer reliability - The extent to which different interviewers or observers using the same measure get equivalent results.
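
Inter-item reliability is commonly summarized with Cronbach's alpha. A minimal Python sketch, assuming a hypothetical respondent-by-item score matrix (rows are respondents, columns are items measuring one concept):

    # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
    import numpy as np

    def cronbach_alpha(items):
        items = np.asarray(items, dtype=float)
        k = items.shape[1]                           # number of items
        item_vars = items.var(axis=0, ddof=1).sum()  # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)    # variance of scale totals
        return (k / (k - 1)) * (1 - item_vars / total_var)

    scores = [[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3], [4, 4, 5]]
    print(cronbach_alpha(scores))  # ~0.91 for these hypothetical scores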

  49. Graphic comparison
