
13: Examining Relationships in Quantitative Research






Presentation Transcript


  1. 13: Examining Relationships in Quantitative Research

  2. Relationships between Variables • Is there a relationship between the two variables we are interested in? • How strong is the relationship? • How can that relationship be best described?

  3. Covariation and Variable Relationships • Covariation is the amount of change in one variable that is consistently related to change in another variable • A scatter diagram graphically plots the relative position of two variables using a horizontal and a vertical axis to represent the variable values
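As a minimal sketch (not part of the original deck), a scatter diagram like the ones on the next slides could be drawn in Python with matplotlib; the simulated variables below are purely hypothetical.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
x = rng.normal(50, 10, size=100)            # hypothetical values of variable X
y = 0.8 * x + rng.normal(0, 5, size=100)    # variable Y that covaries positively with X

plt.scatter(x, y)                           # each point is one observation's (X, Y) pair
plt.xlabel("X")
plt.ylabel("Y")
plt.title("Scatter diagram: positive relationship between X and Y")
plt.show()
```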

  4. Positive Relationship between X and Y

  5. Negative Relationship between X and Y

  6. Curvilinear Relationship between X and Y

  7. Scatter Diagram – No Relationship between X and Y

  8. Correlation Analysis • Pearson Correlation Coefficient (r) – measures the strength of a linear relationship between two metric variables • Varies between –1.00 and +1.00 • The higher the absolute value of the correlation coefficient, the stronger the level of association • Positive sign – direct relationship • Negative sign – inverse relationship • Check for significance – p < .05 is the usual benchmark, but consider "marginal" cases

  9. Strength of Correlation Coefficients

  10. Assumptions for Pearson’s Correlation Coefficient • The two variables are assumed to have been measured using interval- or ratio-scaled measures • The relationship to be measured is linear (not curvilinear or otherwise nonlinear) • The variables come from normally distributed populations

  11. SPSS Pearson Correlation Example
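The original slide shows SPSS output; as a rough stand-in, the same Pearson correlation could be computed in Python with scipy.stats.pearsonr. The two variables and their values below are hypothetical.

```python
from scipy import stats

# Hypothetical metric (interval/ratio) measures for eight respondents
satisfaction = [4.2, 3.8, 4.5, 2.9, 3.3, 4.8, 4.0, 3.1]
loyalty      = [4.0, 3.5, 4.7, 3.0, 3.2, 4.9, 3.8, 2.8]

r, p_value = stats.pearsonr(satisfaction, loyalty)
print(f"Pearson r = {r:.3f}, p = {p_value:.4f}")   # positive r -> direct relationship
```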

  12. Substantive Significance • Coefficient of Determination (r²) measures the proportion of variation in one variable accounted for by another • The r² measure is a proportion varying from 0.00 to 1.00 (often reported as a percentage of variance explained) • The larger the r², the stronger the linear relationship between the two variables under study • Total variation is partitioned into explained and unexplained variation
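A quick worked example (the correlation value is invented): a Pearson r of 0.72 corresponds to r² of about 0.52, i.e. roughly 52% of the variation is explained and 48% is unexplained.

```python
r = 0.72                      # hypothetical Pearson correlation
r_squared = r ** 2            # proportion of variation explained
print(f"r^2 = {r_squared:.2f} -> about {r_squared:.0%} explained, {1 - r_squared:.0%} unexplained")
```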

  13. Relationships Between Variables Measured with Rank (Ordinal) Scales The Spearman Rank Order Correlation Coefficient is a statistical measure of the association between two variables where both have been measured using ordinal (rank order) scales – equivalent to a Pearson correlation computed on the ranked values

  14. SPSS Spearman Rank Order Correlation
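Again the slide shows SPSS output; a hypothetical Python equivalent using scipy.stats.spearmanr on rank (ordinal) data might look like this.

```python
from scipy import stats

# Hypothetical ordinal data: two judges ranking the same eight restaurants
ranks_judge_a = [1, 2, 3, 4, 5, 6, 7, 8]
ranks_judge_b = [2, 1, 4, 3, 6, 5, 8, 7]

rho, p_value = stats.spearmanr(ranks_judge_a, ranks_judge_b)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")
```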

  15. SPSS Median Example for Restaurant Selection Factors

  16. Regression Analysis • A method for arriving at more mathematically detailed relationships (predictions) than those provided by the correlation coefficient • Assumptions • Variables are measured on interval or ratio scales • Variables come from a normal population • Error terms are normally and independently distributed

  17. Straight Line Relationship in Regression

  18. Formula for a Straight Regression Line • y = a + bX + ei • y = the dependent variable • a = the intercept • b = the slope • X = the independent variable used to predict y • ei = the error for the prediction

  19. Fitting the Regression Line Using the “Least Squares” Procedure

  20. Ordinary Least Squares (OLS) OLS is a statistical procedure that estimates the regression equation coefficients (‘a’ and ‘b’) that produce the lowest sum of squared differences between the actual and predicted values of the dependent variable
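As a sketch of what "least squares" means (the data here are invented, not from the deck), the slope b and intercept a can be computed from their closed-form formulas and checked against the sum of squared errors they minimize.

```python
import numpy as np

# Hypothetical data: X = advertising spend, y = sales
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])

# Closed-form OLS estimates of slope (b) and intercept (a)
b = np.sum((X - X.mean()) * (y - y.mean())) / np.sum((X - X.mean()) ** 2)
a = y.mean() - b * X.mean()

y_hat = a + b * X                       # predicted values
sse = np.sum((y - y_hat) ** 2)          # the quantity OLS minimizes
print(f"a = {a:.3f}, b = {b:.3f}, SSE = {sse:.3f}")
```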

  21. Key Terms in Regression Analysis • Adjusted R-square • Explained vs. Unexplained variance • Regression coefficient • F-Test • Regression Significance • Coefficient Significance

  22. SPSS Results for Bivariate Regression

  23. Significance of Regression Coefficients • Answers these questions: • Is there a relationship between the dependent and independent variable? • How strong is the relationship? • Each coefficient’s t-value and its associated p-value indicate whether the relationship is statistically significant and how strong it is • Look for p < .05, but consider "marginal" cases
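A hedged sketch of how those t- and p-values could be obtained outside SPSS, here with Python's statsmodels on made-up bivariate data:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical bivariate data
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y = np.array([2.3, 2.8, 4.1, 4.9, 5.6, 6.8, 7.2, 8.4])

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.summary())      # reports coefficients with t-values and p-values,
                            # plus R-squared and the overall F-test
```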

  24. Multiple Regression Multiple regression analysis is a statistical technique that analyzes the linear relationship between a dependent variable and multiple independent variables by estimating coefficients for the equation of a straight line

  25. SPSS Multiple Regression Example
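The slide itself is an SPSS screenshot; a comparable multiple regression could be run in Python with statsmodels. The variable names and data below are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: satisfaction predicted by two independent variables
data = pd.DataFrame({
    "food_quality":  [4, 5, 3, 4, 5, 2, 3, 4, 5, 3],
    "service_speed": [3, 4, 2, 5, 5, 1, 2, 4, 4, 3],
    "satisfaction":  [3.5, 4.6, 2.8, 4.4, 4.9, 1.9, 2.7, 4.1, 4.7, 3.0],
})

X = sm.add_constant(data[["food_quality", "service_speed"]])
model = sm.OLS(data["satisfaction"], X).fit()
print(model.params)          # intercept plus one coefficient per independent variable
print(model.rsquared_adj)    # adjusted R-squared
```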

  26. Beta Coefficient A Standardized Beta Coefficient is an estimated regression coefficient recalculated as if every variable had been standardized to a mean of 0 and a standard deviation of 1, so that independent variables with different units of measurement can be directly compared on their association with the dependent variable
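One common way to obtain standardized beta coefficients (a sketch, not the deck's SPSS procedure) is to z-score every variable before fitting the regression; the predictors and their units below are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical predictors measured in very different units
data = pd.DataFrame({
    "rating_1_to_5": [4, 5, 3, 4, 5, 2, 3, 4],
    "wait_minutes":  [30, 45, 25, 50, 55, 15, 20, 40],
    "satisfaction":  [3.5, 4.6, 2.8, 4.4, 4.9, 1.9, 2.7, 4.1],
})

z = (data - data.mean()) / data.std()    # every variable now has mean 0, sd 1

betas = sm.OLS(z["satisfaction"], z[["rating_1_to_5", "wait_minutes"]]).fit().params
print(betas)    # beta coefficients, directly comparable across predictors
```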

  27. Evaluating a Regression Analysis • Assess the statistical significance of the overall regression model using the F statistic and its associated p-value • Evaluate the regression’s R-squared • Examine the individual regression coefficients and their t-test statistics (or their p-values) to see which are statistically significant • Look at values of the beta coefficients to assess relative influence of each predictor (IV) on Dependent Variable (DV)

  28. Multicollinearity Multicollinearity occurs when several independent variables are highly correlated with each other, which makes it difficult to estimate the regression coefficients of the correlated variables accurately.

  29. Multicollinearity – How to Avoid • Eliminate or replace highly correlated IVs • Examine a correlation matrix of the IVs • Typically search for correlations higher than .5 in absolute value • Combine correlated IVs using factor analysis techniques (such as principal components analysis) • "Live with it"
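A sketch of the correlation-matrix check, plus the variance inflation factor (VIF), a common complementary diagnostic that the slide does not mention; the predictors below are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical independent variables; x2 is deliberately similar to x1
X = pd.DataFrame({
    "x1": [4, 5, 3, 4, 5, 2, 3, 4, 5, 3],
    "x2": [8, 10, 6, 9, 10, 4, 6, 8, 9, 7],
    "x3": [1, 3, 2, 5, 4, 2, 1, 3, 5, 2],
})

print(X.corr().round(2))                 # flag predictor pairs with |r| > .5

Xc = sm.add_constant(X)                  # VIF above roughly 5-10 also signals multicollinearity
for i, col in enumerate(Xc.columns):
    if col != "const":
        print(col, round(variance_inflation_factor(Xc.values, i), 2))
```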
