Determining and Interpreting Associations Among Variables


1 Determining and Interpreting Associations Among Variables

2 Associative Analyses Associative analyses: determine whether stable relationships exist between two variables Examples What methods of doing business are associated with level of customer satisfaction? What demographic variables are associated with repeat buying of Brand A? Is type of sales training associated with the sales performance of sales representatives? Are purchase intention scores for a new product associated with actual sales of the product? Ch 18

3 Relationships Between Two Variables
Relationship: a consistent, systematic linkage between the levels or labels for two variables. “Levels” refers to the characteristics of description for interval or ratio scales (the level of temperature, etc.). “Labels” refers to the characteristics of description for nominal or ordinal scales (buyers vs. non-buyers, etc.). As we shall see, this concept is important in understanding the type of relationship. Ch 18

4 Relationships Between Two Variables
Nonmonotonic: two variables are associated, but only in a very general sense; we don’t know the “direction” of the relationship, but we do know that the presence (or absence) of one variable is associated with the presence (or absence) of another. In the presence of breakfast, we find the presence of orders for coffee; in the presence of lunch, we find the absence of orders for coffee. Ch 18

5 Nonmonotonic Relationship
Ch 18

6 Relationships Between Two Variables
Monotonic: the general direction of a relationship between two variables is known Increasing Decreasing Shoe store managers know that there is an association between the age of a child and shoe size. The older a child, the larger the shoe size. The direction is increasing, though we know only the general direction, not the actual size. Ch 18

7 Monotonic Increasing Relationship
Ch 18

8 Relationships Between Two Variables
Linear: “straight-line” association between two variables Here knowledge of one variable will yield knowledge of another variable “100 customers produce $500 in revenue at Jack-in-the-Box” (p. 525) Ch 18

9 Relationships Between Two Variables
Curvilinear: some smooth curve pattern describes the association Example: Research shows that job satisfaction is high when one first starts to work for a company but goes down after a few years and then back up after workers have been with the same company for many years. This would be a U-shaped relationship. Ch 18

10 Characterizing Relationships Between Variables
Presence: whether any systematic relationship exists between two variables of interest Direction: whether the relationship is positive or negative Strength of association: how strong the relationship is: strong? moderate? weak? Assess relationships in the order shown above. Ch 18

11 Cross-Tabulations Cross-tabulation: consists of rows and columns defined by the categories classifying each variable…used for nonmonotonic relationships Cross-tabulation table: four types of numbers in each cell Frequency Raw percentage Column percentage Row percentage Ch 18
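A minimal sketch of those four cell numbers outside SPSS: the pandas crosstab calls below are an illustration with made-up occupation/buyer data, not an example from the text.

```python
# Illustrative sketch (made-up data): the four types of numbers found in a
# cross-tabulation cell, computed with pandas.
import pandas as pd

df = pd.DataFrame({
    "occupation": ["professional", "professional", "trade", "trade", "trade", "professional"],
    "buyer":      ["yes", "no", "yes", "yes", "no", "yes"],
})

freq    = pd.crosstab(df["occupation"], df["buyer"])                              # frequencies
raw_pct = pd.crosstab(df["occupation"], df["buyer"], normalize="all") * 100      # raw (cell / grand total) %
col_pct = pd.crosstab(df["occupation"], df["buyer"], normalize="columns") * 100  # column %
row_pct = pd.crosstab(df["occupation"], df["buyer"], normalize="index") * 100    # row %

print(freq, raw_pct, col_pct, row_pct, sep="\n\n")
```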

12 Cross-Tabulations Using SPSS, the commands are ANALYZE, DESCRIPTIVE STATISTICS, CROSSTABS. You will find a detailed discussion of cross-tabulation tables in your text. Ch 18

13 Cross-Tabulations Ch 18

14 Cross-Tabulations When we have two nominal-scaled variables and we want to know if they are associated, we use cross-tabulations to examine the relationship and the Chi-Square test to test for the presence of a systematic relationship. In this situation (two variables, both with nominal scales), we are testing for a nonmonotonic relationship. Ch 18

15 Chi-Square Analysis Chi-square (χ²) analysis: the examination of frequencies for two nominal-scaled variables in a cross-tabulation table to determine whether the variables have a significant relationship. The null hypothesis is that the two variables are not related. Observed and expected frequencies: Ch 18
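The formula image on this slide does not survive in the transcript. The standard expected frequency for a cell, computed under the null hypothesis of no association, is:

$$E_{ij} = \frac{(\text{row } i \text{ total}) \times (\text{column } j \text{ total})}{\text{grand total}}$$

where the observed frequency $O_{ij}$ is simply the count actually found in cell $(i, j)$ of the sample.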

16 Cross-Tabulations Example: Let’s suppose we want to know if there is a relationship between studying and test performance and both of these variables are measured using nominal scales… Ch 18

17 Interpreting a Significant Cross-Tabulation Finding
If the chi-square analysis determines that you have a significant relationship (no support for the null hypothesis), you may use the following to determine the nature of the relationship: the column percentages table or the row percentages table. Ch 18

18 Cross-Tabulations Did you study for the midterm test? __yes __no
How did you perform on the midterm test? __pass __fail Now, let’s look at the data in a crosstabulation table: Ch 18

19 Cross-Tabulations Do you “see” a relationship? Do you “see” the “presence” of studying with the “presence” of passing? Do you “see” the “absence” of passing with the “absence” of studying? Congratulations! You have just “seen” a nonmonotonic relationship. Ch 18

20 Cross-Tabulations Bar charts can be used to “see” nonmonotonic relationships. Ch 18

21 Cross-Tabulations But while we can “see” this association, how do we know there is the presence of a systematic association? In other words, is this association statistically significant? Would it likely appear again and again if we sampled other students? We use the Chi-Square test to tell us if nonmonotonic relationships are really present. Ch 18

22 Cross-Tabulations Using SPSS, commands are ANALYZE, DESCRIPTIVE STATISTICS, CROSSTABS and within the CROSSTABS dialog box, STATISTICS, CHI-SQUARE. Ch 18
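For readers working outside SPSS, a minimal sketch of the same test with scipy. The counts below are hypothetical; they were chosen only so that 92% of those who studied pass and 70% of those who did not study fail, matching the percentages quoted later in these slides.

```python
# Illustrative sketch (hypothetical counts): chi-square test of independence
# for the study/pass cross-tabulation, using scipy.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: studied (yes, no); columns: result (pass, fail).
observed = np.array([[46,  4],
                     [15, 35]])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")
if p <= 0.05:
    print("Little support for the null of no association: a systematic relationship is present.")
else:
    print("No evidence of a systematic association.")
```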

23 Chi-Square Analysis Chi-square analysis: assesses nonmonotonic associations in cross-tabulation tables and is based upon differences between observed and expected frequencies Observed frequencies: the counts for each cell found in the sample Expected frequencies: the counts calculated under the null hypothesis of “no association” between the two variables under examination Ch 18

24 Chi-Square Analysis Computed Chi-Square values: Ch 18
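The computed value shown on this slide is not reproduced in the transcript; the standard formula, summed over every cell of the table, is:

$$\chi^2 = \sum_{\text{cells}} \frac{(\text{Observed} - \text{Expected})^2}{\text{Expected}}$$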

25 Chi-Square Analysis The chi-square distribution’s shape changes depending on the number of degrees of freedom The computed chi-square value is compared to a table value to determine statistical significance Ch 18
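As a sketch of that comparison: the degrees of freedom for a cross-tabulation are (rows − 1) × (columns − 1), and the table (critical) value can be looked up with scipy. The computed value below is hypothetical.

```python
# Illustrative sketch: comparing a computed chi-square value to the table
# (critical) value for a 2 x 2 cross-tabulation at the 0.05 level.
from scipy.stats import chi2

rows, cols = 2, 2
dof = (rows - 1) * (cols - 1)      # degrees of freedom = (r - 1)(c - 1)
table_value = chi2.ppf(0.95, dof)  # critical value at alpha = 0.05

computed = 40.4                    # hypothetical computed chi-square value
print(f"df = {dof}, table value = {table_value:.2f}")
print("Significant" if computed > table_value else "Not significant")
```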

26 Chi-Square Analysis How do I interpret a Chi-square result?
The chi-square analysis yields the probability of obtaining the observed pattern of cell frequencies if the null hypothesis of no association were true and the study were repeated many, many times with independent samples. If the P value is 0.05 or less, there is little support for the null hypothesis (no association). Therefore, we have a significant association…we have the PRESENCE of a systematic relationship between the two variables. Ch 18

27 Chi-Square Analysis Read the P value (Asymp. Sig.) across from Pearson Chi-Square. Since the P value is <0.05, we have a SIGNIFICANT association. Ch 18

28 Chi-Square Analysis How do I interpret a Chi-square result?
A significant chi-square result means the researcher should look at the cross-tabulation row and column percentages to “see” the association pattern SPSS will calculate row, column, (or both) percentages for you. See the CELLS box at the bottom of the CROSSTABS dialog box. Ch 18

29 Chi-Square Analysis Look at the ROW %’s: 92% of those who studied passed; almost 70% of those who didn’t study failed. “See” the relationship! Ch 18

30 Presence, Direction and Strength
Presence? Yes, our Chi-Square was significant. This means that the pattern we observe between studying/not studying and passing/failing is a systematic relationship that we would expect to see again if we ran our study many, many times. Direction? Nonmonotonic relationships do not have direction…only presence and absence. Ch 18

31 Presence, Direction and Strength
Strength? Since the Chi-Square only tells us presence, you must judge the strength by looking at the pattern. Don’t you think there is a “strong” relationship between studying/not studying and passing/failing? Ch 18

32 When can you use Crosstabs and Chi-Square test?
When you want to know if there is an association between two variables and… Both of those variables have nominal (or ordinal) scales Ch 18

35 Correlation Coefficients and Covariation
The correlation coefficient: an index number, constrained to fall within the range of −1.0 to +1.0. The correlation coefficient communicates both the strength and the direction of the linear relationship between two metric variables. Ch 18

36 Correlation Coefficients and Covariation
The amount of linear relationship between two variables is communicated by the absolute size of the correlation coefficient. The direction of the association is communicated by the sign (+, -) of the correlation coefficient. Covariation: is defined as the amount of change in one variable systematically associated with a change in another variable. Ch 18
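For reference, the usual sample formula behind “covariation” (not shown on the slide) is:

$$\operatorname{cov}(x, y) = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{n - 1}$$

A positive value means the two variables tend to move in the same direction; a negative value means they move in opposite directions.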

37 Measuring the Association Between Interval- or Ratio-Scaled Variables
In this case, we are trying to assess the presence, direction, and strength of a monotonic relationship. We are aided in doing this by the Pearson Product Moment Correlation. Using SPSS, the commands are ANALYZE, CORRELATE, BIVARIATE. Ch 18

38 Correlation Coefficients and Covariation
Covariation can be examined with use of a scatter diagram. Ch 18
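A minimal sketch of such a scatter diagram, using made-up training/sales figures rather than any data from the text:

```python
# Illustrative sketch (made-up data): a scatter diagram for eyeballing
# covariation between two metric variables.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
training_hours = rng.uniform(10, 60, 50)
sales = 2.5 * training_hours + rng.normal(0, 15, 50)  # roughly linear, positive covariation

plt.scatter(training_hours, sales)
plt.xlabel("Length of sales force training (hours)")
plt.ylabel("Sales")
plt.title("Scatter diagram: positive covariation")
plt.show()
```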

39 Pearson Product Moment Correlation Coefficient (r)
Presence? Determine if there is a significant association. The P value should be examined FIRST! If it is significant, there is a significant association. If not, there is no association. Direction? Look at the coefficient. Is it positive or negative? Ch 18
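A minimal sketch of that reading order outside SPSS, with hypothetical numbers:

```python
# Illustrative sketch (hypothetical data): check presence first (p value),
# then direction (sign of r), using scipy.
from scipy.stats import pearsonr

training_hours = [12, 20, 25, 31, 38, 44, 52, 60]
sales          = [40, 55, 60, 72, 85, 88, 100, 115]

r, p = pearsonr(training_hours, sales)
if p <= 0.05:
    direction = "positive" if r > 0 else "negative"
    print(f"Presence: significant (p = {p:.4f}); direction: {direction}; r = {r:.2f}")
else:
    print(f"No significant association (p = {p:.4f}); do not interpret r further.")
```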

40 Pearson Product Moment Correlation Coefficient (r)
Strength? The correlation coefficient (r) is a number ranging from −1.0 to +1.0; the closer it is to 1.00 (+ or −), the stronger the association. There are “rules of thumb”… Ch 18

41 Rules of Thumb Determining Strength of Association
A correlation coefficient’s size indicates the strength of association between two variables. The sign (+ or -) indicates the direction of the association Ch 18
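A small helper illustrating one common set of cutoffs for describing strength; the exact ranges are an assumption here and may differ from the rules of thumb given in the text.

```python
# Illustrative helper: describing strength of association from |r|.
# The cutoffs below are one common convention, not necessarily the text's.
def describe_strength(r: float) -> str:
    size = abs(r)
    if size >= 0.81:
        return "strong"
    if size >= 0.61:
        return "moderate"
    if size >= 0.41:
        return "weak"
    if size >= 0.21:
        return "very weak"
    return "little or no association"

print(describe_strength(-0.87))  # "strong" (the minus sign only gives direction)
```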

42 Pearson Product Moment Correlation Coefficient (r)
Pearson product moment correlation: measures the degree of linear association between the two variables. Ch 18
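The formula itself is not reproduced on the slide; the standard Pearson product moment correlation is:

$$r = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2}\,\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^2}}$$

which is the covariation of x and y divided by the product of their standard deviations.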

43 Pearson Product Moment Correlation Coefficient (r)
Special considerations in linear procedures: Correlation takes into account only the relationship between two variables, not interaction with other variables. Correlation does not demonstrate cause and effect. Correlations will not detect non-linear relationships between variables. Ch 18

44 When there is NO association, the P value for the Pearson r will be >0.05.
Ch 18

45 When there IS association, the P value for the Pearson r will be < or =0.05.
Examples: negative association between sales force rewards and turnover; positive association between length of sales force training and sales. Ch 18

46 Example What items are associated with preference for a waterfront view among restaurant patrons? Are preferences for unusual entrées, simple décor, and unusual desserts associated with preference for a waterfront view while dining? Since all of these variables are interval-scaled, we can run a Pearson Correlation to determine the association between each variable and the preference for a waterfront view. Ch 18
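A minimal sketch of that analysis in Python; the column names follow the example, but the ratings below are entirely hypothetical.

```python
# Illustrative sketch (hypothetical ratings): correlate each preference item
# with preference for a waterfront view.
import pandas as pd
from scipy.stats import pearsonr

survey = pd.DataFrame({
    "unusual_entrees":  [5, 4, 2, 5, 3, 1, 4, 2],
    "simple_decor":     [1, 2, 4, 1, 3, 5, 2, 4],
    "unusual_desserts": [5, 5, 2, 4, 3, 1, 4, 2],
    "waterfront_view":  [5, 4, 2, 5, 3, 2, 4, 1],
})

for item in ["unusual_entrees", "simple_decor", "unusual_desserts"]:
    r, p = pearsonr(survey[item], survey["waterfront_view"])
    print(f"{item}: r = {r:.2f}, p = {p:.4f}")
```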

47 Using SPSS, commands are ANALYZE, CORRELATE, BIVARIATE.
Ch 18

48 The output shows presence, direction and strength of the association.
Do you see any managerial significance to these associations? Ch 18

49 Concluding Remarks on Associative Analyses
Researchers will always test the null hypothesis of NO relationship or no correlation. When the null hypothesis is rejected, the researcher may have a managerially important relationship to share with the manager. Ch 18

