
Slide 1: SPSS output & analysis

Slide 2: The Regression Equation
A line in a two-dimensional (two-variable) space is defined by the equation Y = a + b*X: the Y variable is expressed in terms of a constant (a) plus a slope (b) times the X variable.
- The constant is also referred to as the intercept, and the slope as the regression coefficient or B coefficient.
- The slope is the rate of change; the y-intercept is the starting value.
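The slide's equation can be sketched in a few lines of Python (used here purely for illustration; the intercept and slope values are hypothetical, not from the CEO data):

```python
# Minimal sketch of the regression line Y = a + b*X.
def predict(a, b, x):
    """Predicted Y for intercept a, slope b, and predictor value x."""
    return a + b * x

# With a = 2 and b = 0.5, each 1-unit increase in X raises the prediction
# by 0.5 (the rate of change); at X = 0 the prediction equals the
# intercept (the starting value).
print(predict(2.0, 0.5, 0))   # 2.0
print(predict(2.0, 0.5, 10))  # 7.0
```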

Slide 3: Multivariate regression - CEO
For multiple independent variables, the equation takes the form Y = a + b1*X1 + b2*X2 + ... + bp*Xp.
SPSS can calculate the values in a bivariate or multivariate regression.
SPSS multiple regression analysis of the CEO file:
- Dependent variable: Total compensation
- Independent variables: Sales, Profits, Assets
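The multivariate form generalizes the bivariate one by summing over all predictors. A small sketch (coefficients and predictor values below are made up for illustration, not SPSS output):

```python
# Sketch of a multiple-regression prediction: a + b1*x1 + ... + bp*xp.
def predict_multi(a, coefs, xs):
    """Intercept a plus the sum of each coefficient times its predictor."""
    return a + sum(b * x for b, x in zip(coefs, xs))

# Two hypothetical predictors: 1 + 2*4 + 3*5 = 24.
print(predict_multi(1.0, [2.0, 3.0], [4.0, 5.0]))  # 24.0
```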

Slide 4: Descriptive Statistics
The variance is a measure of how spread out a distribution is. It is computed as the average squared deviation of each number from its mean.
For example, for the numbers 1, 2, and 3, the mean is 2 and the variance is:
σ² = [(1−2)² + (2−2)² + (3−2)²] / 3 = 2/3 ≈ 0.667
The formula for the variance in a population is σ² = Σ(X − μ)² / N, where μ is the mean and N is the number of scores.
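The population-variance formula translates directly into code; running it on the slide's example data 1, 2, 3 reproduces the result of 2/3:

```python
# Population variance: average squared deviation from the mean,
# i.e. sigma^2 = sum((x - mu)^2) / N.
def population_variance(xs):
    mu = sum(xs) / len(xs)
    return sum((x - mu) ** 2 for x in xs) / len(xs)

# For 1, 2, 3: mean = 2, squared deviations = 1, 0, 1, so variance = 2/3.
print(population_variance([1, 2, 3]))  # 0.666...
```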

Slide 5: Standard Deviation
The standard deviation is the square root of the variance. It is the most commonly used measure of spread.
In a normal distribution, about 68% of the scores are within one standard deviation of the mean and about 95% of the scores are within two standard deviations of the mean.
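Continuing the slide-4 example, the standard deviation is just the square root of the variance computed there:

```python
import math

# Population standard deviation: square root of the population variance.
def population_sd(xs):
    mu = sum(xs) / len(xs)
    return math.sqrt(sum((x - mu) ** 2 for x in xs) / len(xs))

# For 1, 2, 3 the variance is 2/3, so the SD is sqrt(2/3) ~ 0.816.
print(population_sd([1, 2, 3]))
```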

Slide 6: Further Output
- Correlations
- Variables entered/removed
- Model Summary:
  R = multiple correlation coefficient
  R Square = the effect size
  Adjusted R Square = the estimate of the proportion of variance accounted for by the regression; it adjusts for degrees of freedom in the model
- Standard error: the standard error of a statistic is the standard deviation of the sampling distribution of that statistic. Standard errors are important because they reflect how much sampling fluctuation a statistic will show.
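The R Square and Adjusted R Square figures in the Model Summary can be recomputed by hand from the observed values and the model's predictions. A sketch, with hypothetical data (not the CEO file):

```python
# R^2 = 1 - SS_residual / SS_total: the proportion of variance in y
# accounted for by the model's predictions yhat.
def r_squared(y, yhat):
    ybar = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, yhat))
    ss_tot = sum((a - ybar) ** 2 for a in y)
    return 1 - ss_res / ss_tot

# Adjusted R^2 penalizes extra predictors via the degrees of freedom:
# n = number of cases, p = number of predictors.
def adjusted_r_squared(r2, n, p):
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

y = [1, 2, 3, 4]
yhat = [1.1, 1.9, 3.2, 3.8]        # hypothetical model predictions
r2 = r_squared(y, yhat)
print(round(r2, 4))                          # 0.98
print(round(adjusted_r_squared(r2, 4, 1), 4))  # 0.97
```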

Slide 7: ANOVA (Analysis of Variance)
- Df = degrees of freedom
- F-statistic
- Significance level
An F-test is used to test whether the standard deviations of two populations are equal. The test can be two-tailed or one-tailed:
- The two-tailed version tests against the alternative that the standard deviations are not equal.
- The one-tailed version tests in one direction only: that the standard deviation of the first population is either greater than or less than (but not both) the standard deviation of the second population.
The choice is determined by the problem. For example, if we are testing a new process, we may only be interested in knowing whether the new process is less variable than the old process.
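The F statistic for comparing two variances is simply the ratio of the sample variances. A dependency-free sketch (the critical value you would compare it against comes from the F distribution, which is omitted here; the data are hypothetical):

```python
# F statistic for the equality-of-variances test: the ratio of the
# two sample variances (n - 1 denominators).
def f_statistic(sample1, sample2):
    def sample_var(xs):
        mu = sum(xs) / len(xs)
        return sum((x - mu) ** 2 for x in xs) / (len(xs) - 1)
    return sample_var(sample1) / sample_var(sample2)

# var([1,2,3]) = 1 and var([2,4,6]) = 4, so F = 0.25; an F far from 1
# is evidence the two spreads differ.
print(f_statistic([1, 2, 3], [2, 4, 6]))  # 0.25
```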

Slide 8: Significance Level - hypothesis testing
The significance level is the criterion used for rejecting the null hypothesis. The process:
- The difference between the results of the experiment and the null hypothesis is determined.
- Assuming the null hypothesis is true, the probability of a difference that large or larger is computed.
- This probability is compared to the significance level.
If the probability is less than or equal to the significance level, the null hypothesis is rejected and the outcome is statistically significant.
We usually employ either the .05 level (sometimes called the 5% level) or the .01 level (1% level), although the choice of level is largely subjective. The lower the significance level, the more the data must diverge from the null hypothesis to be significant; therefore, the .01 level is more conservative than the .05 level.
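The decision rule above is a one-line comparison, shown here with an illustrative p-value of 0.03:

```python
# Reject the null hypothesis when the p-value is at or below
# the chosen significance level (alpha).
def reject_null(p_value, alpha=0.05):
    return p_value <= alpha

# A p-value of 0.03 is significant at the .05 level but not at the
# stricter (more conservative) .01 level.
print(reject_null(0.03))        # True
print(reject_null(0.03, 0.01))  # False
```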

Slide 9: Coefficients output
- B = the constant and regression coefficients for determining the regression equation
- Beta = the standardized coefficients, i.e. the coefficients when all variables are expressed in standardized form
- t and Sig. = t-tests and p-values for the regression coefficients and the constant
This table is the key to the regression analysis.
- Casewise diagnostics / residuals / charts
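The Beta column relates to the B column by a simple rescaling: each unstandardized coefficient is multiplied by the ratio of its predictor's standard deviation to the outcome's standard deviation, which puts predictors measured in different units on a common scale. A sketch with hypothetical values:

```python
# Standardized (beta) coefficient: beta = b * sd(X) / sd(Y).
def beta_coefficient(b, sd_x, sd_y):
    return b * sd_x / sd_y

# Hypothetical: B = 2.0, sd(X) = 3.0, sd(Y) = 6.0 gives beta = 1.0.
print(beta_coefficient(2.0, 3.0, 6.0))  # 1.0
```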

Slide 10: Example use of model
Y = a + b1*X1 + b2*X2 + b3*X3, where Sales = 57813, Profits = 5372, Assets = 59920:
Totcomp = 6189.062 + (57813 * 0.009) + (5372 * 2.638) + (59920 * 0.007)
Totcomp = 6189.062 + 520.317 + 14171.336 + 419.44
Totcomp = 21300.155
Actual Totcomp = 24424
Difference = 3123.845 (the residual)
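The worked example above can be reproduced in code using the slide's fitted coefficients, confirming the predicted value and the residual:

```python
# Slide 10's fitted model: Totcomp = a + b_sales*Sales + b_profits*Profits
# + b_assets*Assets, with the coefficients and case values from the slide.
a = 6189.062
coefs = {"sales": 0.009, "profits": 2.638, "assets": 0.007}
case = {"sales": 57813, "profits": 5372, "assets": 59920}

predicted = a + sum(coefs[k] * case[k] for k in coefs)
actual = 24424
residual = actual - predicted  # the residual for this case

print(round(predicted, 3))  # 21300.155
print(round(residual, 3))   # 3123.845
```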
