
Presentation transcript:

1 SPSS output & analysis

2 The Regression Equation
A line in a two-dimensional (two-variable) space is defined by the equation Y = a + b*X. The Y variable can be expressed in terms of a constant (a) plus a slope (b) times the X variable. The constant is also referred to as the intercept, and the slope as the regression coefficient or B coefficient. The slope is the rate of change; the y-intercept is the starting value.
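The line Y = a + b*X can be estimated by least squares outside SPSS as well. A minimal sketch with NumPy, using made-up X and Y values (not data from the slides):

```python
import numpy as np

# Hypothetical data: Y rises roughly linearly with X
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

# Least-squares fit of Y = a + b*X; np.polyfit returns [slope, intercept]
b, a = np.polyfit(X, Y, 1)

# Use the fitted constant and slope to predict Y for a new X value
y_hat = a + b * 6.0
print(a, b, y_hat)
```

Here b is the rate of change and a is the value of Y when X = 0, matching the intercept/slope reading above.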

3 Multivariate regression - CEO
For multiple independent variables, the equation takes the form Y = a + b1*X1 + b2*X2 + ... + bp*Xp. SPSS can calculate the values in a bivariate or multivariate regression.
SPSS multiple regression analysis, CEO file:
Dependent variable - Total compensation
Independent variables - Sales, Profits, Assets
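The same multiple-regression fit SPSS performs can be sketched with NumPy's least-squares solver. The mini-dataset below is invented for illustration (it is not the actual CEO file); only the variable roles match the slide:

```python
import numpy as np

# Hypothetical mini-dataset mirroring the CEO file layout:
# columns are Sales, Profits, Assets; y is Total compensation
X = np.array([
    [57813.0, 5372.0, 59920.0],
    [12000.0,  900.0, 15000.0],
    [80000.0, 7000.0, 90000.0],
    [30000.0, 2500.0, 40000.0],
    [50000.0, 4000.0, 65000.0],
])
y = np.array([15000.0, 4000.0, 21000.0, 9000.0, 13500.0])

# Prepend a column of ones so lstsq also estimates the constant a
X1 = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
a, b = coef[0], coef[1:]

y_hat = a + X @ b  # fitted values for each case
print(a, b)
```

The solver returns the constant a and one slope per independent variable, i.e. the b1, b2, b3 of the equation above.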

4 Descriptive Statistics
The variance is a measure of how spread out a distribution is. It is computed as the average squared deviation of each number from its mean. For example, for the numbers 1, 2, and 3, the mean is 2 and the variance is:
σ² = [(1−2)² + (2−2)² + (3−2)²] / 3 = 2/3 ≈ 0.667
The formula for the variance in a population is
σ² = Σ(X − μ)² / N
where μ is the mean and N is the number of scores.
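The 1, 2, 3 example can be checked directly; the manual formula and Python's built-in population variance agree:

```python
from statistics import mean, pvariance

scores = [1, 2, 3]
mu = mean(scores)  # 2

# Average squared deviation from the mean: sigma^2 = sum((X - mu)^2) / N
var = sum((x - mu) ** 2 for x in scores) / len(scores)

# statistics.pvariance computes the same population variance
assert abs(var - pvariance(scores)) < 1e-12
print(var)  # 0.666... = 2/3
```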

5 Standard Deviation
The standard deviation is the square root of the variance. It is the most commonly used measure of spread. In a normal distribution, about 68% of the scores are within one standard deviation of the mean and about 95% of the scores are within two standard deviations of the mean.
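The 68% / 95% rule can be verified empirically. A sketch using simulated standard-normal data (the sample size and seed are arbitrary choices, not from the slides):

```python
import random
import statistics

# Simulate draws from a normal distribution with mean 0, SD 1
random.seed(0)
data = [random.gauss(0, 1) for _ in range(100_000)]

mu = statistics.mean(data)
sd = statistics.pstdev(data)  # population SD = sqrt(population variance)

# Fraction of scores within one and two standard deviations of the mean
within1 = sum(abs(x - mu) <= 1 * sd for x in data) / len(data)
within2 = sum(abs(x - mu) <= 2 * sd for x in data) / len(data)
print(within1, within2)  # roughly 0.68 and 0.95
```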

6 Further Output
Correlations
Variables entered/removed
Model Summary:
R = multiple correlation coefficient
R Square = the effect size
Adjusted R Square = the estimate of the proportion of variance accounted for by the regression, i.e. it adjusts for the degrees of freedom in the model
Standard error: the standard error of a statistic is the standard deviation of the sampling distribution of that statistic. Standard errors are important because they reflect how much sampling fluctuation a statistic will show.
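R Square and Adjusted R Square, as reported in the SPSS Model Summary table, can be computed from the sums of squares. The numbers below (n, number of predictors, sums of squares) are made up purely to show the formulas:

```python
# Hypothetical values, chosen only to illustrate the formulas
n = 50              # sample size
k = 3               # number of predictors (e.g. Sales, Profits, Assets)
ss_total = 1200.0   # total sum of squares
ss_residual = 300.0 # residual (error) sum of squares

# R Square: proportion of variance accounted for by the regression
r_square = 1 - ss_residual / ss_total

# Adjusted R Square: the same quantity adjusted for degrees of freedom
adj_r_square = 1 - (1 - r_square) * (n - 1) / (n - k - 1)
print(r_square, adj_r_square)
```

The adjustment shrinks R Square toward zero as more predictors are added relative to the sample size, which is why it is the better estimate of the population proportion of variance explained.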

7 ANOVA (Analysis of Variance)
Df = degrees of freedom
F-statistic
Significance level
An F-test is used to test whether the standard deviations of two populations are equal. The test can be two-tailed or one-tailed. The two-tailed version tests against the alternative that the standard deviations are not equal. The one-tailed version tests in one direction only: that the standard deviation of the first population is either greater than or less than (but not both) that of the second population. The choice is determined by the problem, e.g. if we are testing a new process, we may only be interested in knowing whether the new process is less variable than the old process.
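A one-tailed F-test of two variances can be sketched with SciPy. The two samples below are simulated (an "old" and "new" process with different spreads are an assumption for illustration, echoing the slide's example):

```python
import numpy as np
from scipy import stats

# Simulated samples: old process is more variable than the new one
rng = np.random.default_rng(0)
old = rng.normal(0, 2.0, size=30)  # hypothetical old-process measurements
new = rng.normal(0, 1.0, size=25)  # hypothetical new-process measurements

# F is the ratio of the two sample variances
F = np.var(old, ddof=1) / np.var(new, ddof=1)
df1, df2 = len(old) - 1, len(new) - 1

# One-tailed p-value: is the old process MORE variable than the new one?
p_one_tailed = stats.f.sf(F, df1, df2)
print(F, p_one_tailed)
```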

8 Significance Level - hypothesis testing
The significance level is the criterion used for rejecting the null hypothesis. Process:
the difference between the results of the experiment and the null hypothesis is determined
assuming the null hypothesis is true, the probability of a difference that large or larger is computed
this probability is compared to the significance level
If the probability is less than or equal to the significance level, then the null hypothesis is rejected and the outcome is statistically significant. We usually employ either the .05 level (sometimes called the 5% level) or the .01 level (1% level), although the choice of levels is largely subjective. The lower the significance level, the more the data must diverge from the null hypothesis to be significant. Therefore, the .01 level is more conservative than the .05 level.
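The decision rule above reduces to a single comparison. A minimal sketch (the function name and example p-values are illustrative):

```python
# Sketch of the reject/retain decision for a given p-value and alpha
def decide(p_value: float, alpha: float = 0.05) -> str:
    """Reject the null hypothesis when the p-value is <= the significance level."""
    return "reject H0" if p_value <= alpha else "fail to reject H0"

# The same p-value can be significant at .05 but not at the stricter .01 level
print(decide(0.03))        # reject H0
print(decide(0.03, 0.01))  # fail to reject H0
```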

9 Coefficients output
B = the constant and regression coefficients for determining the regression equation
Beta = standardized coefficients, i.e. the coefficients when all variables are expressed in standardized form
T and sig. = t-tests and p-values for the regression coefficients and the constant; this is the key to the regression analysis
Casewise diagnostics / residuals / charts
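The t and sig. columns come from dividing each coefficient by its standard error. A sketch with made-up numbers (the coefficient, standard error, and degrees of freedom below are hypothetical, not from the CEO output):

```python
from scipy import stats

# Hypothetical values for one row of the Coefficients table
b = 2.638   # estimated regression coefficient
se = 0.512  # its standard error
df = 46     # residual degrees of freedom, n - k - 1

# t statistic: coefficient divided by its standard error
t = b / se

# Two-tailed p-value (the "Sig." column)
p = 2 * stats.t.sf(abs(t), df)
print(t, p)
```

A small p-value here means the coefficient differs significantly from zero, i.e. that predictor contributes to the regression.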

10 Example use of model
Y = a + b1*X1 + b2*X2 + b3*X3
Where Sales = 57813, Profits = 5372, Assets = 59920
Totcomp = (57813*0.009) + (5372*2.638) + (59920*0.007)
Totcomp = 520.317 + 14171.336 + 419.44
Totcomp = 15111.093
The difference between the actual Totcomp and this predicted Totcomp is the residual.
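The prediction above is plain arithmetic and can be reproduced directly (the constant term is omitted here, as in the computation on the slide):

```python
# Values and coefficients from the slide's worked example
sales, profits, assets = 57813, 5372, 59920
b1, b2, b3 = 0.009, 2.638, 0.007

# Plug the case's values into the fitted equation
totcomp = b1 * sales + b2 * profits + b3 * assets
print(round(totcomp, 3))  # 15111.093

# The residual is the actual Totcomp minus this predicted value
```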