Simple Linear Regression and Correlation (Part II) By Asst. Prof. Dr. Min Aung.

Regression equation
Regression equation = Least-squares equation = the equation of the best-fitting line along which the sample points scatter = the straight line whose total squared vertical distance from all the scatter points is a minimum (hence "least squares").
Ŷ = a + bX
On a calculator, find B and recall it; then find A and recall it. b is given by Formula 1 (the first formula) and a by Formula 1 (the second formula).
Ŷ is the point estimate for Y given by the regression equation.
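The least-squares computation the calculator performs can be sketched directly. This is a minimal illustration with made-up sample data (the X and Y values below are assumptions, not from the slides); it uses the standard least-squares formulas b = Σ(X − X̄)(Y − Ȳ) ÷ Σ(X − X̄)² and a = Ȳ − bX̄:

```python
# Least-squares slope b and intercept a for a small hypothetical sample.
# X and Y are illustrative data, not taken from the lecture.
X = [1, 2, 3, 4, 5]
Y = [2, 2, 3, 4, 4]

n = len(X)
x_bar = sum(X) / n
y_bar = sum(Y) / n

# b = sum((X - x_bar)(Y - y_bar)) / sum((X - x_bar)^2)
sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y))
sxx = sum((x - x_bar) ** 2 for x in X)
b = sxy / sxx
a = y_bar - b * x_bar

print(a, b)        # intercept 1.2, slope 0.6 -> regression equation Y-hat = 1.2 + 0.6X

# Y-hat is the point estimate of Y at a given X, here X = 3
y_hat = a + b * 3
print(y_hat)       # 3.0
```

For these data the fitted equation is Ŷ = 1.2 + 0.6X, and substituting any X gives the point estimate Ŷ.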

Regression Line (1)
Regression line = Least-squares line.
Substitute the smallest X value into the regression equation Ŷ = a + bX and compute Ŷ. This gives the pair (smallest X, corresponding Ŷ).
Substitute the largest X value into the regression equation Ŷ = a + bX and compute Ŷ. This gives the pair (largest X, corresponding Ŷ).
Plot the two points (smallest X, corresponding Ŷ) and (largest X, corresponding Ŷ), and connect them with a straight line segment.
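The recipe above reduces to evaluating the fitted equation at the two extreme X values. A minimal sketch, assuming the hypothetical fit Ŷ = 1.2 + 0.6X over X values 1 through 5 (illustrative numbers, not from the slides):

```python
# Endpoints of the regression line segment: evaluate Y-hat = a + bX
# at the smallest and the largest X, then join the two points.
a, b = 1.2, 0.6          # hypothetical fitted intercept and slope
X = [1, 2, 3, 4, 5]      # hypothetical sample X values

x_min, x_max = min(X), max(X)
p1 = (x_min, a + b * x_min)   # (smallest X, corresponding Y-hat)
p2 = (x_max, a + b * x_max)   # (largest X, corresponding Y-hat)
print(p1, p2)                 # connect these two points with a straight segment
```

Because the fitted line is straight, these two points determine it completely; every other fitted value lies on the segment between them.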

Regression Line (2)
[Figure: X–Y axes with the regression line segment drawn through the points (2, 1) and (4, 3).]

Constant or Y-intercept
In the regression equation, A is called the constant or Y-intercept. A is the value of Ŷ when X = 0.
Interpretation of A: if X is 0 units, the estimated Y is A units.
[Figure: X–Y axes showing the line crossing the Y-axis at (0, 1), with the points (2, 2) and (4, 3) marked.]
1 is called the Y-intercept of the line Ŷ = 1 + 0.5X.
1 is called the constant of the equation Ŷ = 1 + 0.5X.

Regression Coefficient or Slope
In the regression equation, B is called the regression coefficient or slope. B is the value by which Ŷ increases when X increases by 1 unit.
Interpretation of B: if X increases by 1 unit, the estimated Y will increase by B units.
[Figure: X–Y axes showing the line through the points (2, 2) and (4, 3).]
0.5 is called the regression coefficient of the equation Ŷ = 1 + 0.5X and the slope of the line.
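The slope interpretation can be checked numerically: with the slides' line Ŷ = 1 + 0.5X, raising X by 1 unit raises the estimate by exactly B = 0.5, no matter where on the line you start.

```python
# Slope interpretation check for the line Y-hat = 1 + 0.5X:
# a 1-unit increase in X increases the estimate by exactly B = 0.5.
a, b = 1.0, 0.5

def y_hat(x):
    return a + b * x

diff = y_hat(6) - y_hat(5)   # same for any pair of X values 1 unit apart
print(diff)                  # 0.5, the regression coefficient
```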

Interval Estimates
Compute Sₑ by Formula 7, then use Sₑ and Formula 4 to compute S_b.
Interval estimate for β₁: b ± t·S_b, where t is found in the t-table with df = n − 2, two-tailed.
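The slides' Formula 7 and Formula 4 are not reproduced here, so this sketch assumes the standard definitions Sₑ = √(SSE ÷ (n − 2)) and S_b = Sₑ ÷ √(Σ(X − X̄)²), with illustrative data and the t-table value hard-coded:

```python
import math

# 95% confidence interval for the slope beta-1: b ± t * S_b.
# Assumed standard formulas: S_e = sqrt(SSE / (n-2)), S_b = S_e / sqrt(Sxx).
X = [1, 2, 3, 4, 5]   # hypothetical sample
Y = [2, 2, 3, 4, 4]
n = len(X)
x_bar, y_bar = sum(X) / n, sum(Y) / n
sxx = sum((x - x_bar) ** 2 for x in X)
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y)) / sxx
a = y_bar - b * x_bar

sse = sum((y - (a + b * x)) ** 2 for x, y in zip(X, Y))
s_e = math.sqrt(sse / (n - 2))   # standard error of estimate (Formula 7's role)
s_b = s_e / math.sqrt(sxx)       # standard error of the slope (Formula 4's role)

t = 3.182   # t-table value for df = n - 2 = 3, two-tailed, 95% confidence
lower, upper = b - t * s_b, b + t * s_b
print(lower, upper)   # interval estimate for beta-1, centered on b
```

The interval is centered on b; a wider interval (larger S_b or t) means a less precise estimate of β₁.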

ANOVA Table
ANOVA: An = Analysis, O = of, V = Variance.
SST = (total variation of the Y values from Ῡ) = Σ(Y − Ῡ)²
SSR = (total variation of the Ŷ values from Ῡ) = Σ(Ŷ − Ῡ)²
SSE = (total variation of the Y values from Ŷ) = Σ(Y − Ŷ)²

Table structure:

Source       SS                Df       MS                    F
Regression   SSR               1        MSR = SSR ÷ 1         MSR ÷ MSE
Error        SSE = SST − SSR   n − 2    MSE = SSE ÷ (n − 2)
Total        SST               n − 1

In Formula 9, F = MSR ÷ MSE: MSR is the numerator and MSE is the denominator.
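The ANOVA quantities can be computed directly from the definitions above. A minimal sketch, reusing the same hypothetical sample as the earlier examples (the data are assumptions, not from the slides):

```python
# ANOVA quantities for a simple linear regression, from the definitions:
# SST = sum (Y - Y-bar)^2, SSR = sum (Y-hat - Y-bar)^2, SSE = sum (Y - Y-hat)^2.
X = [1, 2, 3, 4, 5]   # hypothetical sample
Y = [2, 2, 3, 4, 4]
n = len(X)
x_bar, y_bar = sum(X) / n, sum(Y) / n
sxx = sum((x - x_bar) ** 2 for x in X)
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y)) / sxx
a = y_bar - b * x_bar
Y_hat = [a + b * x for x in X]

sst = sum((y - y_bar) ** 2 for y in Y)                # total variation
ssr = sum((yh - y_bar) ** 2 for yh in Y_hat)          # explained by regression
sse = sum((y - yh) ** 2 for y, yh in zip(Y, Y_hat))   # unexplained (error)

msr = ssr / 1          # df = 1
mse = sse / (n - 2)    # df = n - 2
f = msr / mse          # F statistic, numerator MSR over denominator MSE

print(sst, ssr, sse)   # note SST = SSR + SSE
print(f)
```

For these data SST = 4, SSR = 3.6, SSE = 0.4, so the identity SST = SSR + SSE holds and F = 27.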

Three Statistics from the ANOVA Table
F = MSR ÷ MSE: the larger F is, the more likely the regression model is significant.
R² = SSR ÷ SST: the larger R² is, the better the regression model predicts Y values.
Sₑ = √MSE: the smaller Sₑ is, the more precise the interval estimates for Y values are.
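The three statistics follow from the table's entries. A short sketch, assuming the illustrative values SSR = 3.6, SSE = 0.4, n = 5 from the hypothetical example used earlier (not from the slides):

```python
import math

# Three summary statistics from the ANOVA quantities.
ssr, sse, n = 3.6, 0.4, 5   # hypothetical values
sst = ssr + sse
msr, mse = ssr / 1, sse / (n - 2)

f = msr / mse            # overall model significance statistic
r2 = ssr / sst           # coefficient of determination
s_e = math.sqrt(mse)     # standard error of estimate
print(f, r2, s_e)        # F = 27, R-squared = 0.9, S_e about 0.365
```

Here R² = 0.9 means 90% of the variation in Y is explained by the regression, and the small Sₑ indicates relatively precise interval estimates.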