EQT 272 PROBABILITY AND STATISTICS


EQT 272 PROBABILITY AND STATISTICS
ROHANA BINTI ABDUL HAMID
INSTITUTE FOR ENGINEERING MATHEMATICS (IMK)
UNIVERSITI MALAYSIA PERLIS

CHAPTER 5: INTRODUCTION TO LINEAR REGRESSION

CHAPTER OUTLINE:
5.1 INTRODUCTION
5.2 SCATTER PLOTS
5.3 LINEAR REGRESSION MODEL
5.4 LEAST SQUARES METHOD
5.5 COEFFICIENT OF DETERMINATION
5.6 CORRELATION
5.7 TEST OF SIGNIFICANCE
5.8 ANALYSIS OF VARIANCE (ANOVA)

5.1 INTRODUCTION TO REGRESSION
Regression is a statistical procedure for establishing the relationship between two or more variables. This is done by fitting a linear equation to the observed data. The regression line is then used by the researcher to see the trend and to predict values of the dependent variable.
There are two types of regression:
Simple (2 variables)
Multiple (more than 2 variables)

THE SIMPLE LINEAR REGRESSION MODEL is an equation that describes a dependent variable (Y) in terms of an independent variable (X) plus random error:
Y = β₀ + β₁X + ε
where
β₀ = intercept of the line with the Y-axis
β₁ = slope of the line
ε = random error
The random error ε is the difference between an observed data point and the deterministic value. The regression line is estimated from the collected data by fitting a straight line to the data set and obtaining the equation of that straight line,
ŷ = b₀ + b₁x
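To make the model concrete, here is a minimal Python sketch (not from the slides) that simulates observations from Y = β₀ + β₁X + ε; the values of β₀, β₁, the error standard deviation and the sample size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

beta0, beta1, sigma = 2.0, 0.5, 1.0   # assumed intercept, slope and error SD
x = rng.uniform(0, 10, size=30)       # independent variable X
eps = rng.normal(0, sigma, size=30)   # random error term
y = beta0 + beta1 * x + eps           # dependent variable Y = beta0 + beta1*X + error

print(x[:5], y[:5])
```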

Example 5.1:
1) A nutritionist studying weight loss programs might want to find out whether reducing carbohydrate intake can help a person lose weight.
a) X is the carbohydrate intake (independent variable).
b) Y is the weight (dependent variable).
2) An entrepreneur might want to know whether increasing the cost of packaging his new product will have an effect on the sales volume.
a) X is the packaging cost (independent variable).
b) Y is the sales volume (dependent variable).

5.2 SCATTER PLOTS
A scatter plot is a graph of ordered pairs (x, y). The purpose of a scatter plot is to describe the nature of the relationship between the independent variable X and the dependent variable Y in a visual way. The independent variable x is plotted on the horizontal axis and the dependent variable y is plotted on the vertical axis.

SCATTER DIAGRAM: Positive Linear Relationship. [Figure: regression line E(y) against x, with intercept b0 and positive slope b1.]

SCATTER DIAGRAM: Negative Linear Relationship. [Figure: regression line E(y) against x, with intercept b0 and negative slope b1.]

SCATTER DIAGRAM: No Relationship. [Figure: regression line E(y) against x, with intercept b0 and slope b1 equal to 0, i.e. a horizontal line.]
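A scatter plot like the ones described above can be drawn with a few lines of Python; this is a generic matplotlib sketch, and the (x, y) values are illustrative placeholders, not data from the slides.

```python
import matplotlib.pyplot as plt

# illustrative (x, y) pairs; replace with the observed data
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.1, 2.9, 3.8, 5.2, 5.9, 7.1, 8.0, 8.8]

plt.scatter(x, y)                       # independent variable on the horizontal axis
plt.xlabel("x (independent variable)")
plt.ylabel("y (dependent variable)")
plt.title("Scatter plot of y against x")
plt.show()
```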

5.3 LINEAR REGRESSION MODEL
A linear regression line can be developed from a freehand plot of the data.
Example 5.2: The given table contains values for two variables, X and Y. Plot the given data and draw a freehand estimated regression line.

5.4 LEAST SQUARES METHOD
The least squares method is commonly used to determine the values of b₀ and b₁ that ensure the best fit of the estimated regression line to the sample data points. The straight line fitted to the data set is the line
ŷ = b₀ + b₁x

LEAST SQUARES METHOD
Theorem 10.1: Given the sample data (xᵢ, yᵢ), i = 1, …, n, the coefficients of the least squares line are:
i) y-Intercept for the Estimated Regression Equation,
b₀ = ȳ − b₁x̄
where x̄ and ȳ are the means of x and y respectively.

LEAST SQUARES METHOD
ii) Slope for the Estimated Regression Equation,
b₁ = Sxy / Sxx
where
Sxy = Σxᵢyᵢ − (Σxᵢ)(Σyᵢ)/n
Sxx = Σxᵢ² − (Σxᵢ)²/n

LEAST SQUARES METHOD
Given any value of x, the predicted value of the dependent variable, ŷ, can be found by substituting x into the equation ŷ = b₀ + b₁x.
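The least squares computations above can be written out directly in Python with numpy; this is a minimal sketch, and the data arrays and the prediction point x = 3.5 are illustrative placeholders, not the slide data.

```python
import numpy as np

# illustrative sample data; replace with the observed (x, y) pairs
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])
n = len(x)

Sxy = np.sum(x * y) - np.sum(x) * np.sum(y) / n   # corrected sum of cross-products
Sxx = np.sum(x ** 2) - np.sum(x) ** 2 / n         # corrected sum of squares of x

b1 = Sxy / Sxx                  # slope
b0 = y.mean() - b1 * x.mean()   # y-intercept

y_hat = b0 + b1 * 3.5           # predicted y at x = 3.5
print(b0, b1, y_hat)
```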

Example 5.3: Students' scores in history
The data below represent scores obtained by ten primary school students before and after they were taken on a tour to the museum (which is supposed to increase their interest in history).
Before, x: 65 63 76 46 68 72 57 36 96
After, y: 66 86 48 71 42 87
a) Fit a linear regression model with "before" as the explanatory variable and "after" as the dependent variable.
b) Predict the score a student would obtain "after" if he scored 60 marks "before".

5.5 COEFFICIENT OF DETERMINATION (r²)
The coefficient of determination is a measure of the variation of the dependent variable (Y) that is explained by the regression line and the independent variable (X). The symbol for the coefficient of determination is r² or R².
If r = 0.90, then r² = 0.81. It means that 81% of the variation in the dependent variable (Y) is accounted for by the variation in the independent variable (X). The rest of the variation, 0.19 or 19%, is unexplained and is called the coefficient of nondetermination. The formula for the coefficient of nondetermination is 1 − r².

COEFFICIENT OF DETERMINATION (r²)
Relationship among SST, SSR, SSE:
SST = SSR + SSE
where:
SST = total sum of squares
SSR = sum of squares due to regression
SSE = sum of squares due to error
The coefficient of determination is:
r² = SSR / SST
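Continuing the illustrative sketch from the least squares block above (same placeholder x and y arrays, not the slide data), the sums of squares and r² could be computed as follows.

```python
import numpy as np

# same illustrative data and fitted coefficients as in the earlier sketch
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])
n = len(x)
b1 = (np.sum(x * y) - x.sum() * y.sum() / n) / (np.sum(x ** 2) - x.sum() ** 2 / n)
b0 = y.mean() - b1 * x.mean()

y_hat = b0 + b1 * x                     # fitted values on the regression line
SST = np.sum((y - y.mean()) ** 2)       # total sum of squares
SSR = np.sum((y_hat - y.mean()) ** 2)   # sum of squares due to regression
SSE = np.sum((y - y_hat) ** 2)          # sum of squares due to error

r_squared = SSR / SST                   # coefficient of determination
print(SST, SSR + SSE, r_squared)        # SST should equal SSR + SSE (up to rounding)
```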

Example 5.4
If r = 0.919, find the value of r² and explain the value.
Solution: r² = (0.919)² ≈ 0.84. It means that 84% of the variation in the dependent variable (Y) is explained by the variation in the independent variable (X).

5.6 CORRELATION (r)
Correlation measures the strength of the linear relationship between two variables. It is also known as Pearson's product moment coefficient of correlation. The symbol for the sample coefficient of correlation is r; for the population it is ρ.
Formula:
r = Sxy / √(Sxx Syy)
where Syy = Σyᵢ² − (Σyᵢ)²/n.

Properties of r:
Values of r close to 1 imply there is a strong positive linear relationship between x and y.
Values of r close to −1 imply there is a strong negative linear relationship between x and y.
Values of r close to 0 imply little or no linear relationship between x and y.
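As a small sketch, r can be computed in Python two ways: directly from the Sxy, Sxx, Syy sums defined above and via numpy's built-in correlation function, which should agree; the data are again placeholders, not the slide example.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])
n = len(x)

Sxy = np.sum(x * y) - x.sum() * y.sum() / n
Sxx = np.sum(x ** 2) - x.sum() ** 2 / n
Syy = np.sum(y ** 2) - y.sum() ** 2 / n

r = Sxy / np.sqrt(Sxx * Syy)        # Pearson correlation from the sums
print(r, np.corrcoef(x, y)[0, 1])   # should match numpy's correlation coefficient
```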

Refer to Example 5.3: Students' scores in history.
c) Calculate the value of r and interpret its meaning.
Solution: there is a strong positive linear relationship between the score obtained before (x) and the score obtained after (y) the trip.

5.7 TEST OF SIGNIFICANCE
To determine whether X provides information in predicting Y, we proceed with testing a hypothesis. Two tests are commonly used:
i) t test
ii) F test

1) t-Test
1. Determine the hypotheses.
H₀: β₁ = 0 (no linear relationship)
H₁: β₁ ≠ 0 (a linear relationship exists)
2. Specify the critical value / level of significance α (or use the p-value).
3. Compute the test statistic,
t = b₁ / s(b₁)
which follows a t distribution with n − 2 degrees of freedom, where s(b₁) is the estimated standard error of the slope.

1) t-Test
4. Determine the rejection rule.
Reject H₀ if: t < −t_{α/2, n−2} or t > t_{α/2, n−2}, or if p-value < α.
5. Conclusion.
If H₀ is rejected, there is a significant relationship between variables X and Y.

2) F-Test
1. Determine the hypotheses.
H₀: β₁ = 0 (no linear relationship)
H₁: β₁ ≠ 0 (a linear relationship exists)
2. Specify the level of significance.
The critical value is F_α with 1 degree of freedom (df) in the numerator and n − 2 degrees of freedom (df) in the denominator.
3. Compute the test statistic.
F = MSR / MSE
4. Determine the rejection rule.
Reject H₀ if: p-value < α, or F > F_α(1, n − 2).

2) F-Test
5. Conclusion.
If H₀ is rejected, there is a significant relationship between variables X and Y.
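As a hedged sketch of both tests (again with placeholder data, not the slide example), scipy's linregress reports the slope, intercept, r and the two-sided p-value for testing H₀: β₁ = 0; in simple regression the F statistic equals the square of the t statistic.

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])

res = stats.linregress(x, y)      # least squares fit plus test of H0: slope = 0
t_stat = res.slope / res.stderr   # t statistic with n - 2 degrees of freedom
F_stat = t_stat ** 2              # in simple regression, F = t**2

alpha = 0.05
t_crit = stats.t.ppf(1 - alpha / 2, df=len(x) - 2)
F_crit = stats.f.ppf(1 - alpha, dfn=1, dfd=len(x) - 2)

print(res.slope, res.intercept, res.rvalue, res.pvalue)
print(t_stat, t_crit, F_stat, F_crit)
```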

Refer to Example 5.3: Students' scores in history.
d) Test to determine whether the scores before and after the trip are related. Use α = 0.05.
Solution:
1. H₀: β₁ = 0 (no linear relationship); H₁: β₁ ≠ 0 (a linear relationship exists)
2. Level of significance: α = 0.05, with critical value t_{α/2, n−2}.
3. Compute the test statistic.

4. Rejection rule: reject H₀ if t < −t_{α/2, n−2} or t > t_{α/2, n−2}.
5. Conclusion: Thus, we reject H₀. The score before the trip (x) has a linear relationship with the score after the trip (y).

5.8 ANALYSIS OF VARIANCE (ANOVA)
The value of the test statistic F for an ANOVA test is calculated as:
F = MSR / MSE
To calculate MSR and MSE, first compute the regression sum of squares (SSR) and the error sum of squares (SSE).

ANALYSIS OF VARIANCE (ANOVA)
General form of the ANOVA table:

Source of Variation | Degrees of Freedom (df) | Sum of Squares | Mean Squares | Value of the Test Statistic
Regression | 1 | SSR | MSR = SSR/1 | F = MSR/MSE
Error | n − 2 | SSE | MSE = SSE/(n − 2) |
Total | n − 1 | SST | |

ANOVA Test:
1) Hypothesis: H₀: β₁ = 0; H₁: β₁ ≠ 0
2) Select the distribution to use: F-distribution
3) Calculate the value of the test statistic: F
4) Determine the rejection and non-rejection regions
5) Make a decision: reject H₀ / do not reject H₀
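The rows of this ANOVA table can be built directly from the sums of squares; here is a minimal Python sketch using the same placeholder data as the earlier sketches (none of these numbers come from the slides).

```python
import numpy as np

# placeholder data, continuing the earlier sketches
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])
n = len(x)

b1 = (np.sum(x * y) - x.sum() * y.sum() / n) / (np.sum(x ** 2) - x.sum() ** 2 / n)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

SSR = np.sum((y_hat - y.mean()) ** 2)   # regression sum of squares, df = 1
SSE = np.sum((y - y_hat) ** 2)          # error sum of squares, df = n - 2
SST = SSR + SSE                         # total sum of squares, df = n - 1

MSR = SSR / 1
MSE = SSE / (n - 2)
F = MSR / MSE

print(f"Regression: df=1,  SS={SSR:.3f}, MS={MSR:.3f}, F={F:.3f}")
print(f"Error:      df={n-2}, SS={SSE:.3f}, MS={MSE:.3f}")
print(f"Total:      df={n-1}, SS={SST:.3f}")
```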

Example 5.5
The manufacturer of Cardio Glide exercise equipment wants to study the relationship between the number of months since the glide was purchased and the length of time the equipment was used last week.
a) Determine the regression equation.
b) At α = 0.01, test whether there is a linear relationship between the variables.

Solution (1): Regression equation:

Solution (2):
1) Hypothesis: H₀: β₁ = 0; H₁: β₁ ≠ 0
2) F-distribution table: critical value F = 11.2586
3) Test statistic: F = MSR/MSE = 17.303, or using the p-value approach: significance value = 0.003
4) Rejection region: Since the F statistic exceeds the table value (17.303 > 11.2586), we reject H₀; or, since the p-value is less than α (0.003 < 0.01), we reject H₀.
5) Thus, there is a linear relationship between the variables (months X and hours Y).
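The critical value and p-value quoted above can be checked with scipy; the degrees of freedom used below are an assumption on my part (1 and 8, i.e. a sample of n = 10, which is consistent with a tabled F value of about 11.26 at α = 0.01).

```python
from scipy import stats

F_stat = 17.303
dfn, dfd = 1, 8                         # assumed df: 1 and n - 2 with n = 10 (not stated in the transcript)

F_crit = stats.f.ppf(0.99, dfn, dfd)    # critical value at alpha = 0.01, approx. 11.26
p_value = stats.f.sf(F_stat, dfn, dfd)  # upper-tail p-value, approx. 0.003

print(F_crit, p_value, F_stat > F_crit, p_value < 0.01)
```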