Regression Analysis: What is regression? Best-fit line. Least squares.

Presentation transcript:

Regression Analysis: What is regression? Best-fit line. Least squares.

What is regression? Regression is the study of the behavior of one variable in relation to compartments induced by another variable. Using the regression line (or equation), we can predict scores on the dependent variable from scores on the independent variable. Different nomenclatures exist for the independent and dependent variables.

COMPARTMENTS
Non-cross-classified: Age; Education
Cross-classified: Elder Educated; Elder Uneducated; Younger Educated; Younger Uneducated

NOMENCLATURE

2-WAY TABLE

Regression lines (best-fit line)

Change in best-fit line

Equation for a straight line
Y = a + bX (simple regression)
Y = a + b1X1 + b2X2 + … + bnXn (multiple regression)
Y = predicted score
a = intercept (origin) of the regression line
b = regression coefficient: the change in the dependent variable for a 1-unit increase in the X variable
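The straight-line equation above can be sketched directly in code. A minimal illustration in Python; the coefficient values passed in are made up for demonstration, not taken from the slides:

```python
# Predicted score from the simple regression line Y = a + bX.
def predict(a, b, x):
    """Return the predicted Y for a given X on the line Y = a + bX."""
    return a + b * x

# Multiple regression generalizes to Y = a + b1*X1 + ... + bn*Xn.
def predict_multiple(a, bs, xs):
    """Return a + sum(b_i * x_i) for coefficients bs and scores xs."""
    return a + sum(b * x for b, x in zip(bs, xs))

# Illustrative (hypothetical) coefficients:
print(predict(2.0, 0.5, 10))                          # 2 + 0.5*10 = 7.0
print(predict_multiple(1.0, [2.0, 3.0], [1.0, 2.0]))  # 1 + 2 + 6 = 9.0
```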

Estimating the b coefficient
b = [ΣXY − (ΣX)(ΣY)/N] / [ΣX² − (ΣX)²/N]
Equivalently, in deviation form:
b = Σ(X − X̄)(Y − Ȳ) / Σ(X − X̄)²
(the sum of XY deviation products divided by the sum of squared X deviations)
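The raw-sums formula for b is easy to verify numerically. A small sketch in Python implementing the standard least-squares slope for predicting Y from X; the data points are invented for the example:

```python
def slope_b(xs, ys):
    """Least-squares slope b for predicting Y from X, via raw sums:
    b = [sum(XY) - sum(X)*sum(Y)/N] / [sum(X^2) - sum(X)^2/N]."""
    n = len(xs)
    sum_x = sum(xs)
    sum_y = sum(ys)
    sum_xy = sum(x * y for x, y in zip(xs, ys))
    sum_x2 = sum(x * x for x in xs)
    return (sum_xy - sum_x * sum_y / n) / (sum_x2 - sum_x ** 2 / n)

# Hypothetical data lying exactly on Y = 2X, so the slope should be 2:
xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]
print(slope_b(xs, ys))  # 2.0
```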

Estimation of ‘a’
a = Ȳ − b·X̄ (the mean of Y minus b times the mean of X)
Predictions can then be read from the graph of the fitted line.
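Putting the slope and intercept together gives a complete fit. A self-contained sketch in Python, using the deviation form of the slope and a = Ȳ − b·X̄; the data are invented so that the recovered line is Y = 1 + 2X:

```python
def fit_line(xs, ys):
    """Return (a, b): least-squares slope b and intercept a = mean(Y) - b*mean(X)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Hypothetical data lying exactly on Y = 1 + 2X:
xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]
a, b = fit_line(xs, ys)
print(a, b)  # 1.0 2.0
```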

Least squares. The goal of the linear regression procedure is to fit a line through the points. Specifically, it computes the line for which the squared deviations of the observed points from that line are minimized; this criterion is called least squares. A fundamental principle of the least-squares method is that the variation in the dependent variable can be partitioned, or divided into parts, according to its sources.

Least-squares estimation
Σ(Y − Ȳ)² = Σ(Ŷ − Ȳ)² + Σ(Y − Ŷ)²
a) Total sum of squared deviations of the observed values Y from the dependent-variable mean Ȳ (Total SS)
b) Sum of squared deviations of the predicted values Ŷ from the dependent-variable mean Ȳ (Model SS)
c) Sum of squared deviations of the observed values Y from the predicted values Ŷ, i.e. the sum of squared residuals (Error SS)
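The partition Total SS = Model SS + Error SS can be checked numerically for any least-squares fit. A sketch in Python with an invented data set; the identity should hold to floating-point precision:

```python
def sums_of_squares(xs, ys):
    """Fit a simple least-squares line and return (total_ss, model_ss, error_ss)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    y_hat = [a + b * x for x in xs]
    total_ss = sum((y - mean_y) ** 2 for y in ys)            # sum (Y - Ybar)^2
    model_ss = sum((yh - mean_y) ** 2 for yh in y_hat)       # sum (Yhat - Ybar)^2
    error_ss = sum((y - yh) ** 2 for y, yh in zip(ys, y_hat))  # sum (Y - Yhat)^2
    return total_ss, model_ss, error_ss

# Hypothetical data (not from the slides):
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 6]
total_ss, model_ss, error_ss = sums_of_squares(xs, ys)
print(round(total_ss, 6), round(model_ss + error_ss, 6))  # the two values match
```

Note that Model SS / Total SS is the coefficient of determination, R².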