Correlation and Regression

Correlation and Regression Angelo Carrieri

Correlation Correlation provides a numerical measure of the linear or “straight-line” relationship between two continuous variables, X and Y. The resulting correlation coefficient, or “r value,” is more formally known as the Pearson correlation coefficient, after the statistician Karl Pearson, who first formalized it. X is known as the independent or explanatory variable, while Y is known as the dependent or response variable. A significant advantage of the correlation coefficient is that it is dimensionless: it does not depend on the units of X and Y, and it can therefore be used to compare any two variables regardless of their units.
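The unit independence described above is easy to verify directly. The sketch below (plain Python rather than a statistics library, with hypothetical height/weight data) computes Pearson's r from its definition, then rescales X from centimeters to inches and shows that r is unchanged:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient for two equal-length sequences."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))  # co-deviation
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical data: height in cm (X) vs. weight in kg (Y).
x_cm = [150, 160, 170, 180, 190]
y_kg = [55, 60, 68, 72, 80]

r_cm = pearson_r(x_cm, y_kg)
r_in = pearson_r([v / 2.54 for v in x_cm], y_kg)  # same X in inches
print(r_cm, r_in)  # identical: r does not depend on units
```

Because both numerator and denominator rescale by the same factor when the units change, the ratio is unaffected.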

Correlation A positive correlation occurs when the dependent variable increases as the independent variable increases. A negative correlation occurs when the dependent variable decreases as the independent variable increases. If a scatterplot of the data is not examined before the r value is calculated, a strong but nonlinear relationship may be missed, because r measures only the linear component of the association.
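The danger of skipping the scatterplot can be shown with a contrived example: y = x² over a range symmetric about zero is a perfect (but nonlinear) relationship, yet Pearson's r comes out as exactly zero. A minimal sketch:

```python
import math

# Perfect nonlinear relationship: y is fully determined by x, yet r = 0,
# because the positive and negative halves of the deviations cancel.
x = [-2, -1, 0, 1, 2]
y = [xi ** 2 for xi in x]  # [4, 1, 0, 1, 4]

n = len(x)
mx = sum(x) / n            # 0.0
my = sum(y) / n            # 2.0
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))  # cancels to 0
sxx = sum((xi - mx) ** 2 for xi in x)
syy = sum((yi - my) ** 2 for yi in y)
r = sxy / math.sqrt(sxx * syy)
print(r)  # 0.0 despite the perfect x -> y dependence
```

A scatterplot would reveal the parabola immediately; the r value alone would suggest no relationship at all.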

Regression Regression analysis mathematically describes the dependence of the Y variable on the X variable and constructs an equation which can be used to predict the value of Y for any given value of X. It is more specific and provides more information than does correlation. Unlike correlation, however, regression is not scale independent, and the derived regression equation depends on the units of each variable involved. Regression assumes that the residuals, the vertical deviations of the observed Y values from the fitted line, are normally distributed with constant variance across the range of X.

Regression Once the equation has been derived, it can be used to predict the change in Y for any change in X. It can therefore be used to interpolate between the existing data points and, with caution, to extrapolate to values which have not been previously observed or tested. Regression (also known as simple regression, linear regression, or least squares regression) fits a straight line equation of the following form to the data: Y = a + bX where Y is the dependent variable, X is the single independent variable, a is the Y-intercept of the regression line, and b is the slope of the line (also known as the regression coefficient).
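The least-squares estimates of a and b have closed-form solutions: b is the ratio of the X–Y co-deviation to the X sum of squares, and a is chosen so the line passes through the point of means. A minimal sketch with hypothetical hours-studied vs. exam-score data (plain Python, not a statistics library):

```python
def least_squares_fit(x, y):
    """Return (a, b) for the line Y = a + bX minimizing squared error."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    # Slope: co-deviation of X and Y over the sum of squares of X.
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    # Intercept: the fitted line passes through the point (mean X, mean Y).
    a = my - b * mx
    return a, b

# Hypothetical data: hours studied (X) vs. exam score (Y).
hours = [1, 2, 3, 4, 5]
scores = [52, 55, 61, 64, 68]

a, b = least_squares_fit(hours, scores)

def predict(x):
    return a + b * x

print(a, b)        # intercept and slope of the fitted line
print(predict(6))  # prediction just beyond the observed range
```

Note that `predict(6)` is an extrapolation, one unit past the largest observed X, which the fitted line supports only if the linear trend continues outside the data.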