Pg 337-345: 3b, 6b (form and strength); Page 350-359: 10b, 12a, 16c, 16e.

Presentation transcript:

• Pg 337-345: 3b, 6b (form and strength)
• Page 350-359: 10b, 12a, 16c, 16e

• A regression line is a straight line that describes how a response variable y changes as an explanatory variable x changes.
• It is often used to predict the value of y for a given value of x.
• Also known as a line of best fit.

On the slide's scatterplot, you can predict the humerus length when the femur length is 50 by following the dotted line up to the regression line and then reading across horizontally to the humerus axis.
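The same graphical prediction can be written as arithmetic: plug x = 50 into the fitted line ŷ = a + bx. A minimal Python sketch, where the intercept and slope are made-up placeholders rather than values estimated from the slide's data:

```python
# Predicting humerus length from femur length with a fitted regression line.
# The intercept and slope below are illustrative placeholders, not values
# estimated from the slide's fossil data.

def predict_humerus(femur_length, intercept=-3.7, slope=1.2):
    """Return the predicted humerus length y-hat = intercept + slope * x."""
    return intercept + slope * femur_length

# Reading the graph at femur length 50 is the same as plugging x = 50 into the line.
print(predict_humerus(50))  # predicted humerus length at x = 50
```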

• Since different people may draw regression lines differently by eye, an equation may give a more accurate prediction.
• Because we want to predict y from x, we want a line that is close to the points in the vertical (y) direction.
• We need a way to find, from the data, the equation of the line that comes closest to the points in the vertical direction.
• There are many ways to make the collection of vertical distances “as small as possible”…

• The least-squares regression line of y on x is the line that makes the sum of the squares of the vertical distances of the data points from the line as small as possible.

• To find the proper placement of the line, look at the vertical distances of the points from the line you have drawn.
• For each data point, this distance is the difference between the observed y-value and the predicted y-value (where the line crosses that x). These differences can be positive, negative, or zero: positive when the point is above the line, negative when it is below, and zero when it is on the line.
• You then take the sum of all the squared differences (see the sketch below).
• The regression line is placed where the sum of all the squared differences is smallest.
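A minimal Python sketch of that criterion, using made-up data points: compute the sum of squared vertical distances for any candidate line, then compare candidates; the least-squares line is the candidate with the smallest possible sum.

```python
# Sum of squared vertical distances (residuals) for a candidate line y = a + b*x.
# The data values are made up for illustration only.

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.1, 9.8]

def sum_squared_residuals(a, b, x, y):
    """Sum over all points of (observed y - predicted y)^2."""
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

print(sum_squared_residuals(0.0, 2.00, x, y))  # one candidate line
print(sum_squared_residuals(0.1, 1.95, x, y))  # another candidate; the smaller
                                               # sum is the better least-squares fit
```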

• ŷ = a + bx
• b = the slope of the line (the amount by which y changes when x increases by 1 unit)
• a = the intercept (the value of y when x = 0)
• The equation can be used to predict y for a given x.
• We will be doing a calculator activity on Tuesday to see how to get these equations from scatterplots.
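Ahead of the calculator activity, here is a minimal sketch (with made-up data) of the standard least-squares formulas a calculator applies: b = sum of (xi - x̄)(yi - ȳ) divided by sum of (xi - x̄)², and a = ȳ - b·x̄.

```python
# Finding the slope b and intercept a of the least-squares line y-hat = a + b*x.
# Data are made up purely for illustration.

from statistics import mean

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.1, 9.8]

x_bar, y_bar = mean(x), mean(y)
b = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
    sum((xi - x_bar) ** 2 for xi in x)
a = y_bar - b * x_bar

print(f"slope b = {b:.3f}, intercept a = {a:.3f}")
print(f"predicted y at x = 6: {a + b * 6:.2f}")
```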

• We often use several explanatory variables to predict a response.
• The basic properties of prediction with a least-squares regression line are:
  ◦ Prediction is based on fitting some “model” to a set of data (here, the regression line).
  ◦ Prediction works best when the model fits the data closely (predictions are more trustworthy when the points lie close to the line; if the pattern is weak, prediction may be very inaccurate).
  ◦ Prediction outside the range of the available data is risky (predicting within the range is fine, but extrapolating beyond it may not work: if a child kept growing at the observed rate, for example, the line might predict a height of 10 feet at age 20; see the sketch below).
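A small sketch of the extrapolation warning, using made-up childhood heights; the fitted line behaves sensibly inside the observed ages but keeps climbing linearly far beyond them.

```python
# Why extrapolation is risky: a line fitted to childhood heights keeps predicting
# the same growth rate forever. Ages and heights below are made-up numbers.

from statistics import mean

age = [2, 4, 6, 8, 10]             # years (the observed range)
height = [86, 102, 115, 128, 139]  # centimetres

a_bar, h_bar = mean(age), mean(height)
b = sum((x - a_bar) * (y - h_bar) for x, y in zip(age, height)) / \
    sum((x - a_bar) ** 2 for x in age)
a = h_bar - b * a_bar

print(a + b * 8)    # inside the data range: a sensible prediction
print(a + b * 20)   # age 20: over two metres and still climbing, because the
                    # line assumes growth never levels off
```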

• Correlation and regression are closely connected; however, correlation does not require you to choose an explanatory variable, while regression does.
• Both correlation and regression are strongly affected by outliers…
• What do you think Hawaii is known for that makes it a definite outlier compared to the other 49 states?

If Hawaii is included, r = 0.408; if Hawaii is not included, r = [value not shown]. If Hawaii is included, the LSRL is the solid line on the plot; if Hawaii is not included, the LSRL is the dotted line.
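To see the kind of effect the slide describes, here is a sketch with synthetic points (not the actual 50-state data): a single far-out observation changes r substantially.

```python
# How a single outlier can change the correlation r. These points are synthetic
# stand-ins, not the state data from the slide.

from statistics import mean, stdev

def corr(x, y):
    """Sample correlation coefficient r."""
    mx, my, sx, sy = mean(x), mean(y), stdev(x), stdev(y)
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / ((len(x) - 1) * sx * sy)

x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.0, 2.3, 3.1, 3.4, 4.2, 4.4, 5.1, 5.5]
print(corr(x, y))                 # strong positive correlation

print(corr(x + [20], y + [1.0]))  # one outlying point drags r down sharply
```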

• The usefulness of the regression line for prediction depends on the strength of the correlation between the variables.
• The square of the correlation, r², is the right measure to use…
• r² is always a number between 0 and 1. It is the fraction of the variation in y that is explained by the least-squares regression on x, so a higher value means the line accounts for more of the variation (you want a high number). Example: r² = 0.972 means the regression is 97.2% successful in explaining the variation in y.
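A short sketch, again with made-up data, of computing r and reading r² as the percent of variation in y explained by the regression:

```python
# r and r-squared for a small made-up data set. A high r^2 means the regression
# line explains most of the variation in y.

from statistics import mean, stdev

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.1, 9.8]

mx, my, sx, sy = mean(x), mean(y), stdev(x), stdev(y)
r = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / ((len(x) - 1) * sx * sy)

print(f"r   = {r:.3f}")
print(f"r^2 = {r**2:.3f} -> about {r**2:.1%} of the variation in y is explained")
```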

• A strong relationship between two variables does not always mean that changes in one variable cause changes in the other.
• The relationship between two variables is often influenced by other variables lurking in the background.
• The best evidence for causation comes from randomized comparative experiments.
• The observed relationship between two variables may be due to direct causation, common response, or confounding.
• An observed relationship can be used for prediction without worrying about causation as long as the patterns found in the past data continue to hold true.

• There is a strong relationship between cigarette smoking and the death rate from lung cancer. Does smoking cigarettes cause lung cancer?
• There is a strong association between the availability of handguns in a nation and that nation’s homicide rate from guns. Does easy access to handguns cause more murders?

• Does watching television extend your lifespan?
  ◦ Countries that are rich enough to have televisions are probably also fortunate enough to have better nutrition, cleaner water, better health care, etc. than poorer nations.
  ◦ This was called a “nonsense correlation”. The correlation is real, but the conclusion is nonsense.

• Common response: a lurking variable influences both x and y, which creates a high correlation even though there is no direct connection between x and y. Example: obesity in children. A lurking variable could be TV viewing time, while the explanatory variables might be inheritance from parents, overeating, or lack of physical activity.

• Confounding: the effects of two variables on the response are mixed together and cannot be separated. For example, a child may be overweight not because of their own poor eating habits but because their parents provide poor food choices (the parents have bad eating habits themselves).

• If an experiment is not possible, the following criteria must be met to make a convincing case for causation:
  ◦ The association between the variables is strong.
  ◦ The association between the variables is consistent.
  ◦ Higher doses are associated with stronger responses.
  ◦ The alleged cause precedes the effect in time.
  ◦ The alleged cause is plausible.

• Pages , #26, 27
• Page 384, #35, 36, 37
• Page 389, #40
