Lecture 9 Sections 3.3 Objectives:


1 Lecture 9 Sections 3.3 Objectives:
Bivariate and Multivariate Data and Distributions
Fitting a Straight Line
Assessing the Fit
Residuals

2 Fitting a Line to Bivariate Data
We first draw a scatter plot of the two quantitative variables for a visual inspection of the relationship between them. If the scatter plot shows a linear relationship, we then measure its strength by computing the correlation coefficient. Next, we may want to fit a line to the scatter plot to summarize the overall pattern; this is done using regression analysis.
Fitting a straight line to data (the regression line). We want to draw a line
1) to summarize the relationship between x and y,
2) to describe how a response variable y changes as an explanatory variable x changes, and
3) to predict the value of y for a given value of x.
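As a first illustration, here is a minimal Python sketch of these two preliminary steps (scatter plot, then correlation coefficient) using NumPy and Matplotlib; the arrays x and y are hypothetical data made up for this example, not data from the lecture:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical bivariate data (x = explanatory variable, y = response variable)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])

# Step 1: visual inspection with a scatter plot
plt.scatter(x, y)
plt.xlabel("x (explanatory variable)")
plt.ylabel("y (response variable)")
plt.title("Scatter plot of y vs. x")
plt.show()

# Step 2: if the pattern looks linear, measure its strength
# with the correlation coefficient r
r = np.corrcoef(x, y)[0, 1]
print(f"correlation coefficient r = {r:.3f}")
```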

3 Least Squares Regression Line
The least-squares regression line is the unique line such that the sum of the squared vertical (y) distances between the data points and the line is as small as possible. The distances between the points and the line are squared so that all are positive values and can be properly added (Pythagoras).

4 Least Squares Regression Line
The least squares line is a regression line that minimizes the squared distances between the observed points and the line. That is, we find a and b that minimize
Q(a, b) = \sum_{i=1}^{n} \left[ y_i - (a + b x_i) \right]^2.
To find the minimum of this function, take the derivatives of Q with respect to a and b and set them equal to zero. Those two equations (called the normal equations) give
Estimated slope: b = \frac{\sum (x_i - \bar{x})(y_i - \bar{y})}{\sum (x_i - \bar{x})^2}
Estimated y-intercept: a = \bar{y} - b \bar{x}
The equation of the least squares line is often written as \hat{y} = a + b x, where the "hat" above y emphasizes that \hat{y} is a prediction of y that results from the substitution of any particular x value into the equation.
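A minimal sketch of these formulas in Python, on hypothetical data; np.polyfit is included only as a cross-check of the hand-computed slope and intercept:

```python
import numpy as np

# Hypothetical data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])

x_bar, y_bar = x.mean(), y.mean()

# Estimated slope: b = sum((x_i - x_bar)(y_i - y_bar)) / sum((x_i - x_bar)^2)
b = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)

# Estimated y-intercept: a = y_bar - b * x_bar
a = y_bar - b * x_bar

print(f"least squares line: y_hat = {a:.3f} + {b:.3f} x")

# Cross-check with NumPy's degree-1 polynomial fit (returns slope, then intercept)
b_np, a_np = np.polyfit(x, y, 1)
print(f"np.polyfit gives:   y_hat = {a_np:.3f} + {b_np:.3f} x")
```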

5 Example A sample of pizza restaurants located near a college campus
a. Find the least squares line.
b. Predict the quarterly sales when the student population is 18,000 and when it is 30,000.
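The slide's data table is not part of this transcript, so the sketch below only illustrates the prediction step: once a least squares line \hat{y} = a + b x has been found (with x in thousands of students), substitute x = 18 and x = 30. The coefficients a and b here are placeholders, not the values from this example:

```python
# Placeholder coefficients from a previously fitted least squares line
# (not the actual values from the pizza example)
a, b = 50.0, 5.0          # y_hat = a + b * x, with x in thousands of students

def predict(x):
    """Predicted quarterly sales for a student population of x thousand."""
    return a + b * x

for pop in (18, 30):
    print(f"student population {pop},000 -> predicted sales {predict(pop):.1f}")
```

Note that if 30,000 lies outside the range of student populations used to fit the line, the second prediction is an extrapolation, which is exactly the situation warned about on the next slide.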

6 Extrapolation Extrapolation is the use of a regression line for predictions outside the range of x values used to obtain the line. This is not recommended, as the plot illustrates. (Figure: scatter plot with a fitted line; the y-axis is height in inches, and extrapolating the line far beyond the observed x values gives unreasonable predictions.)

7 LS Regression Line Regression and correlation coefficient
The coefficient of determination, denoted by r^2, is the proportion of variation in the observed y values that can be explained by the regression line:
r^2 = 1 - \frac{SSE}{SST} = 1 - \frac{\sum (y_i - \hat{y}_i)^2}{\sum (y_i - \bar{y})^2}
Note that
1) 0 ≤ r^2 ≤ 1,
2) the closer this percentage is to 100%, the more successful the regression line is in explaining the variation in y, and
3) (correlation coefficient)^2 = coefficient of determination.
Example. Revisit the pizza sales example. What percent of the variation in the quarterly sales is explained by the regression line?
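A minimal sketch of the computation on hypothetical data, showing both r^2 = 1 - SSE/SST and the fact that it equals the square of the correlation coefficient:

```python
import numpy as np

# Hypothetical data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])

# Fit the least squares line
b, a = np.polyfit(x, y, 1)
y_hat = a + b * x

sse = np.sum((y - y_hat) ** 2)       # unexplained variation
sst = np.sum((y - y.mean()) ** 2)    # total variation in y
r_squared = 1 - sse / sst

r = np.corrcoef(x, y)[0, 1]
print(f"r^2 = {r_squared:.4f}")
print(f"(correlation coefficient)^2 = {r ** 2:.4f}")   # same value
```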

8 Residuals Predicted value (or fitted value) = \hat{y}_i = a + b x_i
Residual = observed value - predicted value = y_i - \hat{y}_i

9 Residual Plot A residual plot is a scatter plot of the regression residuals against the predicted value or the explanatory variable. Residual plots help us assess the fit of a regression line. If residuals are scattered randomly around 0, chances are your data fit a linear model, was normally distributed, and you didn’t have outliers.

10 Residual Plot Residuals are randomly scattered around 0: good!
A curved pattern means the relationship you are looking at is not linear (see the sketch below).
A change in variability across the plot is a warning sign. You need to find out why, and remember that predictions made in areas of larger variability will not be as good.
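A small illustration of the curved-pattern case: fitting a straight line to simulated data that actually follow a quadratic relationship leaves a clearly curved residual plot (the simulated data below are hypothetical):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Simulated data with a quadratic (non-linear) relationship
x = np.linspace(0, 10, 40)
y = 1.0 + 0.5 * x + 0.3 * x ** 2 + rng.normal(scale=1.0, size=x.size)

# Fit a straight line anyway and look at the residuals
b, a = np.polyfit(x, y, 1)
residuals = y - (a + b * x)

plt.scatter(x, residuals)
plt.axhline(0, linestyle="--")
plt.xlabel("x")
plt.ylabel("residual")
plt.title("Curved residual pattern: a straight line is the wrong model")
plt.show()
```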

11 Example Consider the following data on x = height (in) and y = average weight (lb) for American females aged (taken from The World Almanac and Book of Facts). x: y: Draw the scatter plot and residual plot. Find the least squares regression line.

