CSE 330: Numerical Methods

 Regression analysis gives information on the relationship between a response (dependent) variable and one or more predictor (independent) variables.  The goal of regression analysis is to express the response variable as a function of the predictor variables.  The goodness of fit and the accuracy of the conclusions depend on the data used; non-representative or improperly compiled data result in poor fits and conclusions.

 An example of a regression model is the linear regression model, which is a linear relationship between the response variable y and the predictor variables xi, i = 1, 2, ..., n, of the form

y = β0 + β1 x1 + β2 x2 + ... + βn xn + ε  (1)

where β0, β1, ..., βn are the regression coefficients (unknown model parameters) and ε is the error due to variability in the observed responses. Prof. S. M. Lutful Kabir, BRAC University

 In the transformation of raw (uncooked) potato to cooked potato, heat is applied for some specific time.  One might postulate that the amount of untransformed starch (y) inside the potato is a linear function of the time (t) and temperature (θ) of cooking:

y = β1 t + β2 θ

 Linear regression refers to finding the unknown parameters β1 and β2, which are simple linear multipliers of the predictor variables.
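The fit itself can be sketched numerically. Below is a minimal example in Python/NumPy (the slides' own code later uses MATLAB); the cooking data are made up purely for illustration:

```python
import numpy as np

# Hypothetical cooking data (made up for illustration):
# time t in minutes, temperature theta in deg C, untransformed starch fraction y.
t = np.array([10.0, 15.0, 20.0, 25.0, 30.0])
theta = np.array([80.0, 95.0, 90.0, 110.0, 105.0])
y = np.array([0.80, 0.65, 0.48, 0.33, 0.20])

# Design matrix with the two predictors t and theta (no intercept,
# matching the model y = beta1*t + beta2*theta above).
X = np.column_stack([t, theta])

# Least-squares estimates of beta1 and beta2.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # [beta1, beta2]
```

The same idea scales to any number of predictors: one column per predictor in the design matrix.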

 Three uses for regression analysis are:
- model specification
- parameter estimation
- prediction

 Accurate prediction and model specification require that:
- all relevant variables be accounted for in the data
- the prediction equation be defined in the correct functional form for all predictor variables

 Parameter estimation is the most difficult to perform: not only must the model be correctly specified, the prediction must also be accurate and the data must allow for good estimation.  For example, multicollinearity among predictors creates problems and may require that some variables not be used.  Thus, limitations of the data and the inability to measure all predictor variables relevant in a study restrict the use of prediction equations.

 Regression analysis equations are designed only to make predictions.  Good predictions are not possible if the model is not correctly specified and the accuracy of the parameters is not ensured.

 For effective use of regression analysis, one should:
- investigate the data collection process
- discover any limitations in the data collected
- restrict conclusions accordingly

 Linear regression is the most popular regression model.  In this model, we wish to predict the response y at n data points (x1,y1), (x2,y2), ..., (xn,yn) by a regression model given by

y = a0 + a1 x  (1)

where a0 and a1 are the constants (coefficients) of the regression model.

 A measure of goodness of fit, that is, how well the model predicts the response variable, is the magnitude of the residual at each of the n data points:

Ei = yi − (a0 + a1 xi)  (2)

 Ideally, if all the residuals are zero, one has found an equation in which all the points lie on the model.
 Thus, minimization of the residuals is an objective of obtaining the regression coefficients.
 The most popular method to minimize the residuals is the least squares method, where the estimates of the constants of the model are chosen such that the sum of the squared residuals is minimized.
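The residuals and their sum of squares are easy to compute directly. A small Python sketch (the data points and the candidate line are made up for illustration):

```python
import numpy as np

# Made-up data points (x_i, y_i).
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

# A candidate straight line y = a0 + a1*x.
a0, a1 = 0.2, 1.9

# Residual at each point: observed minus predicted.
E = y - (a0 + a1 * x)

# Sum of the squared residuals, the quantity least squares minimizes.
Sr = np.sum(E ** 2)
print(E, Sr)  # residuals approx. [0.0, -0.1, 0.3, 0.0], Sr approx. 0.1
```

Least squares picks the a0, a1 that make this Sr as small as possible over all candidate lines.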

 Let us use the least squares criterion, where we minimize

Sr = Σ (yi − a0 − a1 xi)^2, sum over i = 1, ..., n  (3)

where Sr is called the sum of the squares of the residuals.  Differentiating equation (3) with respect to a0 and a1, we get

∂Sr/∂a0 = −2 Σ (yi − a0 − a1 xi)  (4)
∂Sr/∂a1 = −2 Σ (yi − a0 − a1 xi) xi  (5)

 Using equations (4) and (5), setting each derivative to zero, we get

Σ yi − Σ a0 − a1 Σ xi = 0  (6)
Σ xi yi − a0 Σ xi − a1 Σ xi^2 = 0  (7)

 Noting that Σ a0 = n a0, these become

n a0 + a1 Σ xi = Σ yi  (8)
a0 Σ xi + a1 Σ xi^2 = Σ xi yi  (9)

 Solving the above equations (8) and (9) gives

a1 = (n Σ xi yi − Σ xi Σ yi) / (n Σ xi^2 − (Σ xi)^2)  (10)
a0 = ȳ − a1 x̄, where x̄ = (Σ xi)/n and ȳ = (Σ yi)/n  (11)
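Equations (10) and (11) translate directly into code. A minimal Python sketch (the test data below are illustrative, not the mousetrap values used later in these slides):

```python
import numpy as np

def fit_line(x, y):
    """Least-squares straight line y = a0 + a1*x via the closed-form
    solution of the normal equations."""
    n = len(x)
    sx, sy = np.sum(x), np.sum(y)
    sxx, sxy = np.sum(x * x), np.sum(x * y)
    a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)  # equation (10)
    a0 = sy / n - a1 * sx / n                       # equation (11)
    return a0, a1

# Points lying exactly on y = 1 + 2x, so the fit should recover a0=1, a1=2.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])
print(fit_line(x, y))  # (1.0, 2.0) up to rounding
```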

 The torque T needed to turn the torsional spring of a mousetrap through an angle θ is given below.
 Find the constants a0 and a1 of the regression model T = a0 + a1 θ.

Angle θ (radians) | Torque T (N-m)
[data values not preserved in the transcript]

[Computation table with columns i, θ (radians), T (N-m), θ² (radians²), and Tθ (N-m·radians); values not preserved in the transcript]

[Computed results: a1 (in N-m/rad) and a0 (in N-m); the numerical values were not preserved in the transcript]


 For the following points, find a regression for:
 (a) 1st order
 (b) 2nd order

x | Y
[data values not preserved in the transcript]

 Generalizing from a straight line (i.e., a first-degree polynomial) to a kth-degree polynomial

y = a0 + a1 x + a2 x^2 + a3 x^3 + ... + ak x^k

the sum of the squared residuals is given by

Sr = Σ (yi − a0 − a1 xi − a2 xi^2 − ... − ak xi^k)^2, sum over i = 1, ..., n

 The partial derivatives are:

∂Sr/∂a0 = −2 Σ (yi − a0 − a1 xi − ... − ak xi^k)
∂Sr/∂a1 = −2 Σ (yi − a0 − a1 xi − ... − ak xi^k) xi
...
∂Sr/∂ak = −2 Σ (yi − a0 − a1 xi − ... − ak xi^k) xi^k

Setting each of these to zero gives k+1 linear equations in the k+1 unknowns a0, a1, ..., ak.

In matrix form, the normal equations are [C][A] = [B], where

C(i,j) = Σm x(m)^(i−1+j−1), for i, j = 1, ..., k+1
B(i) = Σm y(m) x(m)^(i−1), for i = 1, ..., k+1

and [A] = [a0, a1, ..., ak]^T is the vector of unknown coefficients, so [A] = [C]^(−1)[B].
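This normal-equation system [C][A] = [B] can be sketched in Python (the slides' own code uses MATLAB; the x, y, and k below are illustrative assumptions, not the exercise values):

```python
import numpy as np

# Illustrative data (assumptions, not the slides' exercise values).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 1.8, 1.3, 2.5, 6.3])
k = 2           # polynomial order
n = len(x)      # number of data points (the n of the slides)

# Build C with C[i][j] = sum_m x_m^(i+j) and B with B[i] = sum_m y_m * x_m^i
# (0-based i, j here; the MATLAB loops use 1-based indices with i-1, j-1 exponents).
C = np.zeros((k + 1, k + 1))
B = np.zeros(k + 1)
for i in range(k + 1):
    B[i] = np.sum(y * x ** i)
    for j in range(k + 1):
        C[i, j] = np.sum(x ** (i + j))

# Solve C A = B for the coefficients a0..ak (solve is preferred over inv).
A = np.linalg.solve(C, B)
print(A)  # [a0, a1, ..., ak]
```

For higher orders and poorly scaled data this system becomes ill-conditioned, which is why library routines use orthogonal factorizations rather than forming [C] explicitly.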

Flowchart for building the [C] matrix, as pseudocode:

for i = 1 to k+1
    for j = 1 to k+1
        c(i,j) = 0.0
        for m = 1 to n
            c(i,j) = c(i,j) + x(m)^(i-1+j-1)

 Class exercise

%Regression Analysis
% k-> order of polynomial
% n-> number of points
clear all
clc
k=1;
n=5;
x=[0.6981, , , , ];  % remaining angle values not preserved in the transcript
y=[0.1882, , , , ];  % remaining torque values not preserved in the transcript

% Determination of [C] matrix
for i=1:k+1
    for j=1:k+1
        c(i,j)=0.0;
        for m=1:n
            c(i,j) = c(i,j) + x(m)^(i-1+j-1);
        end
    end
end
c
% Inversion of [C] matrix
ci=inv(c);
ci

% Determination of [B] matrix
for i=1:k+1
    b(i)=0.0;
    for m=1:n
        b(i)=b(i)+y(m)*x(m)^(i-1);
    end
end
b

% Determination of [A] matrix
for i=1:k+1
    a(i)=0.0;
    for j=1:k+1
        a(i)=a(i)+ci(i,j)*b(j);
    end
end
a

Thanks