COMP 116: Introduction to Scientific Programming Lecture 11: Linear Regression.


Revisiting linear programs
A store has asked a manufacturer to produce pants and sports jackets. For materials, the manufacturer has 750 m² of cotton textile and 1,000 m² of polyester. Every pair of pants (1 unit) needs 1 m² of cotton and 2 m² of polyester. Every jacket needs 1.5 m² of cotton and 1 m² of polyester. The price of the pants is fixed at $50 and of the jacket at $40. How many pants and jackets should the manufacturer produce to maximize revenue?

Solving linear programs
Design a linear program:
◦ What are the unknowns?
◦ What is the cost/objective function?
◦ What are the constraints?
Implement it in Matlab (see the sketch below):
◦ x = linprog(f, A, b)
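As a concrete check, here is a minimal Matlab sketch of the pants-and-jackets program above. Note that linprog minimizes, so the revenue objective is negated, and a lower bound keeps production nonnegative:

f = -[50; 40];            % negate: linprog minimizes, we want maximum revenue
A = [1 1.5;               % cotton:    1*pants + 1.5*jackets <= 750
     2 1.0];              % polyester: 2*pants + 1*jackets   <= 1000
b = [750; 1000];
lb = [0; 0];              % cannot produce negative quantities
x = linprog(f, A, b, [], [], lb)
% expected: x = [375; 250], for a maximum revenue of -f'*x = $28,750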

Linear Equations
When does Ax = b have a solution for an m×n matrix A?
◦ A is square (m = n): solvable when A has rank m, i.e., the equations are linearly independent
◦ m < n: more unknowns than equations; A is underdetermined
◦ m > n: more equations than unknowns; A is overdetermined
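A quick way to see which case you are in, sketched in Matlab with made-up matrices:

A = [1 2; 3 4];        % square (2x2) with rank 2: unique solution for any b
U = [1 2 3];           % 1x3: underdetermined, more unknowns than equations
O = [1 1; 1 2; 1 3];   % 3x2: overdetermined, more equations than unknowns
rank(A)                % returns 2, so A*x = b is solvable for any b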

Least Squares
Overdetermined systems: too many equations! What to do?
◦ We can't solve Ax = b exactly
◦ Instead, minimize the "residual"
◦ Residual vector: Ax - b
◦ Residual square error: (Ax - b)'*(Ax - b)
Matlab command:
>> x = A\b
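A tiny sketch with a made-up 3×2 system, showing that backslash returns the least-squares solution when no exact one exists:

A = [1 1; 1 2; 1 3];   % 3 equations, 2 unknowns: no exact solution
b = [1; 2; 2];
x = A\b                % least-squares solution, here [2/3; 1/2]
res = A*x - b;         % residual vector (nonzero: the fit is not exact)
err = res'*res         % residual square error, here 1/6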

Example: Linear Friction
You measured friction in response to a force. What's the best-fitting line to the data?

Least Squares
% set up and solve matrix
% equation A*[m;c] = y
A = [x, ones(10,1)];
mc = A\y
mc =

% evaluate error
res = A*mc - y;
err = res'*res
err =
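On the slide, x and y hold the ten measured force/friction values, and the numeric outputs after "mc =" and "err =" appeared on the slide itself. A self-contained sketch with synthetic stand-ins for the data:

x = (1:10)';                          % applied force (assumed values)
y = 0.7*x + 1.2 + 0.1*randn(10,1);    % noisy friction readings (synthetic)
A = [x, ones(10,1)];                  % model: y = m*x + c
mc = A\y;                             % mc(1) = slope m, mc(2) = intercept c
res = A*mc - y;
err = res'*res                        % small, since the data are nearly linear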

Least Squares
In general, once you have:
◦ a mathematical model with parameters
◦ observed data
fit the parameters to the data by minimizing the sum of squared residuals.

Least Squares
Linear least squares: when the model is linear in the parameters.
◦ Written as Ax = b
◦ A, b: observed data
◦ x: the set of unknowns
◦ Error (scalar): squared residual (Ax-b)'*(Ax-b)
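Note that "linear in the parameters" does not require a straight-line model. A sketch with made-up data:

x = linspace(0, 2*pi, 50)';
y = 3*sin(x) - 2*cos(x) + 0.1*randn(50,1);  % synthetic measurements
A = [sin(x), cos(x)];   % nonlinear in x, but linear in the coefficients
c = A\y                 % recovers roughly [3; -2]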

Same LS principle
Minimize the sum of squared distances to the model (here, a parabola).
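A closing sketch of that parabola fit, again with synthetic data:

x = linspace(-2, 2, 20)';
y = 2*x.^2 - x + 0.5 + 0.2*randn(20,1);  % synthetic measurements
A = [x.^2, x, ones(20,1)];               % model: y = a*x^2 + b*x + c
p = A\y                                  % p = [a; b; c], roughly [2; -1; 0.5]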