How do we find the best linear regression line?

How do we find the best linear regression line with multiple variables?

Partial derivatives!

Revisit SSE: the partial derivatives of the sum of squared errors (SSE) with respect to the slope and the intercept are both basic applications of the chain rule.
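The transcript does not show the derivatives themselves; a minimal Python sketch (not from the slides, with made-up data values) of the two chain-rule gradients for a line y = m*x + b:

```python
def sse(m, b, xs, ys):
    # Sum of squared errors for the line y = m*x + b.
    return sum((y - (m * x + b)) ** 2 for x, y in zip(xs, ys))

def sse_gradient(m, b, xs, ys):
    # Chain rule: d/dm (y - (m*x + b))^2 = 2*(y - (m*x + b)) * (-x)
    #             d/db (y - (m*x + b))^2 = 2*(y - (m*x + b)) * (-1)
    dm = sum(-2 * x * (y - (m * x + b)) for x, y in zip(xs, ys))
    db = sum(-2 * (y - (m * x + b)) for x, y in zip(xs, ys))
    return dm, db
```

The analytic gradients can be sanity-checked against a finite-difference approximation of the SSE.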

Gradient Descent Gradient descent is an optimization algorithm for finding a minimum of a function. To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the gradient of the function at the current point. The size of these steps is governed by a learning rate.
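The update rule described above can be sketched as follows (a minimal illustration, not the slides' code; the function, learning rate, and step count are illustrative choices):

```python
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    # Repeatedly step in the direction of the negative gradient,
    # scaled by the learning rate.
    x = x0
    for _ in range(steps):
        x = x - learning_rate * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3);
# the iterates approach the minimum at x = 3.
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```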

How do we set the learning rate?
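In practice the learning rate is typically tuned by experiment. A sketch (not from the slides) of why the choice matters, using f(x) = x**2: a small rate converges, while an overly large one makes the iterates overshoot and diverge.

```python
def step(x, learning_rate):
    # One gradient-descent step on f(x) = x**2, where f'(x) = 2*x.
    return x - learning_rate * 2 * x

def run(learning_rate, steps=50, x0=1.0):
    x = x0
    for _ in range(steps):
        x = step(x, learning_rate)
    return x

good = run(0.1)  # |x| shrinks toward the minimum at 0
bad = run(1.1)   # |x| grows each step: the iterates diverge
```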

Excel Example

Java Example!

Python Example
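The Python code itself did not survive in this transcript; a guess at the kind of example such a slide might contain (the data and hyperparameters here are made up):

```python
# Fit y = m*x + b by gradient descent on noise-free data
# generated from the true line y = 2*x + 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]

m, b = 0.0, 0.0
learning_rate = 0.02
for _ in range(5000):
    # Partial derivatives of SSE with respect to m and b (chain rule).
    dm = sum(-2 * x * (y - (m * x + b)) for x, y in zip(xs, ys))
    db = sum(-2 * (y - (m * x + b)) for x, y in zip(xs, ys))
    m -= learning_rate * dm
    b -= learning_rate * db
```

After enough iterations m and b should approach the generating values 2 and 1.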

Gradient Descent Applications: Multivariable Regression
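A sketch of the multivariable case (not from the slides; the data, weights, and hyperparameters are made up): the same SSE gradient descent, now with one partial derivative per coefficient.

```python
# Fit y = w0 + w1*x1 + w2*x2 by gradient descent on noise-free data
# generated from known weights.
rows = [(1.0, 2.0), (2.0, 1.0), (3.0, 3.0), (4.0, 2.0), (0.0, 5.0)]
true_w = [1.0, 2.0, -1.0]  # intercept, x1 weight, x2 weight
ys = [true_w[0] + true_w[1] * x1 + true_w[2] * x2 for x1, x2 in rows]

w = [0.0, 0.0, 0.0]
learning_rate = 0.01
for _ in range(20000):
    grad = [0.0, 0.0, 0.0]
    for (x1, x2), y in zip(rows, ys):
        err = y - (w[0] + w[1] * x1 + w[2] * x2)
        # Chain rule: one partial derivative of SSE per coefficient.
        grad[0] += -2 * err
        grad[1] += -2 * err * x1
        grad[2] += -2 * err * x2
    w = [wi - learning_rate * gi for wi, gi in zip(w, grad)]
```

Each coefficient gets the same chain-rule treatment as the slope and intercept did in the one-variable case.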