Methods For Nonlinear Least-Squares Problems


Methods For Nonlinear Least-Squares Problems Jinxiang Chai

Applications
- Inverse kinematics
- Physically-based animation
- Data-driven motion synthesis
- Many other problems in graphics, vision, machine learning, robotics, etc.

Problem Definition
Many optimization problems can be formulated as a nonlinear least-squares problem:
    min_x F(x) = (1/2) Σ_{i=1}^{m} f_i(x)^2
where the f_i : R^n -> R, i=1,…,m are given functions, and m >= n.

Data Fitting
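The data-fitting figures on these slides are not recoverable from the transcript. As an illustrative stand-in (the exponential model, the numbers, and all names below are assumptions, not from the slides), a data-fitting problem is cast in the least-squares form above by taking the residuals to be f_i(x) = y_i - M(x, t_i) for a parametric model M:

import numpy as np

t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])   # sample points t_i (made-up values)
y = np.array([2.0, 3.3, 5.4, 9.0, 14.8])  # measurements y_i (made-up values)

def residuals(x):
    # f_i(x) = y_i - M(x, t_i) with the assumed model M(x, t) = x[0] * exp(x[1] * t)
    return y - x[0] * np.exp(x[1] * t)

def F(x):
    # least-squares objective F(x) = 0.5 * ||f(x)||^2
    r = residuals(x)
    return 0.5 * np.dot(r, r)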

Inverse Kinematics
Find the joint angles θ1, θ2 that minimize the distance between the character's end-effector position and a user-specified position C=(c1,c2). (Slide figure: a planar two-link arm with link lengths l1, l2, joint angles θ1, θ2, and base at (0,0).)
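A minimal Python sketch of this objective, assuming the standard planar two-link forward kinematics implied by the figure (the numeric link lengths and target position are made-up values):

import numpy as np

l1, l2 = 1.0, 0.8           # link lengths (assumed values)
c = np.array([1.2, 0.9])    # user-specified target position C = (c1, c2)

def end_effector(theta):
    # planar two-link forward kinematics from the base at (0, 0)
    t1, t2 = theta
    return np.array([l1 * np.cos(t1) + l2 * np.cos(t1 + t2),
                     l1 * np.sin(t1) + l2 * np.sin(t1 + t2)])

def ik_residuals(theta):
    # minimize 0.5 * ||end_effector(theta) - C||^2
    return end_effector(theta) - c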

Global Minimum vs. Local Minimum
Finding the global minimum of a nonlinear function is very hard; finding a local minimum is much easier.

Assumptions
The cost function F is differentiable and smooth enough that the following Taylor expansion is valid:
    F(x + h) = F(x) + h^T g + (1/2) h^T H h + O(||h||^3)
where g = F'(x) is the gradient and H = F''(x) is the Hessian.

Gradient Descent
Objective function, expanded to first order: F(x + h) ≈ F(x) + h^T F'(x)
Which direction is optimal?

Gradient Descent
Which direction is optimal? The steepest-descent direction h = -F'(x): among all directions of a given length, it gives the largest decrease in F.

Gradient Descent
A first-order optimization algorithm: to find a local minimum of a function, take steps proportional to the negative of the gradient of the function at the current point.

Gradient Descent
Initialize k=0, choose x0
While k<kmax
    x_{k+1} = x_k - α_k F'(x_k), k = k+1
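A minimal runnable sketch of this loop in Python; the fixed step size alpha and the finite-difference gradient are assumptions added for self-containedness (the slides leave the step-size rule unspecified):

import numpy as np

def numerical_gradient(F, x, eps=1e-6):
    # central-difference approximation of F'(x)
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (F(x + e) - F(x - e)) / (2.0 * eps)
    return g

def gradient_descent(F, x0, alpha=1e-2, kmax=1000):
    # x_{k+1} = x_k - alpha * F'(x_k)
    x = np.asarray(x0, dtype=float)
    for k in range(kmax):
        x = x - alpha * numerical_gradient(F, x)
    return x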

Newton's Method
Quadratic approximation (1D case): F(x + h) ≈ F(x) + F'(x) h + (1/2) F''(x) h^2
What is the minimum of the quadratic approximation? Setting its derivative to zero gives h = -F'(x)/F''(x).

Newton's Method
High-dimensional case: F(x + h) ≈ F(x) + h^T F'(x) + (1/2) h^T F''(x) h
What is the optimal direction? The Newton direction h = -[F''(x)]^{-1} F'(x).

Newton's Method
Initialize k=0, choose x0
While k<kmax
    x_{k+1} = x_k - [F''(x_k)]^{-1} F'(x_k), k = k+1
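A hedged Python sketch of the Newton iteration above; it takes the gradient and Hessian as user-supplied functions and, anticipating the cost point on the next slide, solves a linear system instead of forming the Hessian inverse explicitly:

import numpy as np

def newton(grad, hess, x0, kmax=20):
    # x_{k+1} = x_k + h_k, where F''(x_k) h_k = -F'(x_k)
    x = np.asarray(x0, dtype=float)
    for k in range(kmax):
        h = np.linalg.solve(hess(x), -grad(x))  # avoids computing the inverse
        x = x + h
    return x

# Example (assumed for illustration): F(x) = (x1 - 1)^2 + 2*(x2 + 2)^2 is
# quadratic, so a single Newton step reaches the minimizer (1, -2).
xmin = newton(lambda x: np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 2.0)]),
              lambda x: np.diag([2.0, 4.0]),
              [0.0, 0.0])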

Newton's Method
Finding the inverse of the Hessian matrix is often expensive. Approximation methods are often used:
- conjugate gradient method
- quasi-Newton method

Comparison
Newton's method vs. gradient descent: Newton's method uses second-order (curvature) information and typically converges in far fewer iterations near a minimum, but each iteration needs the Hessian and is more expensive; gradient descent uses only first-order information and takes cheap but potentially many steps.

Gauss-Newton Method
Often used to solve nonlinear least-squares problems.
Define f(x) = (f_1(x), …, f_m(x))^T with Jacobian J(x), so that F(x) = (1/2) ||f(x)||^2.
We have F'(x) = J(x)^T f(x) and F''(x) = J(x)^T J(x) + Σ_i f_i(x) f_i''(x).

Gauss-Newton Method
In general, we want to minimize a sum of squared function values:
    min_x F(x) = (1/2) Σ_{i=1}^{m} f_i(x)^2
Linearizing the residuals around the current point, f(x + h) ≈ f(x) + J(x)h, turns F into a quadratic function of the step h:
    F(x + h) ≈ L(h) = (1/2) ||f(x) + J(x)h||^2 = F(x) + h^T J^T f + (1/2) h^T J^T J h
Minimizing the quadratic function L(h) gives the Gauss-Newton step: solve (J^T J)h = -J^T f.
Unlike Newton's method, second derivatives are not required.

Gauss-Newton Method
Initialize k=0, choose x0
While k<kmax
    Solve (J^T J) h_k = -J^T f, set x_{k+1} = x_k + h_k, k = k+1
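A minimal Python sketch of this loop (the function names are illustrative); it can be paired with, e.g., the residuals of the data-fitting sketch earlier plus a hand-coded Jacobian:

import numpy as np

def gauss_newton(f, J, x0, kmax=100):
    # each step linearizes f and solves the normal equations for the step h
    x = np.asarray(x0, dtype=float)
    for k in range(kmax):
        r, Jx = f(x), J(x)
        h = np.linalg.solve(Jx.T @ Jx, -Jx.T @ r)  # fails if J^T J is singular
        x = x + h
    return x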

Gauss-Newton Method
Any problem? The quadratic function is minimized by solving (J^T J)h = -J^T f, but J^T J may be singular (e.g., when J is rank-deficient), so the solution might not be unique!
Fix: add a regularization term - this is the idea behind the Levenberg-Marquardt method.

Levenberg-Marquardt Method
In general, we want to minimize a sum of squared function values. Adding a regularization (damping) term μI, μ > 0, to the Gauss-Newton quadratic function makes the step always well defined:
    (J^T J + μI) h = -J^T f
Large μ gives a short step close to the steepest-descent direction; small μ gives a step close to the Gauss-Newton step.

Levenberg-Marquardt Method
Initialize k=0, choose x0 and μ0>0
While k<kmax
    Solve (J^T J + μ_k I) h_k = -J^T f
    If the step reduces F: set x_{k+1} = x_k + h_k and decrease μ; else keep x_{k+1} = x_k and increase μ
    k = k+1
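A hedged Python sketch of the loop above; the simple halving/doubling schedule for μ is an assumed update rule (the slides do not give one, and production implementations such as levmar use a gain-ratio update instead):

import numpy as np

def levenberg_marquardt(f, J, x0, mu=1e-3, kmax=100):
    x = np.asarray(x0, dtype=float)
    for k in range(kmax):
        r, Jx = f(x), J(x)
        A = Jx.T @ Jx + mu * np.eye(len(x))  # damped normal equations
        h = np.linalg.solve(A, -Jx.T @ r)
        r_new = f(x + h)
        if np.dot(r_new, r_new) < np.dot(r, r):
            x, mu = x + h, 0.5 * mu          # good step: accept, reduce damping
        else:
            mu = 2.0 * mu                    # bad step: reject, increase damping
    return x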

Stopping Criteria
Criterion 1: reach the number of iterations specified by the user: k > kmax
Criterion 2: the current function value is smaller than a user-specified threshold: F(x_k) < σ_user
Criterion 3: the change in function value is smaller than a user-specified threshold: ||F(x_k) - F(x_{k-1})|| < ε_user
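The three criteria combined into a single check in Python (the default threshold values are placeholders, not from the slides):

def should_stop(k, kmax, F_curr, F_prev, sigma_user=1e-12, eps_user=1e-10):
    return (k > kmax                              # criterion 1: iteration budget
            or F_curr < sigma_user                # criterion 2: small function value
            or abs(F_curr - F_prev) < eps_user)   # criterion 3: small change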

Levmar Library
An implementation of the Levenberg-Marquardt algorithm: http://www.ics.forth.gr/~lourakis/levmar/

Constrained Nonlinear Optimization
Finding the minimum value while satisfying some constraints.