Methods For Nonlinear Least-Square Problems (presentation transcript)

1 Methods For Nonlinear Least-Square Problems
Jinxiang Chai

2 Applications Inverse kinematics Physically-based animation
Data-driven motion synthesis Many other problems in graphics, vision, machine learning, robotics, etc.

3 Problem Definition Many optimization problems can be formulated as a nonlinear least-squares problem: minimize F(x) = (1/2) * sum_{i=1..m} f_i(x)^2, where f_i: R^n -> R, i = 1,…,m are given functions and m >= n.

4 Data Fitting

5 Data Fitting

6 Inverse Kinematics Find the joint angles θ1, θ2 that minimize the distance between the character's end-effector position and a user-specified position. [Figure: planar two-link arm with link lengths l1, l2, joint angles θ1, θ2, end effector at C = (c1, c2), base at (0, 0).]
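The two-link setup can be expressed as a forward-kinematics function plus a least-squares objective. A minimal sketch in Python/NumPy, assuming unit link lengths (the function names are illustrative, not part of the slides):

```python
import numpy as np

def end_effector(theta, l1=1.0, l2=1.0):
    """Forward kinematics of a planar two-link arm based at (0, 0)."""
    t1, t2 = theta
    x = l1 * np.cos(t1) + l2 * np.cos(t1 + t2)
    y = l1 * np.sin(t1) + l2 * np.sin(t1 + t2)
    return np.array([x, y])

def ik_objective(theta, target, l1=1.0, l2=1.0):
    """Least-squares IK objective F(theta) = 1/2 ||end_effector(theta) - C||^2."""
    r = end_effector(theta, l1, l2) - target
    return 0.5 * r @ r
```

Any of the methods on the following slides can then be applied to this objective to recover θ.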

7 Global Minimum vs. Local Minimum
Finding the global minimum of a nonlinear function is generally very hard; finding a local minimum is much easier.

8 Assumptions The cost function F is differentiable and smooth enough that the following Taylor expansion is valid: F(x + h) = F(x) + h^T g + (1/2) h^T H h + O(||h||^3), where g = F'(x) is the gradient and H = F''(x) is the Hessian.

9 Gradient Descent Objective function: F(x). Which descent direction is optimal? The steepest-descent direction h = -F'(x): among all directions, it gives the fastest local decrease of F.

11 Gradient Descent A first-order optimization algorithm.
To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the gradient at the current point: x_{k+1} = x_k - α F'(x_k), with step size α > 0.

12 Gradient Descent Initialize k = 0, choose x0. While k < kmax: set x_{k+1} = x_k - α_k F'(x_k); k = k + 1; stop early if a convergence criterion is met.
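The loop above might look like this in Python/NumPy (a minimal sketch: the fixed step size alpha and the gradient-norm stopping test are assumptions, not specified on the slide):

```python
import numpy as np

def gradient_descent(grad, x0, alpha=0.1, kmax=1000, tol=1e-8):
    """Minimize F by stepping along -grad(F); alpha is a fixed step size."""
    x = np.asarray(x0, dtype=float)
    for _ in range(kmax):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient (nearly) vanishes
            break
        x = x - alpha * g             # step opposite the gradient
    return x

# Example: F(x) = ||x - 1||^2 has gradient 2(x - 1) and minimum at x = (1, 1).
x_min = gradient_descent(lambda x: 2 * (x - np.ones(2)), np.zeros(2))
```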

13 Newton’s Method Quadratic approximation around the current point x: F(x + h) ≈ F(x) + F'(x) h + (1/2) F''(x) h^2.
What is the minimizer of this quadratic approximation? Setting its derivative with respect to h to zero gives h = -F'(x) / F''(x).

14 Newton’s Method High-dimensional case: F(x + h) ≈ F(x) + h^T g + (1/2) h^T H h. What is the optimal direction? The Newton step h = -H^{-1} g, where g is the gradient and H is the Hessian.

15 Newton’s Method Initialize k = 0, choose x0. While k < kmax: solve H(x_k) h_k = -F'(x_k); set x_{k+1} = x_k + h_k; k = k + 1.
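A minimal sketch of the iteration, solving the linear system H h = -g rather than forming H^{-1} explicitly (the quadratic test function is illustrative):

```python
import numpy as np

def newton(grad, hess, x0, kmax=100, tol=1e-10):
    """Newton's method: at each step solve H(x_k) h = -grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(kmax):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        h = np.linalg.solve(hess(x), -g)   # Newton step, no explicit inverse
        x = x + h
    return x

# Example: F(x) = x1^2 + 10*x2^2 is quadratic, so Newton reaches the minimum in one step.
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
hess = lambda x: np.diag([2.0, 20.0])
x_min = newton(grad, hess, np.array([3.0, -4.0]))
```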

16 Newton’s Method Computing and inverting the Hessian matrix is often expensive. Approximation methods are often used:
- conjugate gradient method
- quasi-Newton methods

17 Comparison Newton’s method vs. gradient descent: Newton’s method converges much faster near a minimum, but each iteration is more expensive since it requires second derivatives; gradient descent is cheap per iteration but can be slow to converge.

18 Gauss-Newton Method Often used to solve nonlinear least-squares problems. Define the residual vector f(x) = (f_1(x), …, f_m(x))^T and its Jacobian J(x), with (J(x))_{ij} = ∂f_i/∂x_j. We have F(x) = (1/2) f(x)^T f(x) and F'(x) = J(x)^T f(x).

19 Gauss-Newton Method In general, we want to minimize a sum of squared function values, F(x) = (1/2) f(x)^T f(x). Linearizing the residuals, f(x + h) ≈ f(x) + J(x) h, gives the quadratic model L(h) = (1/2) ||f(x) + J(x) h||^2.

20 Gauss-Newton Method Minimizing the quadratic model L(h) = (1/2) ||f(x) + J(x) h||^2 yields the normal equations (J^T J) h = -J^T f. Unlike Newton’s method, second derivatives are not required.

26 Gauss-Newton Method Initialize k = 0, choose x0. While k < kmax: solve (J^T J) h_k = -J^T f at x_k; set x_{k+1} = x_k + h_k; k = k + 1.
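The Gauss-Newton loop can be sketched as follows; the exponential data-fitting example is hypothetical, chosen to echo the earlier data-fitting slides:

```python
import numpy as np

def gauss_newton(residual, jac, x0, kmax=100, tol=1e-10):
    """Gauss-Newton: at each step solve the normal equations (J^T J) h = -J^T f."""
    x = np.asarray(x0, dtype=float)
    for _ in range(kmax):
        f, J = residual(x), jac(x)
        h = np.linalg.solve(J.T @ J, -J.T @ f)   # Gauss-Newton step
        x = x + h
        if np.linalg.norm(h) < tol:
            break
    return x

# Hypothetical data fit: model y = a * exp(b * t), residuals f_i = a*exp(b*t_i) - y_i.
t = np.linspace(0, 1, 20)
y = 2.0 * np.exp(-1.5 * t)   # noise-free data generated with a = 2, b = -1.5
res = lambda x: x[0] * np.exp(x[1] * t) - y
jac = lambda x: np.column_stack([np.exp(x[1] * t),            # df/da
                                 x[0] * t * np.exp(x[1] * t)])  # df/db
x_fit = gauss_newton(res, jac, np.array([1.0, 0.0]))
```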

29 Gauss-Newton Method Any problem? The solution of the normal equations (J^T J) h = -J^T f might not be unique: when J is rank-deficient, J^T J is singular.

30 Gauss-Newton Method Fix: add a regularization term! Solving (J^T J + μI) h = -J^T f with μ > 0 always has a unique solution.

31 Levenberg-Marquardt Method
We still want to minimize a sum of squared function values, but we solve the regularized (damped) normal equations (J^T J + μI) h = -J^T f with μ > 0.

34 Levenberg-Marquardt Method
Initialize k = 0, choose x0 and μ0 > 0. While k < kmax: solve (J^T J + μ_k I) h_k = -J^T f at x_k; if F(x_k + h_k) < F(x_k), accept the step and decrease μ (closer to Gauss-Newton); otherwise reject it and increase μ (closer to gradient descent); k = k + 1.
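A minimal sketch of the Levenberg-Marquardt loop with a simple μ-update rule (halve μ on success, double on failure; a common heuristic assumed here rather than taken from the slide). The Rosenbrock test problem, written in least-squares form, is illustrative:

```python
import numpy as np

def levenberg_marquardt(residual, jac, x0, mu=1e-3, kmax=500, tol=1e-10):
    """LM: solve (J^T J + mu*I) h = -J^T f; adapt mu by whether F decreases."""
    x = np.asarray(x0, dtype=float)
    f = residual(x)
    F = 0.5 * f @ f
    for _ in range(kmax):
        J = jac(x)
        g = J.T @ f
        if np.linalg.norm(g) < tol:
            break
        h = np.linalg.solve(J.T @ J + mu * np.eye(len(x)), -g)
        f_new = residual(x + h)
        F_new = 0.5 * f_new @ f_new
        if F_new < F:                  # accepted: behave more like Gauss-Newton
            x, f, F = x + h, f_new, F_new
            mu *= 0.5
        else:                          # rejected: behave more like gradient descent
            mu *= 2.0
    return x

# Rosenbrock in least-squares form: f = (10*(x2 - x1^2), 1 - x1), minimum at (1, 1).
res = lambda x: np.array([10 * (x[1] - x[0] ** 2), 1 - x[0]])
jac = lambda x: np.array([[-20 * x[0], 10.0], [-1.0, 0.0]])
x_min = levenberg_marquardt(res, jac, np.array([-1.2, 1.0]))
```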

37 Stopping Criteria Criterion 1: the iteration count reaches a user-specified maximum, k > k_max. Criterion 2: the current function value falls below a user-specified threshold, F(x_k) < σ_user. Criterion 3: the change in function value falls below a user-specified threshold, |F(x_k) - F(x_{k-1})| < ε_user.

38 Levmar Library levmar: a C/C++ implementation of the Levenberg-Marquardt algorithm.

39 Constrained Nonlinear Optimization
Finding the minimum of F(x) while satisfying some constraints, e.g., minimize F(x) subject to c_i(x) = 0 and/or c_j(x) >= 0.

