Non-linear Least-Squares


Non-linear Least-Squares
Mariolino De Cecco, Nicolò Biasi

Why non-linear? Either the model itself is non-linear (e.g. in joint or position parameters), or the error function is non-linear.

Setup. Let f be a function such that $b \approx f(a; x)$, where x is a vector of parameters, and let $\{(a_k, b_k)\}$ be a set of measurements/constraints. We fit f to the data by solving:

$\min_x \; \tfrac{1}{2} \sum_k \| f(a_k; x) - b_k \|^2$
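As a minimal sketch of this setup in NumPy (the names `model`, `residuals`, and `objective` are illustrative, not from the slides):

```python
import numpy as np

def residuals(x, a, b, model):
    """Residual vector r_k = f(a_k; x) - b_k for a given model f."""
    return np.array([model(ak, x) for ak in a]) - np.asarray(b)

def objective(x, a, b, model):
    """Least-squares objective: 0.5 * sum_k r_k^2."""
    r = residuals(x, a, b, model)
    return 0.5 * float(r @ r)
```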

Example. In our case x is the set of superquadric parameters: shape (e1, e2) and size (a1, a2, a3); a is a 3D point; f is the superquadric in implicit form. The residuals are $r_k = f(a_k; x) - 1$.
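The slides do not spell out the implicit form, so the sketch below assumes the standard superquadric inside-outside function; the residual is its value minus 1, which vanishes exactly on the surface:

```python
import numpy as np

def superquadric_residual(p, x):
    """r = F(p; x) - 1, with F the standard inside-outside function.

    p: 3D point (px, py, pz); x = (e1, e2, a1, a2, a3) shape/size parameters.
    """
    e1, e2, a1, a2, a3 = x
    px, py, pz = np.abs(p)  # absolute values keep fractional powers real
    F = ((px / a1)**(2.0 / e2) + (py / a2)**(2.0 / e2))**(e2 / e1) \
        + (pz / a3)**(2.0 / e1)
    return F - 1.0
```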

Overview:
- Existence and uniqueness of a minimum
- Steepest descent
- Newton's method
- Gauss-Newton method
- Levenberg-Marquardt method

A non-linear example: the Rosenbrock function, $f(x, y) = (1 - x)^2 + 100\,(y - x^2)^2$, with global minimum at (1, 1). [Figure: surface plot of the function and a zoom near the minimum.]
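For reference, a minimal NumPy definition of this function and its analytic gradient, which the sketches below can reuse as a test problem:

```python
import numpy as np

def rosenbrock(x):
    """Rosenbrock function; global minimum f = 0 at (1, 1)."""
    return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    """Analytic gradient of the Rosenbrock function."""
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
        200.0 * (x[1] - x[0]**2),
    ])
```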

Existence of a minimum. A local minimum $x^*$ is characterized by $\nabla f(x^*) = 0$ together with $\nabla^2 f(x^*)$ positive semi-definite (positive definite is sufficient).

Descent algorithm:
- Start at an initial position x0.
- Until convergence: find a minimizing step $dx_k$ using a local approximation of f, and set $x_{k+1} = x_k + dx_k$.
This produces a sequence x0, x1, ..., xn such that $f(x_0) > f(x_1) > \dots > f(x_n)$.
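Schematically, the loop looks as follows (a sketch; `local_step` stands for whatever local approximation each of the methods below uses to produce $dx_k$):

```python
import numpy as np

def descent(f, local_step, x0, tol=1e-8, max_iter=1000):
    """Generic descent loop: x_{k+1} = x_k + dx_k, with f decreasing."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dx = local_step(x)            # minimizing step from a local model of f
        if np.linalg.norm(dx) < tol:  # step negligibly small: converged
            break
        x = x + dx
    return x
```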

Approximation using a Taylor series. The general formulation is the second-order expansion $f(x + dx) \approx f(x) + \nabla f(x)^T dx + \tfrac{1}{2}\, dx^T \nabla^2 f(x)\, dx$.

Approximation using a Taylor series. But in the case of a least-squares problem, $f(x) = \tfrac{1}{2}\|r(x)\|^2$ with Jacobian $J = \partial r / \partial x$, the derivatives take a special form: $\nabla f = J^T r$ and $\nabla^2 f = J^T J + \sum_k r_k \nabla^2 r_k$.
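In code these quantities come directly from the residual vector and its Jacobian (a sketch; J is assumed to be the m-by-n matrix of partial derivatives of the residuals with respect to the parameters):

```python
import numpy as np

def ls_gradient(J, r):
    """Gradient of 0.5 * ||r||^2:  grad f = J^T r."""
    return J.T @ r

def gauss_newton_hessian(J):
    """Approximate Hessian J^T J (drops the sum_k r_k * Hess(r_k) term)."""
    return J.T @ J
```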

Steepest descent. Step $dx_k = -\alpha_k \nabla f(x_k)$, where $\alpha_k$ is chosen such that $f(x_k - \alpha_k \nabla f(x_k))$ is (approximately) minimized, using a line search algorithm.
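A sketch of one such step with a simple backtracking line search (the slides do not specify which line search is used; the parameters `alpha0` and `shrink` are illustrative defaults):

```python
import numpy as np

def steepest_descent_step(x, f, grad, alpha0=1.0, shrink=0.5, max_backtracks=30):
    """One steepest-descent step: dx = -alpha * grad f(x), alpha by backtracking."""
    g = grad(x)
    alpha = alpha0
    for _ in range(max_backtracks):
        if f(x - alpha * g) < f(x):  # accept the first alpha that decreases f
            return -alpha * g
        alpha *= shrink              # otherwise shrink the step and retry
    return -alpha * g                # fall back to the last (tiny) step tried
```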

[Figure: cross-section of the function in the plane of the steepest-descent direction.]

[Figure: steepest descent on the Rosenbrock function, 1000 iterations.]

Newton's method. Determine the step $dx_k$ by minimizing a second-order approximation $N(dx) = f(x_k) + \nabla f(x_k)^T dx + \tfrac{1}{2}\, dx^T \nabla^2 f(x_k)\, dx$. At the minimum of N: $\nabla^2 f(x_k)\, dx_k = -\nabla f(x_k)$.
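The step is a single linear solve (a sketch, assuming an explicit Hessian function `hess` is available):

```python
import numpy as np

def newton_step(x, grad, hess):
    """Solve hess(x) dx = -grad(x): the minimizer of the quadratic model N."""
    return np.linalg.solve(hess(x), -grad(x))
```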

Problem: if $\nabla^2 f$ is not positive definite, then $dx_k$ is not guaranteed to be a descent direction, and the step can increase the error function. The remedy is to use a positive semi-definite approximation of the Hessian based on the Jacobian (as in Gauss-Newton and quasi-Newton methods).

Gauss-Newton method. Step: solve $(J^T J)\, dx_k = -J^T r$, i.e. Newton's step with the approximate Hessian $\nabla^2 f \approx J^T J$. Advantages: no second-order derivatives are needed, and $J^T J$ is positive semi-definite.
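A minimal Gauss-Newton iteration (sketch; `res` and `jac` are assumed to return the residual vector and its Jacobian, and J is assumed to have full column rank so the normal equations are solvable):

```python
import numpy as np

def gauss_newton(res, jac, x0, tol=1e-8, max_iter=100):
    """Gauss-Newton: solve (J^T J) dx = -J^T r at each iterate."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r, J = res(x), jac(x)
        dx = np.linalg.solve(J.T @ J, -J.T @ r)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x
```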

[Figure: Gauss-Newton on the Rosenbrock function, 48 evaluations.]

Problem with Gauss-Newton: when the function is locally highly non-linear, the full second-order step can overshoot and lead to non-convergence.

Levenberg-Marquardt algorithm. Blends steepest descent and Gauss-Newton: at each step, solve for the descent direction $(J^T J + \lambda I)\, dx_k = -J^T r$. A large $\lambda$ gives a (damped) steepest-descent step; a small $\lambda$ gives a Gauss-Newton step.

Managing the damping parameter. General approach:
- If a step fails (the error increases), reject it and increase the damping until a step succeeds.
- If a step succeeds, accept it and decrease the damping to take larger, more Gauss-Newton-like steps.
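Putting the damped step and this schedule together (a sketch; the factor-of-10 update of lambda is a common convention, not necessarily the one used in the slides):

```python
import numpy as np

def levenberg_marquardt(res, jac, x0, lam=1e-3, tol=1e-8, max_iter=200):
    """LM: solve (J^T J + lam*I) dx = -J^T r, adapting lam to step success."""
    x = np.asarray(x0, dtype=float)
    cost = 0.5 * res(x) @ res(x)
    for _ in range(max_iter):
        r, J = res(x), jac(x)
        A = J.T @ J
        dx = np.linalg.solve(A + lam * np.eye(A.shape[0]), -J.T @ r)
        new_cost = 0.5 * res(x + dx) @ res(x + dx)
        if new_cost < cost:   # success: accept, move toward Gauss-Newton
            x, cost = x + dx, new_cost
            lam *= 0.1
            if np.linalg.norm(dx) < tol:
                break
        else:                 # failure: reject, move toward steepest descent
            lam *= 10.0
    return x
```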

Improved damping: replace $\lambda I$ with $\lambda\,\mathrm{diag}(J^T J)$ (Marquardt's scaling). This means that in the steepest-descent phase, if a direction has a low second-order derivative, the step in that direction is increased.

[Figure: Levenberg-Marquardt on the Rosenbrock function, 90 evaluations.]