Zhen Lu, CPACT, University of Newcastle / MDC Technology. Reduced Hessian Sequential Quadratic Programming (SQP)

Background BSc in Automatic Control, Tsinghua University, China. MSc in Automatic Control, Tsinghua University, China. First-year PhD student, CPACT, University of Newcastle, UK.

Research area My research areas: process optimization, on-line optimization, and optimizing control. Research project: Optimization of Batch Reactor Operations (MDC).

Introduction Disadvantages of SQP; Advantages of reduced Hessian SQP; Description of rSQP; Implementation of rSQP; Summary; Numerical examples; Conclusion; Future work.

Disadvantages of SQP methods In the SQP method, a large, sparse QP sub-problem must be solved at each iteration, which can be computationally intensive. Many chemical process optimization problems have only a small number of degrees of freedom, yet the full-space method approximates the Hessian matrix of the Lagrangian function over all of the variables, using a mixture of analytical second derivatives and many small, dense quasi-Newton updates.

Advantages of reduced Hessian SQP The reduced Hessian SQP method is designed for large Non-linear Programming (NLP) problems with few degrees of freedom. The approach only requires projected second-derivative information, which can often be approximated efficiently with quasi-Newton update formulae. This feature makes rSQP especially attractive for process systems, where second-derivative information may be difficult or computationally expensive to obtain.

Advantages of reduced Hessian SQP Reduced Hessian SQP methods project the quadratic programming sub-problem onto the reduced space of the independent variables. Refinements of the reduced Hessian SQP approach guarantee a one-step superlinear convergence rate.

Description of rSQP Optimization problems of the form:
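The formula on this slide did not survive the transcript; in the standard reduced Hessian SQP setting it is the equality-constrained NLP

\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad c(x) = 0,

where $f: \mathbb{R}^n \to \mathbb{R}$ and $c: \mathbb{R}^n \to \mathbb{R}^m$ are smooth and $m \le n$, so the problem has $n - m$ degrees of freedom.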

Description of rSQP Quadratic sub-problem:
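The sub-problem itself is missing from the transcript; the standard QP solved at the iterate $x_k$ is

\min_{d} \; g_k^T d + \tfrac{1}{2}\, d^T W_k\, d \quad \text{subject to} \quad A_k^T d + c_k = 0,

where $g_k = \nabla f(x_k)$, $c_k = c(x_k)$, $A_k = \nabla c(x_k)$, and $W_k$ is (an approximation of) the Hessian of the Lagrangian.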

Description of rSQP To compute the search direction, the null-space approach is used. The solution of the QP sub-problem is written as

d = Y\, p_Y + Z\, p_Z,

where $Z$ is an $n \times (n-m)$ matrix spanning the null space of $A^T$ (so that $A^T Z = 0$) and $Y$ is an $n \times m$ matrix spanning the range space of $A$, so that $[\,Y \; Z\,]$ is nonsingular.

Description of rSQP Substituting this decomposition into the linearized constraints determines the range-space component, and the QP sub-problem can be expressed in the reduced space of $p_Z$ alone. The solution is

p_Y = -(A^T Y)^{-1} c_k, \qquad p_Z = -(Z^T W Z)^{-1}\left(Z^T g_k + Z^T W Y\, p_Y\right).

Description of rSQP The components of x are grouped into m basic (dependent) variables and n-m non-basic (control) variables. The columns of A are grouped accordingly:
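The partition shown on the slide is not preserved; with the usual coordinate (variable-reduction) choice of bases it reads

A^T = \begin{bmatrix} C & N \end{bmatrix}, \qquad Z = \begin{bmatrix} -C^{-1} N \\ I \end{bmatrix}, \qquad Y = \begin{bmatrix} I \\ 0 \end{bmatrix},

where $C$ is the $m \times m$ (nonsingular) block for the basic variables and $N$ the $m \times (n-m)$ block for the control variables.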

Description of rSQP When the number of variables n is large and the number of degrees of freedom n-m is small, it is attractive to approximate only the reduced Hessian. To ensure that good search directions are always generated, the algorithm approximates the cross term $Z^T W Y\, p_Y$ by a vector $w_k$:
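With this approximation the null-space step solves the reduced QP; a standard statement of it (the slide's formula is lost) is

\min_{p_Z} \; \left(Z^T g_k + w_k\right)^T p_Z + \tfrac{1}{2}\, p_Z^T B_k\, p_Z, \qquad w_k \approx Z^T W_k Y\, p_Y,

so only the $(n-m) \times (n-m)$ reduced Hessian approximation $B_k$ is needed, never the full $W_k$.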

Description of rSQP is approximated by a quasi-Newton matrix The reduced Hessian matrix is approximated by a positive definite quasi-Newton matrix

Implementation of rSQP Update S
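The update formula is not preserved in the transcript. In reduced Hessian methods of this type (e.g. Biegler, Nocedal, and Schmid), the cross-term matrix $S$ is typically maintained by a Broyden (secant) update; a generic form is

S_{k+1} = S_k + \frac{\left(y_k - S_k s_k\right) s_k^T}{s_k^T s_k},

where $s_k$ and $y_k$ are the step and the corresponding change in projected gradient information; the exact choices depend on the variant.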

Implementation of rSQP Update B
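Likewise, a standard choice (assumed here, since the slide's formula is lost) is the BFGS update, which preserves positive definiteness of $B$ whenever $y_k^T s_k > 0$:

B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \frac{y_k y_k^T}{y_k^T s_k},

with $s_k$ the step in the null-space variables and $y_k$ the corresponding change in the reduced gradient $Z^T g$.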

Summary The algorithm does not require computation of the Hessian of the Lagrangian. It only makes use of first derivatives of the objective function and constraints. The reduced Hessian matrix is approximated by a positive definite quasi-Newton matrix.
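To make the structure concrete, here is a minimal Python sketch of one rSQP search-direction computation under the assumptions above (coordinate basis, cross term $w$ set to zero). The function name and the index-set interface are hypothetical illustrations, not from the presentation.

import numpy as np

def rsqp_direction(x, f_grad, c_val, c_jac, B, basic, nonbasic):
    """Compute the rSQP search direction d = Y p_Y + Z p_Z (sketch).

    f_grad : gradient of f at x, shape (n,)
    c_val  : constraint values c(x), shape (m,)
    c_jac  : constraint Jacobian A^T = dc/dx, shape (m, n)
    B      : positive definite reduced Hessian approximation, (n-m, n-m)
    basic, nonbasic : index lists partitioning the n variables
    """
    n, m = len(x), len(c_val)
    C = c_jac[:, basic]        # m x m block, assumed nonsingular
    N = c_jac[:, nonbasic]     # m x (n-m) block

    # Coordinate bases: A^T Z = 0 and [Y Z] nonsingular.
    Z = np.zeros((n, n - m))
    Z[basic, :] = -np.linalg.solve(C, N)
    Z[nonbasic, :] = np.eye(n - m)
    Y = np.zeros((n, m))
    Y[basic, :] = np.eye(m)

    # Range-space step from the linearized constraints: (A^T Y) p_Y = -c.
    p_Y = np.linalg.solve(c_jac @ Y, -c_val)

    # Null-space step from the reduced QP, with the cross term
    # w = Z^T W Y p_Y dropped for simplicity.
    p_Z = np.linalg.solve(B, -(Z.T @ f_grad))

    return Y @ p_Y + Z @ p_Z

Called once per iteration, the returned direction would feed a line search, after which B (and, in the full method, S) is refreshed with the quasi-Newton updates above.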

Numerical examples Model 1: degrees of freedom = 1

Numerical examples Model 2: degrees of freedom = 50

Numerical examples Model 3: x0 = [1.1, 1.1, …, 1.1], degrees of freedom = 1

Numerical examples Model 3: x0 = [0.1, 0.1, …, 0.1], degrees of freedom = 1

Numerical examples Model 3: x0 = [2.1, 2.1, …, 2.1], degrees of freedom = 1

Conclusion The algorithm is well suited for large problems with few degrees of freedom. The reduced Hessian SQP approach avoids computing the full Hessian matrix and thereby cuts down the computational cost. The reduced Hessian SQP algorithm is at least as robust as the full-space SQP method.

Future work Use differential-algebraic equations as constraints. Apply the reduced Hessian SQP method to batch and continuous processes.