

1
Reduced Hessian Sequential Quadratic Programming (SQP)
Zhen Lu, CPACT, University of Newcastle / MDC Technology

2
Background
- BSc in Automatic Control, Tsinghua University, China
- MSc in Automatic Control, Tsinghua University, China
- First-year PhD student, CPACT, University of Newcastle, UK

3
Research area
- Process optimization; on-line optimization; optimizing control
- Research project: Optimization of Batch Reactor Operations (MDC)

4
Introduction
- Disadvantages of SQP
- Advantages of reduced Hessian SQP
- Description of rSQP
- Implementation of rSQP
- Summary
- Numerical examples
- Conclusion
- Future work

5
Disadvantages of SQP methods
- In the SQP method, a large, sparse QP sub-problem must be solved at each iteration, which can be computationally expensive.
- Many chemical process optimization problems have a small number of degrees of freedom.
- The Hessian of the Lagrangian is approximated in the full space of the variables, using a mixture of analytical second derivatives and many small, dense quasi-Newton updates.

6
Advantages of reduced Hessian SQP
The reduced Hessian SQP method is designed for large non-linear programming (NLP) problems with few degrees of freedom. The approach requires only projected second derivative information, which can often be approximated efficiently with quasi-Newton update formulae. This feature makes rSQP especially attractive for process systems, where second derivative information may be difficult or computationally expensive to obtain.

7
Advantages of reduced Hessian SQP
Reduced Hessian SQP methods project the quadratic programming sub-problem into the reduced space of the independent variables. Refinements of the reduced Hessian SQP approach guarantee a one-step super-linear convergence rate.

8
Description of rSQP
Optimization problems of the form:

    min  f(x)
    s.t. c(x) = 0,

where f: R^n -> R and c: R^n -> R^m, with n large and the number of degrees of freedom n - m small.

9
Description of rSQP
Quadratic sub-problem at the current iterate x_k:

    min  g_k^T d + (1/2) d^T W_k d
    s.t. A_k d + c_k = 0,

where g_k is the gradient of f, c_k = c(x_k), A_k is the m x n constraint Jacobian, and W_k is the Hessian of the Lagrangian.

10
Description of rSQP
To compute the search direction, the null-space approach is used. The solution is written as

    d = Y_k p_Y + Z_k p_Z,

where Z_k is an n x (n - m) matrix spanning the null space of A_k, and Y_k is an n x m matrix spanning the range of A_k^T, so that [Y_k Z_k] is non-singular and A_k Z_k = 0.
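As a concrete illustration (not from the slides), orthonormal bases Y and Z with these properties can be obtained from a full QR factorisation of the constraint Jacobian's transpose; the variable names follow the decomposition above, and the random Jacobian is an assumed example:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5
A = rng.standard_normal((m, n))    # assumed m x n constraint Jacobian

# Full QR of A^T: the first m columns of Q span range(A^T) (the Y basis),
# the remaining n - m columns span the null space of A (the Z basis).
Q, _ = np.linalg.qr(A.T, mode="complete")
Y, Z = Q[:, :m], Q[:, m:]

print(np.allclose(A @ Z, 0.0))     # Z spans null(A), so A Z = 0
```

Any basis pair with A Z = 0 and [Y Z] non-singular works; the QR choice simply makes both bases orthonormal.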

11
Description of rSQP
Substituting d = Y_k p_Y + Z_k p_Z, the QP sub-problem can be expressed in the reduced space. The solution is:

    p_Y = -(A_k Y_k)^{-1} c_k,
    p_Z = -(Z_k^T W_k Z_k)^{-1} (Z_k^T g_k + Z_k^T W_k Y_k p_Y).
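A minimal numerical sketch of this two-step solution, assuming a random problem instance and forming the exact Hessian W for illustration (in rSQP proper, the reduced Hessian and the cross term are approximated rather than formed):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 5
A = rng.standard_normal((m, n))    # constraint Jacobian (assumed example)
g = rng.standard_normal(n)         # objective gradient
c = rng.standard_normal(m)         # constraint residual
M = rng.standard_normal((n, n))
W = M @ M.T + n * np.eye(n)        # positive definite stand-in for the Lagrangian Hessian

Q, _ = np.linalg.qr(A.T, mode="complete")
Y, Z = Q[:, :m], Q[:, m:]

pY = -np.linalg.solve(A @ Y, c)          # range-space step: restores feasibility
w = Z.T @ W @ (Y @ pY)                   # cross term Z^T W Y pY
B = Z.T @ W @ Z                          # reduced Hessian
pZ = -np.linalg.solve(B, Z.T @ g + w)    # null-space step: reduces the objective
d = Y @ pY + Z @ pZ

print(np.allclose(A @ d + c, 0.0))       # d satisfies the linearised constraints
```

The step d also satisfies the reduced optimality condition Z^T (g + W d) = 0, which is exactly what the p_Z formula enforces.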

12
Description of rSQP
The components of x are grouped into m basic (dependent) variables and n - m non-basic (control) variables. The columns of A are grouped accordingly:

    A_k = [C_k  N_k],

where C_k is an m x m non-singular matrix formed from the basic-variable columns.
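This partition yields the standard coordinate basis Z = [-C^{-1}N; I]; the sketch below uses an assumed random Jacobian, with C and N named as above:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 5
A = rng.standard_normal((m, n))    # assumed m x n constraint Jacobian

# Basic (dependent) columns C and non-basic (control) columns N.
C, N = A[:, :m], A[:, m:]

# Coordinate basis for the null space: A Z = C(-C^{-1}N) + N = 0.
Z = np.vstack([-np.linalg.solve(C, N), np.eye(n - m)])

print(np.allclose(A @ Z, 0.0))
```

Unlike the orthonormal QR basis, this Z only requires factorising the m x m block C, which is what makes the partition attractive for large sparse systems.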

13
Description of rSQP
When the number of variables n is large and the number of degrees of freedom n - m is small, it is attractive to approximate the reduced Hessian. To ensure that good search directions are always generated, the algorithm approximates the cross term Z_k^T W_k Y_k p_Y by a vector w_k.

14
Description of rSQP is approximated by a quasi-Newton matrix The reduced Hessian matrix is approximated by a positive definite quasi-Newton matrix

15
Implementation of rSQP
Update S

16
Implementation of rSQP
Update B (the positive definite quasi-Newton approximation of the reduced Hessian)

17
Summary
- The algorithm does not require computation of the Hessian of the Lagrangian.
- It uses only first derivatives of the objective function and constraints.
- The reduced Hessian matrix is approximated by a positive definite quasi-Newton matrix.
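Putting the pieces together, a minimal rSQP-style iteration on an assumed toy problem (min x1^2 + x2^2 subject to x1 + x2 = 2, one degree of freedom) illustrates that only first derivatives are needed; this is a sketch of the idea, not the slides' full algorithm:

```python
import numpy as np

f_grad = lambda x: 2.0 * x                   # gradient of x1^2 + x2^2
c = lambda x: np.array([x[0] + x[1] - 2.0])  # equality constraint residual
A = lambda x: np.array([[1.0, 1.0]])         # 1 x 2 constraint Jacobian

x = np.array([3.0, -1.0])
B = np.eye(1)                                # 1x1 reduced Hessian approximation
for _ in range(20):
    Ak = A(x)
    Q, _ = np.linalg.qr(Ak.T, mode="complete")
    Y, Z = Q[:, :1], Q[:, 1:]
    pY = -np.linalg.solve(Ak @ Y, c(x))      # range-space step
    rg = Z.T @ f_grad(x)                     # reduced gradient
    pZ = -np.linalg.solve(B, rg)             # null-space step (cross term neglected)
    d = Y @ pY + Z @ pZ
    s, rg_old = Z.T @ d, rg
    x = x + d
    y = Z.T @ f_grad(x) - rg_old             # change in reduced gradient
    if y @ s > 1e-12:                        # BFGS update, skipped on bad curvature
        Bs = B @ s
        B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
    if np.linalg.norm(d) < 1e-10:
        break

print(np.round(x, 6))                        # → [1. 1.]
```

The iterates converge to the true minimiser (1, 1); note that no second derivative of f or c is ever evaluated.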

18
Numerical examples
Model 1: degrees of freedom = 1

19
Numerical examples
Model 2: degrees of freedom = 50

20
Numerical examples
Model 3: x0 = [1.1, 1.1, …, 1.1], degrees of freedom = 1

21
Numerical examples
Model 3: x0 = [0.1, 0.1, …, 0.1], degrees of freedom = 1

22
Numerical examples
Model 3: x0 = [2.1, 2.1, …, 2.1], degrees of freedom = 1

23
Conclusion
- The algorithm is well suited to large problems with few degrees of freedom.
- The reduced Hessian SQP approach avoids computing the full Hessian matrix, cutting the cost of computation.
- The reduced Hessian SQP algorithm is at least as robust as the full-space SQP method.

24
Future work
- Use differential-algebraic equations as constraints.
- Apply the reduced Hessian SQP method to batch and continuous processes.

