
1 System Optimization (1). Liang Yu, Department of Biological Systems Engineering, Washington State University. 04.16.2013

2 Outline
Background of engineering optimization
Applications of optimization
Introduction
Fundamentals of optimization
Optimization Toolbox in MATLAB

3 Optimization in Process Plants

4 Engineering applications of optimization
Some typical applications from different engineering disciplines:
Design of water resources systems for maximum benefit
Design of pumps, turbines, and heat transfer equipment for maximum efficiency
Optimum design of chemical processing equipment and plants
Selection of a site for an industry
Optimum design of control systems

5 Application: Metabolic Flux Analysis
Flux Balance Analysis (FBA): in silico simulation, linear programming (LP), genome-scale.
  maximize ∑ c_i·v_i  subject to  S·v = 0,  lb < v < ub   (metabolic steady state)
13C-assisted Metabolic Flux Analysis: in vivo search, nonlinear programming (NLP), simplified model.
  minimize (MDV_exp − MDV_sim)^2  subject to  S·v = 0,  IDV = f(v, IMM, IDV),  MDV = M·IDV,  lb < v < ub   (metabolic and isotopic steady state)
IDV: isotopomer distribution vector; MDV: mass distribution vector
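
As a minimal sketch of how the FBA linear program maps onto MATLAB's Optimization Toolbox, the code below solves maximize c'·v subject to S·v = 0 and flux bounds with linprog. The toy stoichiometric matrix S, objective vector c, and bounds are made-up assumptions for illustration, not data from these slides.

    % FBA sketch: maximize c'*v subject to S*v = 0 and lb <= v <= ub.
    % linprog minimizes, so the objective is negated.
    S  = [ 1 -1 -1  0;               % hypothetical 2-metabolite, 4-flux toy network
           0  1  1 -1 ];
    c  = [0; 0; 0; 1];               % assume flux 4 plays the role of the "biomass" objective
    lb = zeros(4,1);                 % irreversible reactions
    ub = 10*ones(4,1);               % arbitrary flux caps
    [v, fval] = linprog(-c, [], [], S, zeros(2,1), lb, ub);
    maxObjective = -fval;            % undo the sign flip used for maximization
    disp(v.'), disp(maxObjective)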

6 Application for optimization of biorefinery configurations. Pham, Viet, and Mahmoud El-Halwagi. "Process synthesis and optimization of biorefinery configurations." AIChE Journal 58.4 (2012): 1212-1221.

7 Part of the branching trees for the production of bioalcohols from lignocellulosic biomass

8 Optimization Tree

9 Introduction
Definition: Optimization is the act of obtaining the best result under given circumstances. It can be defined as the process of finding the conditions that give the maximum or minimum value of a function.
Goal: either to minimize the effort required or to maximize the desired benefit.
An optimization problem can be linear or nonlinear. Nonlinear optimization is accomplished by numerical search methods, which are applied iteratively until a solution is reached. The search procedure is termed an algorithm.

10 Introduction
The minimum of f(x) is the same as the maximum of −f(x).
The optimum solution is found while satisfying the constraints; for a smooth unconstrained problem, the derivative must be zero at the optimum.

11 Introduction
Linear problems are solved by the simplex or graphical methods; the solution of a linear problem lies on the boundary of the feasible region.
The solution of a nonlinear problem can lie within the feasible region or on its boundary.
[Figures: solution of a linear problem; three-dimensional solution of a nonlinear problem]

12 Introduction
Optimization programming languages:
GAMS - General Algebraic Modeling System
LINDO - widely used in business applications
AMPL - A Mathematical Programming Language
Others: MPL, ILOG
Software with optimization capabilities: Excel (Solver), MATLAB, MathCAD, Mathematica, Maple, and others

13 Statement of an optimization problem
An optimization or mathematical programming problem can be stated as follows:
Find X = (x1, x2, ..., xn)^T which minimizes f(X)
subject to the constraints
g_j(X) ≤ 0, j = 1, 2, ..., m
l_j(X) = 0, j = 1, 2, ..., p

14 Fundamentals of Optimization
Single objective function f(x): maximization or minimization.
Design variables x_i, i = 1, 2, 3, ...
Constraints: inequality and equality.
Optimal points:
Local minimum/maximum: a point x* is a local optimum if no other x in its neighborhood has a better objective value.
Global minimum/maximum: a point x** is a global optimum if no other x in the entire search space has a better objective value.
Example of design variables and constraints used in optimization (solved with linprog in the sketch below):
Maximize X1 + 1.5 X2
Subject to:
X1 + X2 ≤ 150
0.25 X1 + 0.5 X2 ≤ 50
X1 ≥ 50
X2 ≥ 25
X1 ≥ 0, X2 ≥ 0
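
The example LP above is small enough to pass directly to linprog; the sketch below is one way to do it. The variable names and output formatting are my own, and linprog minimizes, so the objective is negated.

    % Maximize x1 + 1.5*x2 subject to the constraints listed above.
    f  = -[1; 1.5];                  % negate because linprog minimizes
    A  = [1 1; 0.25 0.5];            % inequality constraints A*x <= b
    b  = [150; 50];
    lb = [50; 25];                   % x1 >= 50 and x2 >= 25 (these imply x >= 0)
    [x, fval] = linprog(f, A, b, [], [], lb, []);
    fprintf('x1 = %.1f, x2 = %.1f, objective = %.1f\n', x(1), x(2), -fval)

With these numbers the optimum should land at the vertex x1 = 100, x2 = 50 with objective 175, consistent with slide 11's point that linear-programming solutions lie on the boundary of the feasible region.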

15 Fundamentals of Optimization
Global versus local optimization: a local optimum is also the global optimum if the function is convex.

16 Fundamentals of Optimization
A function f is convex if, for any point X_a on the segment between X_1 and X_2, f(X_a) lies at or below the corresponding point on the chord joining f(X_1) and f(X_2).
Convexity condition: the Hessian (second-order derivative) matrix of f must be positive semidefinite (all eigenvalues positive or zero).
[Figures: convex and nonconvex sets; convex function]

17 Mathematical Background
Slope or gradient of the objective function f: represents the direction in which the function increases or decreases most rapidly (see the sketch below).
Taylor series expansion: local approximation of f around a point using its derivatives.
Jacobian: matrix of first derivatives of a set of functions with respect to several variables.
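
To make the gradient and first-order Taylor idea concrete, here is a small forward-difference sketch. It reuses the objective function from slide 37; the evaluation point and step size are my own choices.

    % Forward-difference gradient: f(x + h*e_i) ≈ f(x) + h*df/dx_i.
    f = @(x) exp(x(1))*(4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + 1);
    x = [0.5; -1];                   % the minimizer found later by fminunc
    h = 1e-6;
    g = zeros(2,1);
    for i = 1:2
        e = zeros(2,1);  e(i) = 1;   % unit vector along coordinate i
        g(i) = (f(x + h*e) - f(x)) / h;
    end
    disp(g)                          % both components are ~0 at the minimum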

18 Mathematical Background
Slope, first-order condition (FOC): provides the function's slope information; the gradient is zero at a stationary point.
Hessian: matrix of second derivatives of a function of several variables; its definiteness indicates a minimum (positive definite) or a maximum (negative definite).
Second-order condition (SOC) for a minimum:
The eigenvalues of H(X*) are all positive.
The determinants of all leading principal minors of H(X*) are positive.
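
A quick numerical version of the second-order check, assuming a Hessian H(X*) is already available (the matrix below is a made-up example): both tests, positive eigenvalues and positive leading principal minors, should agree.

    % Second-order condition at a candidate point x*: H positive definite.
    H = [6 2; 2 4];                                        % hypothetical Hessian at x*
    eigsH  = eig(H);                                       % all > 0 for a minimum
    minors = arrayfun(@(k) det(H(1:k,1:k)), 1:size(H,1));  % leading principal minors
    isMin  = all(eigsH > 0) && all(minors > 0);
    fprintf('eigenvalues %s, minors %s, minimum: %d\n', ...
            mat2str(eigsH.', 3), mat2str(minors, 3), isMin)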

19 Optimization Algorithms
Deterministic: specific rules to move from one iteration to the next, using gradient and/or Hessian information.
Stochastic: probabilistic rules are used to generate subsequent iterates.
Optimal design: engineering design based on an optimization algorithm.
Lagrangian method: the objective function plus a linear combination of the constraints.

20 Optimization Methods - Deterministic
Direct search: uses objective function values to locate the minimum.
Gradient based: uses first or second derivatives of the objective function. A maximization problem is handled by minimizing −f(x).
Single-variable techniques:
Newton-Raphson: gradient-based technique (FOC).
Golden section search: iterative method with a shrinking bracket (step size), as sketched below.
Multivariable techniques (these make use of single-variable techniques, especially golden section):
Unconstrained optimization:
Powell's method: a non-gradient method based on quadratic (degree-2) polynomial approximations of the objective.
Gradient based: steepest descent (FOC) or least-squares minimum (LMS).
Hessian based: conjugate gradient (FOC) and BFGS (SOC).
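
As a sketch of the golden section idea mentioned above, the loop below shrinks a bracket [a, b] around the minimum of a single-variable function; the test function, bracket, and tolerance are illustrative assumptions.

    % Golden-section search: keep the interior point that brackets the smaller value.
    f   = @(x) (x - 2).^2 + 1;       % toy unimodal function, minimum at x = 2
    a   = 0;  b = 5;  tol = 1e-5;
    phi = (sqrt(5) - 1)/2;           % golden-ratio reduction factor, ~0.618
    x1  = b - phi*(b - a);  x2 = a + phi*(b - a);
    while (b - a) > tol
        if f(x1) < f(x2)             % minimum lies in [a, x2]
            b = x2;  x2 = x1;  x1 = b - phi*(b - a);
        else                         % minimum lies in [x1, b]
            a = x1;  x1 = x2;  x2 = a + phi*(b - a);
        end
    end
    xmin = (a + b)/2                 % ≈ 2

A more careful implementation would reuse the function value at the retained interior point instead of re-evaluating both points on every pass; the bracketing logic is the same.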

21 Optimization Methods - Constrained
Constrained optimization:
Indirect approach: transform into an unconstrained problem, e.g., exterior penalty function (EPF) and augmented Lagrange multiplier methods.
Direct methods: sequential linear programming (SLP), sequential quadratic programming (SQP, sketched below), the generalized reduced gradient method (GRG), and steepest descent gradient or LMS.
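
One of the direct methods above, SQP, is available in MATLAB through fmincon. The sketch below minimizes a made-up objective under a made-up nonlinear inequality constraint, selecting the algorithm with optimset as in the slides' era (newer releases use optimoptions instead).

    % Constrained minimization with the SQP algorithm in fmincon.
    obj  = @(x) x(1)^2 + x(2)^2;               % illustrative objective
    nonl = @(x) deal(1 - x(1)*x(2), []);       % c(x) <= 0 enforces x1*x2 >= 1; no equalities
    x0   = [2; 2];
    opts = optimset('Algorithm','sqp','Display','off');
    [xopt, fopt] = fmincon(obj, x0, [], [], [], [], [], [], nonl, opts);
    fprintf('x = [%.3f %.3f], f = %.3f\n', xopt, fopt)   % expect x ≈ [1 1], f ≈ 2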

22 Advanced Optimization Methods
Global optimization with stochastic techniques:
Simulated annealing (SA): based on the minimum-energy principle of cooling a metal into a crystalline structure.
Genetic algorithm (GA): based on the survival-of-the-fittest principle from evolutionary theory.
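
Both stochastic methods are available in MATLAB's Global Optimization Toolbox (a separate product from the Optimization Toolbox). The sketch below runs each on a Rastrigin-type multimodal test function; the bounds and starting point are chosen only for illustration, and results vary from run to run.

    % Genetic algorithm and simulated annealing on a multimodal function.
    fun = @(x) 10*numel(x) + sum(x.^2 - 10*cos(2*pi*x));   % global minimum 0 at x = [0 0]
    lb  = [-5 -5];  ub = [5 5];
    [xga, fga] = ga(fun, 2, [], [], [], [], lb, ub);        % population-based search
    [xsa, fsa] = simulannealbnd(fun, [3 3], lb, ub);        % single-point annealing schedule
    fprintf('GA: f = %.3g at [%.2f %.2f]\n', fga, xga)
    fprintf('SA: f = %.3g at [%.2f %.2f]\n', fsa, xsa)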

23 Optimization Toolbox in MATLAB
Key features:
Interactive tools for defining and solving optimization problems and monitoring solution progress
Solvers for nonlinear and multiobjective optimization
Solvers for nonlinear least squares, data fitting, and nonlinear equations
Methods for solving quadratic and linear programming problems
Methods for solving binary integer programming problems
Parallel computing support in selected constrained nonlinear solvers

24 How to use the Optimization Toolbox
Optimization functions: objective functions can be provided directly as M-files.
Syntax: [x,fval] = fminsearch(fun,x0) (a short example follows this slide)
Optimization Tool graphical user interface (GUI):
Define and modify problems quickly
Use the correct syntax for optimization functions
Import and export from the MATLAB workspace
Generate code containing your configuration for a solver and options
Change parameters of an optimization during the execution of certain Global Optimization Toolbox functions
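
A minimal use of the fminsearch syntax above, with an anonymous function so no M-file is needed; the test function (Rosenbrock) and starting point are standard textbook choices, not taken from these slides.

    % Derivative-free Nelder-Mead simplex search via fminsearch.
    fun = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;   % Rosenbrock, minimum at [1 1]
    x0  = [-1.2, 1];
    [x, fval] = fminsearch(fun, x0)                    % displays x ≈ [1 1], fval ≈ 0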

25 Function Optimization
Optimization concerns the minimization or maximization of functions.
Standard optimization problem: minimize f(x) subject to equality constraints, inequality constraints, and side constraints (bounds on the design variables).
f(x) is the objective function, which measures and evaluates the performance of a system. In a standard problem we minimize the function; maximization is equivalent to minimizing the negative of the objective function.
x is a column vector of design variables, which can affect the performance of the system.

26 Function Optimization
Constraints: limitations on the design space; they can be linear or nonlinear, explicit or implicit functions.
Equality constraints: l_j(X) = 0.
Inequality constraints: g_j(X) ≤ 0; most algorithms require the "less than or equal to" form.
Side constraints: lower and upper bounds on the design variables.

27 Optimization Toolbox Solvers
Minimizers: this group of solvers attempts to find a local minimum of the objective function near a starting point x0. They address problems of unconstrained optimization, linear programming, quadratic programming, and general nonlinear programming.
Multiobjective minimizers: this group attempts either to minimize the maximum value of a set of functions (fminimax), or to find a location where a collection of functions is below some prespecified values (fgoalattain).
Equation solvers: this group attempts to find a solution to a scalar- or vector-valued nonlinear equation f(x) = 0 near a starting point x0. Equation solving can be considered a form of optimization because it is equivalent to finding the minimum norm of f(x) near x0.
Least-squares (curve-fitting) solvers: this group attempts to minimize a sum of squares. This type of problem frequently arises in fitting a model to data. The solvers address problems of finding nonnegative solutions, bounded or linearly constrained solutions, and fitting parameterized nonlinear models to data (see the sketch below).
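
To illustrate the least-squares (curve-fitting) solver group, here is a sketch that fits a two-parameter exponential model to synthetic noisy data with lsqcurvefit; the model form, data, and starting guess are assumptions for illustration.

    % Nonlinear data fitting: minimize the sum of squared residuals.
    xdata = linspace(0, 3, 20);
    ydata = 2.5*exp(-1.3*xdata) + 0.05*randn(size(xdata));  % synthetic noisy data
    model = @(p, xdata) p(1)*exp(p(2)*xdata);               % p = [amplitude, rate]
    p0    = [1, -1];                                        % starting guess
    [p, resnorm] = lsqcurvefit(model, p0, xdata, ydata);
    fprintf('amplitude = %.2f, rate = %.2f, residual norm = %.3g\n', p(1), p(2), resnorm)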

28 Objective Function
Linear
Quadratic
Sum-of-squares (least squares)
Smooth nonlinear
Nonsmooth

29 Constraint Type
None (unconstrained)
Bound
Linear (including bound)
General smooth
Discrete (integer)

30 Select Solvers by Objective and Constraint

31 Minimization Algorithm

32 Minimization Algorithm (Cont.)

33 Equation Solving Algorithms

34 Least-Squares Algorithms

35 Implementing the Optimization Toolbox
Most of these optimization routines require the definition of an M-file containing the function f to be minimized. Maximization is achieved by supplying the routines with −f.
Optimization options passed to the routines change the optimization parameters; default parameters can be changed through an options structure.

36 Unconstrained Minimization
Consider the problem of finding a set of values [x1 x2]^T that minimizes
f(x) = exp(x1)·(4·x1^2 + 2·x2^2 + 4·x1·x2 + 2·x2 + 1)
Steps: create an M-file that returns the function value (the objective function); call it objfun.m. Then invoke the unconstrained minimization routine fminunc.

37 Step 1 – Objective Function
objfun.m:
function f = objfun(x)
% objective function from slide 36
f = exp(x(1))*(4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + 1);

38 Step 2 – Invoke Routine
x0 = [-1,1];                              % starting guess
options = optimset('LargeScale','off');   % optimization parameter settings
[xmin,feval,exitflag,output] = fminunc('objfun',x0,options);   % output arguments on the left, input arguments on the right

39 Step 3 – Results
xmin = 0.5000 -1.0000    (minimum point of the design variables)
feval = 1.3028e-010      (objective function value at the minimum)
exitflag = 1             (exitflag indicates whether the algorithm converged; exitflag > 0 means a local minimum was found)
output = iterations: 7, funcCount: 40, stepsize: 1, firstorderopt: 8.1998e-004, algorithm: 'medium-scale: Quasi-Newton line search'    (additional information about the run)

40 More on fminunc – Input
[xmin,feval,exitflag,output,grad,hessian] = fminunc(fun,x0,options,P1,P2,...)
fun: the objective function (function handle or name).
x0: the initial guess; a vector whose size equals the number of design variables.
options: sets some of the optimization parameters (more in a few slides).
P1, P2, ...: additional parameters passed to the objective function.

41 More on fminunc – Output
[xmin,feval,exitflag,output,grad,hessian] = fminunc(fun,x0,options,P1,P2,...)
xmin: vector of the minimum (optimal) point; its size is the number of design variables.
feval: the objective function value at the optimal point.
exitflag: indicates whether the optimization routine terminated successfully (converged if > 0).
output: a structure giving more details about the optimization run.
grad: the gradient value at the optimal point.
hessian: the Hessian value at the optimal point.
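
Since fminunc can return grad and hessian, it can also consume a user-supplied gradient. The sketch below extends objfun.m from slide 37 with its analytic gradient and enables it with the optimset syntax of the slides' era ('GradObj'); newer releases use optimoptions with 'SpecifyObjectiveGradient' instead. The file name objfungrad.m is my own.

    % objfungrad.m: objective plus analytic gradient for fminunc.
    function [f, g] = objfungrad(x)
    e = exp(x(1));
    f = e*(4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + 1);
    if nargout > 1                        % gradient requested by the solver
        g = [ f + e*(8*x(1) + 4*x(2));    % df/dx1 (product rule)
              e*(4*x(2) + 4*x(1) + 2) ];  % df/dx2
    end

Invoked, for example, with:

    options = optimset('GradObj','on','LargeScale','off');
    [xmin, feval, exitflag, output, grad] = fminunc(@objfungrad, [-1,1], options);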

42 Next Class
Please bring your laptop and install MATLAB.


