
1 Optimization with MATLAB
A Hands-On Workshop
Farrukh Nagi
Universiti Teknologi MARA, Shah Alam Campus
25-26 June 2014

Introduction to Non-linear Optimization
PART I

3 PART 1 – INTRODUCTION TO OPTIMIZATION
Contents
1. Introduction to Optimization
2. Fundamentals of Optimization
   - Mathematical Background
3. Unconstrained Optimization Methods
   - Gradient Descent Methods (Steepest Descent)
   - Least Squares Methods
   - Simplex Methods
4. Constrained Optimization

4 PART 2 - MATLAB OPTIMIZATION
MATLAB/Simulink Optimization Methods
1. Function Optimization
2. Minimization Algorithms
   - Unconstrained Optimization – fminunc, fminsearch
   - Optimization Options Settings
   - Constrained Optimization
   - Multi-objective Optimization – lsqnonlin
3. Optimization Toolbox – >>optimtool %GUI
4. Environmental Science Optimization

5 PART 3 - SIMULINK OPTIMIZATION RESPONSE
/IEEE_optim/…
5. Parametric Modeling
6. Parameter Identification
7. Parameter Passing with Component Block Input
8. Simulink Optimization Design (SOD) – GUI
9. Optimizer Output
10. Simulink Examples List
REFERENCES

6 What is Optimization?
Optimization is an iterative process by which a desired solution (max/min) of the problem can be found while satisfying all of its constraints or bound conditions. (Figure 2: the optimum solution is found while satisfying the constraints; the derivative must be zero at the optimum.) An optimization problem can be linear or non-linear. Non-linear optimization is accomplished by numerical 'search methods', which are applied iteratively until a solution is reached. The search procedure is termed an algorithm.

7 Optimization Methods
One-Dimensional Unconstrained Optimization
- Golden-Section Search
- Quadratic Interpolation
- Newton's Method
Multi-Dimensional Unconstrained Optimization
- Non-gradient (direct) methods
- Gradient methods
Linear Programming (Constrained)
- Graphical Solution
- Simplex Method
Genetic Algorithm (GA) – survival-of-the-fittest principle based on evolutionary theory (\IEEE_OPTIM_2012\GA\GA Presentation.ppt)
Particle Swarm Optimization (PSO) – concept of the best solution in the neighborhood (\IEEE_OPTIM_2012\PSO\PSO Presentation.ppt)
Others …

8 Fundamentals of Non-Linear Optimization
The solution of a linear problem lies on the boundary of the feasible region (Figure 3: solution of a linear problem). A non-linear problem's solution may lie within or on the boundary of the feasible region (Figure 4: three-dimensional solution of a non-linear problem).

9 Fundamentals of Non-Linear Optimization
…Contd
Single objective function f(x): maximization or minimization. Example:
Maximize X1 + X2
Subject to: X1 + X2 ≤ …, X2 ≤ 50, X1 ≥ 50, X2 ≥ 25, X1 ≥ 0, X2 ≥ 0
Design variables: xi, i = 1, 2, 3, …
Constraints: inequality and equality.
(Figure 5: example of design variables and constraints used in non-linear optimization.)
Optimal points:
- Local minimum/maximum: a solution x* is a local optimum if no other x in its neighborhood gives a smaller (larger) function value than x*.
- Global minimum/maximum: a solution x** is a global optimum if no other x in the entire search space gives a smaller (larger) function value than x**.

10 Fundamentals of Non-Linear Optimization
…Contd
Figure 6: global versus local optimization. Figure 7: the local optimum equals the global optimum if the function is convex.
A set S is convex if the line segment joining any two points in the set is also in the set. (Figure panels: examples labeled convex / not convex.)

11 Fundamentals of Non-Linear Optimization
…Contd
A function f is convex if f(Xa) lies below the corresponding point on the chord joining f(X1) and f(X2). Convexity condition: the Hessian (second-order derivative) matrix of f must be positive semidefinite (eigenvalues positive or zero). (Figure 8: convex and nonconvex sets. Figure 9: convex function.)
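In symbols, this chord condition reads:
\[
f\big(\lambda X_1 + (1-\lambda) X_2\big) \;\le\; \lambda f(X_1) + (1-\lambda) f(X_2)
\qquad \text{for all } \lambda \in [0,1].
\]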

12 Optimality Conditions
First-order condition (FOC): the gradient of f vanishes at the candidate point, ∇f(X*) = 0. The Hessian is the matrix of second derivatives of f with respect to its several variables. Second-order condition (SOC): the eigenvalues of H(X*) are all positive, or equivalently the determinants of all lower-order (leading principal) minors of H(X*) are positive.
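A small check of both conditions on the quadratic f(x) = 0.5*x1^2 + 2.5*x2^2 used later in this deck (a sketch assuming the Symbolic Math Toolbox is available):

% Verify FOC and SOC for f = 0.5*x1^2 + 2.5*x2^2 at X* = (0,0)
syms x1 x2
f = 0.5*x1^2 + 2.5*x2^2;
g = gradient(f, [x1 x2]);   % gradient, for the FOC
H = hessian(f, [x1 x2]);    % Hessian, for the SOC
subs(g, [x1 x2], [0 0])     % returns [0; 0]: FOC satisfied
eig(double(H))              % returns [1; 5], all positive: SOC satisfied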

13 Optimization Methods …Constrained
a) Indirect approach – transform the constrained problem into an unconstrained one.
b) Exterior Penalty Function (EPF) and Augmented Lagrange Multiplier methods.
c) Direct methods – Sequential Linear Programming (SLP), Sequential Quadratic Programming (SQP), and the Generalized Reduced Gradient (GRG) method.
(Figure 10: gradient descent / LMS.)

14 Steepest descent method
(Adapted from Math 685/CSI 700, Spring 08, George Mason University, Department of Mathematical Sciences.)
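A minimal MATLAB sketch of the steepest descent iteration x(k+1) = x(k) - alpha*grad f(x(k)), applied here to the quadratic f(x) = 0.5*x1^2 + 2.5*x2^2 used in the fminunc demo on slide 17; the fixed step size and the stopping tolerance are illustrative assumptions:

% Steepest descent on f(x) = 0.5*x1^2 + 2.5*x2^2 (gradient: [x1; 5*x2])
f     = @(x) 0.5*x(1)^2 + 2.5*x(2)^2;
gradf = @(x) [x(1); 5*x(2)];
x     = [5; 1];      % starting point, as in the fminunc demo
alpha = 0.18;        % fixed step size (assumed; must be small enough to converge)
for k = 1:100
    g = gradf(x);
    if norm(g) < 1e-6, break; end   % stop when the gradient is nearly zero
    x = x - alpha*g;                % step along the steepest-descent direction
end
fprintf('x* = (%.4g, %.4g), f(x*) = %.3g after %d iterations\n', x(1), x(2), f(x), k)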

15 Example

16 Example (cont.)

17 MATLAB steepest descent – fminunc
%startpresentgrad.m
clc; clf; hold off
x0 = [5,1];
options = optimset('LargeScale','off','Display','iter-detailed'); %for iteration display
options = optimset(options,'GradObj','on');                       %for enabling the gradient object
options = optimset(options,'OutputFcn',@outfun);                  %for graphic display of the iterates
ww1 = -6:0.05:6; ww2 = ww1;
[w1,w2] = meshgrid(ww1,ww2);
J = -1*(0.5*w1.^2+2.5*w2.^2);
cs = contour(w1,w2,J,20); hold on; grid
%cs = surf(w1,w2,J); hold on; grid
[x,fval] = fminunc(@myfuncon2,x0,options)

%myfuncon2.m - objective with user-supplied gradient
function [f,g] = myfuncon2(x)
f = 0.5*x(1)^2+2.5*x(2)^2;
if nargout > 1
    g(1) = x(1);   %gradient 1 supplied
    g(2) = 5*x(2); %gradient 2 supplied
end

%outfun.m - output function: marks each iterate on the contour plot
function stop = outfun(x, optimValues, state)
stop = false;
plot(x(1),x(2),'b+'); drawnow

18 Non-Linear least squares

19 Non-Linear least squares
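A standard statement of the nonlinear least-squares problem covered here (generic notation, not necessarily the slides' exact formulation):
\[
\min_{x} \; \tfrac{1}{2}\,\|r(x)\|_2^2 \;=\; \tfrac{1}{2} \sum_{i=1}^{m} r_i(x)^2,
\]
where r_i(x) is the residual between the model and the i-th data point. The Gauss-Newton step \(\Delta x\) solves
\[
J^{\top} J \, \Delta x = -J^{\top} r, \qquad J_{ij} = \partial r_i / \partial x_j .
\]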

20 Simplex Methods Minimize

21 Derivative-free optimization
Downhill simplex method
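fminsearch implements the downhill simplex (Nelder-Mead) method. A minimal usage sketch on the same quadratic as above (objective and start point chosen for illustration):

% Derivative-free minimization via the Nelder-Mead downhill simplex
f  = @(x) 0.5*x(1)^2 + 2.5*x(2)^2;    % objective: no gradient required
x0 = [5, 1];                          % starting guess
options = optimset('Display','iter'); % show each simplex iteration
[xmin, fval] = fminsearch(f, x0, options)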

22 Example: constrained optimization

23 Example (cont.)

24 Example (cont.)

25 Environmental Sciences Optimization Case Studies
1. Fish Harvesting – …/Utim/fishharvester/
2. River Pollution – …/Utim/WWTP_RivPol/
3. Noise Pollution – …/Utim/machine_noise/
Data Fitting:
1. Hydrology – …/Utim/hydrology/
2. Anthropometric – …/Utim/anthropometry/

26 MATLAB Optimization PART II

27 MATLAB/SIMULINK OPTIMIZATION METHODS
(Diagram: MATLAB side: Scripts/M-Files, custom code, >>Command Window, >>optimtool GUI. Simulink side: Model.mdl, model block ports, block update @functions. Both routes lead to Simulink Design Optimization.)

28 Function Optimization
Optimization concerns the minimization or maximization of functions. The standard optimization problem minimizes an objective subject to equality constraints, inequality constraints, and side constraints, where f(x) is the objective function, which measures and evaluates the performance of a system, and x is a column vector of design variables, which affect the performance of the system. In a standard problem we minimize the function; maximization is equivalent to minimizing the negative of the objective function.
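In symbols, this standard form is:
\[
\begin{aligned}
\min_{x} \quad & f(x) \\
\text{subject to} \quad & h_j(x) = 0, \; j = 1,\dots,p && \text{(equality constraints)} \\
& g_i(x) \le 0, \; i = 1,\dots,m && \text{(inequality constraints)} \\
& x_L \le x \le x_U && \text{(side constraints)}
\end{aligned}
\]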

29 Function Optimization (Cont.)
Constraints are limitations on the design space; they can be linear or nonlinear, explicit or implicit functions:
- Equality constraints
- Inequality constraints (most algorithms require the less-than form!)
- Side constraints

30 Optimization Toolbox
The Optimization Toolbox is a collection of functions that extend the capability of MATLAB. The toolbox includes routines for:
- Unconstrained optimization
- Constrained nonlinear optimization, including goal attainment, minimax, and semi-infinite minimization problems
- Quadratic and linear programming
- Nonlinear least squares and curve fitting
- Solving nonlinear systems of equations
- Constrained linear least squares
- Specialized algorithms for large-scale problems

31 Unconstrained Minimization
Consider the problem of finding the vector [x1 x2]T that minimizes f(x) = exp(x1)*(4*x1^2 + 2*x2^2 + 4*x1*x2 + 2*x2 + 1), the objective coded in objfun.m below. Steps:
1. Create an M-file that returns the function value (the objective function); call it objfun.m.
2. Invoke the unconstrained minimization routine fminunc.

32 Step 1 – Objective Function
%objfun.m
function f = objfun(x)
f = exp(x(1))*(4*x(1)^2+2*x(2)^2+4*x(1)*x(2)+2*x(2)+1);   % objective function

33 Step 2 – Invoke Routine
Starting with a guess:
x0 = [-1,1];
options = optimset('LargeScale','off');                      % optimization parameter settings
[xmin,feval,exitflag,output] = fminunc('objfun',x0,options); % output arguments on the left, input arguments on the right

34 Results
xmin = …
feval = …e-010
exitflag = 1
output =
    iterations: 7
    funcCount: 40
    stepsize: 1
    firstorderopt: …e-004
    algorithm: 'medium-scale: Quasi-Newton line search'
xmin is the minimum point of the design variables and feval is the objective function value at that point. The exit flag tells whether the algorithm converged: if exitflag > 0, a local minimum was found. The output structure gives further information.

35 More on fminunc – Input
[xmin,feval,exitflag,output,grad,hessian] = fminunc(fun,x0,options,P1,P2,…)
fun: the objective function to minimize.
x0: the initial guess, a vector whose length equals the number of design variables.
options: sets some of the optimization parameters (more in a few slides).
P1,P2,…: additional parameters to pass through to the objective function.

36 More on fminunc – Output
[xmin,feval,exitflag,output,grad,hessian] = fminunc(fun,x0,options,P1,P2,…)
xmin: vector of the minimum (optimal) point; its length equals the number of design variables.
feval: the objective function value at the optimal point.
exitflag: a value showing whether the optimization routine terminated successfully (converged if > 0).
output: a structure giving more details about the optimization run.
grad: the gradient value at the optimal point.
hessian: the Hessian value at the optimal point.

37 Options Setting – optimset
optimset('param1',value1,'param2',value2,…)
The routines in the Optimization Toolbox have a set of default optimization parameters. However, the toolbox allows you to alter some of them, for example the tolerances, the step size, whether gradient or Hessian values are supplied, the maximum number of iterations, etc. There is also a list of features available, for example displaying the values at each iteration, checking user-supplied gradients or Hessians, etc. You can also choose the algorithm you wish to use.

38 Options Setting (Cont.)
optimset('param1',value1,'param2',value2,…)
Type help optimset in the command window and a list of the available option settings will be displayed. How to read it? For example:
LargeScale - Use large-scale algorithm if possible [ {on} | off ]
LargeScale is the parameter (param1), on and off are its possible values (value1), and the default is the value in braces { }.

39 Options Setting (Cont.)
optimset('param1',value1,'param2',value2,…)
LargeScale - Use large-scale algorithm if possible [ {on} | off ]
Since the default is on, to turn it off we just type:
options = optimset('LargeScale','off')
and pass options to the input of fminunc.

40 Useful Option Settings
Highly recommended to use!!!
Display - Level of display [ off | iter | notify | final ]
MaxIter - Maximum number of iterations allowed [ positive integer ]
TolCon - Termination tolerance on the constraint violation [ positive scalar ]
TolFun - Termination tolerance on the function value [ positive scalar ]
TolX - Termination tolerance on X [ positive scalar ]
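For example, the recommended settings can be combined in one call (the values here are illustrative, not from the slides):

options = optimset('Display','iter', ... % print progress at each iteration
    'MaxIter',200, ...                   % cap the number of iterations
    'TolFun',1e-8, ...                   % function-value tolerance
    'TolX',1e-8);                        % tolerance on the step in X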

41 fminunc and fminsearch
fminunc uses gradient (and optionally Hessian) information. It has two modes: Large-Scale, an interior-reflective Newton method, and Medium-Scale, quasi-Newton (BFGS). It is not preferred for solving highly discontinuous functions, and it may only give local solutions.
fminsearch is generally less efficient than fminunc for problems of order greater than two. However, when the problem is highly discontinuous, fminsearch may be more robust: it is a direct search method that does not use numerical or analytic gradients as fminunc does. It may also only give local solutions.

42 Constrained Minimization
[xmin,feval,exitflag,output,lambda,grad,hessian] = fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)
lambda: vector of Lagrange multipliers at the optimal point.

43 Example
function f = myfun(x)
f = -x(1)*x(2)*x(3);
Subject to: (constraints given on the next slides)

44 Example (Cont.)
For the nonlinear constraint, create a function nonlcon which returns the 2 constraint vectors [C,Ceq]:
function [C,Ceq] = nonlcon(x)
C = 2*x(1)^2 + x(2);   % nonlinear inequality, C(x) <= 0
Ceq = [];              % remember to return a null matrix if the constraint does not apply

45 Example (Cont.)
Initial guess (3 design variables):
x0 = [10;10;10];
A = [ ;1 2 2];
B = [0 72]';
LB = [0 0 0]';
UB = [ ]';
CAREFUL!!! Keep the argument sequence straight:
fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)

46 Example (Cont.)
Warning: Large-scale (trust region) method does not currently solve this type of problem, switching to medium-scale (line search).
> In D:\Programs\MATLAB6p1\toolbox\optim\fmincon.m at line 213
  In D:\usr\CHINTANG\OptToolbox\min_con.m at line 6
Optimization terminated successfully:
Magnitude of directional derivative in search direction less than 2*options.TolFun and maximum constraint violation is less than options.TolCon
Active Constraints: 2 9
x = 0.00 16.231
feval = e-025
The constraints are numbered 1-9 in the sequence A, B, Aeq, Beq, LB, UB, C, Ceq; here constraints 2 and 9 are active.

47 >> optimtool - % GUI

48 Constrained Optimization
An optimization algorithm is large scale when it uses linear algebra that does not need to store, nor operate on, full matrices; in contrast, medium-scale methods internally create full matrices and use dense linear algebra. The definition of a constrained optimum is based on the Karush-Kuhn-Tucker (KKT) conditions. The KKT conditions are analogous to the condition that the gradient must be zero at a minimum, modified to take constraints into account; the difference is that the KKT conditions hold for constrained problems. The KKT conditions use the auxiliary Lagrangian function, shown below.
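A standard statement (notation as in the MATLAB documentation this passage follows):
\[
L(x,\lambda) = f(x) + \sum_{i} \lambda_{g,i}\, g_i(x) + \sum_{j} \lambda_{h,j}\, h_j(x),
\]
and the KKT conditions at a constrained minimum x* require
\[
\nabla_x L(x^*,\lambda^*) = 0, \qquad
\lambda_{g,i}^* \, g_i(x^*) = 0 \ \text{for all } i, \qquad
g_i(x^*) \le 0, \quad h_j(x^*) = 0, \quad \lambda_{g,i}^* \ge 0.
\]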

49 fmincon Algorithms
fmincon has four algorithm options: a) interior-point, b) active-set, c) SQP, d) trust-region-reflective.
a) An interior-point method is a linear or nonlinear programming method (Forsgren et al. 2002) that achieves optimization by going through the middle of the solid defined by the problem rather than around its surface.
b) Active-set approach: equality constraints always remain in the active set Sk; the search direction dk is calculated so that it minimizes the objective function while remaining on the active constraint boundaries.
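With the optimset interface used throughout this deck, the algorithm is selected by name and the resulting options are passed to fmincon as on slide 42:

options = optimset('Algorithm','interior-point'); % or 'active-set', 'sqp', 'trust-region-reflective'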

50 Sequential Quadratic Programming and Trust Region
c) Sequential quadratic programming (SQP) is an iterative method for nonlinear optimization. SQP methods are used on problems for which the objective function and the constraints are twice continuously differentiable. If the problem is unconstrained, the method reduces to Newton's method for finding a point where the gradient of the objective vanishes; if the problem has only equality constraints, the method is equivalent to applying Newton's method to the first-order optimality (Karush-Kuhn-Tucker) conditions of the problem.
d) Trust-region-reflective: the basic idea is to approximate f with a simpler function q that reasonably reflects the behavior of f in a neighborhood N around the point x; this neighborhood is the trust region. A trial step s is computed by minimizing (or approximately minimizing) q over N; this is the trust-region subproblem, stated below. The current point is updated to x + s if f(x + s) < f(x); otherwise the current point remains unchanged and N, the region of trust, is shrunk and the trial-step computation is repeated.
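In its standard form, the trust-region subproblem is
\[
\min_{s} \; q(s) = f(x) + \nabla f(x)^{\top} s + \tfrac{1}{2}\, s^{\top} H s
\quad \text{subject to} \quad \|Ds\| \le \Delta,
\]
where H is the (approximate) Hessian, D a scaling matrix, and \(\Delta\) the trust-region radius.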

51 lsqnonlin in MATLAB – Multi-Objective Curve Fitting
%recfit.m
clc; clear;
global data;
data = [ ];                  % experimental data: 1st column x, 2nd column R
x = data(:,1);
Rexp = data(:,2);
plot(x,Rexp,'ro'); hold on   % plot the experimental data
b0 = [ ];                    % start values for the parameters
b = lsqnonlin('recfun',b0)   % run lsqnonlin from start value b0; fitted parameter values returned in b
Rcal = 1./(1+exp(1.0986/b(1)*(x-b(2))));   % calculate the fitted values with parameters b
plot(x,Rcal,'b');            % plot the fitted curve on the same graph

%recfun.m
function y = recfun(b)
global data;
x = data(:,1);
Rexp = data(:,2);
Rcal = 1./(1+exp(1.0986/b(1)*(x-b(2))));   % values calculated from the model
%y = sum((Rcal-Rexp).^2);                  % (explicit sum of squares; not needed, lsqnonlin squares and sums the residuals itself)
y = Rcal - Rexp;                           % residuals: difference between calculated and experimental values

Find b1 and b2:
>> recfit
>> b =

52 SIMULINK OPTIMIZATION RESPONSE
PART III

53 SIMULINK & OPTIMIZATION DESIGN
(Diagram: the Simulink model supplies a block whose output serves as the objective function for the optimizer, e.g. PSO or GA.)

54 SIMULINK MODEL FILE – Ports input/output
/Utim_Optim/PSO/Live_fn_sim.mdl
/Utim_Optim/PSO/PSO.m
>> PSO  % start program; at line 40: current_fitness(i) = Live_fn(current_position(:,i));
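For orientation, a hypothetical sketch of how such a model can be wrapped as a fitness function like Live_fn above; only the model name comes from the slide, while the parameter names, the logged output yout, and the objective definition are assumptions for illustration:

%Live_fn.m (hypothetical wrapper around the Simulink model)
function J = Live_fn(p)
assignin('base','p1',p(1));                  % pass candidate parameters to the
assignin('base','p2',p(2));                  % base workspace read by the model
simOut = sim('Live_fn_sim','StopTime','10'); % simulate the model
y = simOut.get('yout');                      % requires the model to log yout
J = y(end);                                  % fitness: final logged output value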

55 Genetic Algorithm – Objective Function
In-port and out-port parameters are optimized.
/Utim_Optim/GA/Live_fn_simGA.mdl

56 SIMULINK OPTIMIZATION EXAMPLES
A. PARTICLE SWARM OPTIMIZATION
1. Particle Swarm Optimization function (\Utim\PSO\Live_fn_sim.mdl)
B. GENETIC ALGORITHM
2. GA function optimization (\Utim\GA\Live_fn_simGA.mdl)
C. FISH HARVESTING
3. Simulink Optimization Design (\Utim\fishharvester\fishpond2.mdl)

57 4. DC Motor Parameter Optimization with GA
(IEEE_OPTIM_2012\Diodes_amps\lead_screw_model_opt_Motor_ga.mdl)
C. TRADITIONAL OPTIMIZATION METHODS
5. …

58 8. Buck Boost Converter Optimization
(IEEE_OPTIM_2012\Buck_Boost\Buck_PWM_OPT12_Fs_Optim.mdl)
(IEEE_OPTIM_2012\Buck_Boost\Buck_PWM_OPT12_Ti_L.mdl)
9. Operational Amplifier Optimization (IEEE_OPTIM_2012\Diodes_amps\Op_amp_opt.mdl)
10. Diode Voltage Doubler (IEEE_OPTIM_2012\Diodes_amps\diode_2.mdl)
D. REAL DATA PARAMETERIZING
11. VCB Motor Parameter Optimization (IEEE_OPTIM_2012\lead screw\vcb_motor_opt_Motor_trk.mdl)
12. Parameterizing Gas Turbine Fuel Positioner (IEEE_OPTIM_2012\Gas_bio_Turbine\GT_Biofuel_present_b.mdl)

59 REFERENCES
1. Optimization Toolbox for Use with MATLAB, User's Guide, The MathWorks Inc.
2. Applied Optimization with MATLAB Programming, P. Venkataraman, Wiley Interscience.
3. Optimization for Engineering Design, Kalyanmoy Deb, Prentice Hall.
4. Convex Optimization, Stephen Boyd and Lieven Vandenberghe, Cambridge University Press.
5. Numerical Recipes in C (or C++): The Art of Scientific Computing, W. H. Press, B. P. Flannery, S. A. Teukolsky, and W. T. Vetterling, Cambridge University Press, 1992.

