

1 Optimization methods Review. Mateusz Sztangret, Faculty of Metal Engineering and Industrial Computer Science, Department of Applied Computer Science and Modelling, Krakow.

2 Outline of the presentation
Basic concepts of optimization
Review of optimization methods:
– gradientless methods,
– gradient methods,
– linear programming methods,
– non-deterministic methods
Characteristics of selected methods:
– method of steepest descent,
– genetic algorithm

3 Basic concepts of optimization
"Man's longing for perfection finds expression in the theory of optimization. It studies how to describe and attain what is Best, once one knows how to measure and alter what is Good and Bad… Optimization theory encompasses the quantitative study of optima and methods for finding them."
— Beightler, Phillips, Wilde, Foundations of Optimization

4 Basic concepts of optimization
Optimization /optimum/ – the process of finding the best solution. Usually the aim of optimization is to find a better solution than the one previously attained.

5 Basic concepts of optimization
Specification of the optimization problem:
– definition of the objective function,
– selection of optimization variables,
– identification of constraints.

6 Mathematical definition
min f(x), x ∈ R^n, subject to g_i(x) = 0, i ∈ E, and h_i(x) ≥ 0, i ∈ I
where: x is the vector of variables, also called unknowns or parameters; f is the objective function, a (scalar) function of x that we want to maximize or minimize; g_i and h_i are constraint functions, which are scalar functions of x that define certain equations and inequalities that the unknown vector x must satisfy.

7 Set of allowed solutions
Constraint functions define the set of allowed solutions X_d ⊆ X, that is, the set of points which we consider in the optimization process.

8 Obtained solution
A solution x* is called a global minimum if f(x*) ≤ f(x) for all x ∈ X_d. A solution x* is called a local minimum if there is a neighbourhood N of x* such that f(x*) ≤ f(x) for all x ∈ N. Neither the global minimum nor a local minimum is ever found exactly, due to the limited accuracy of numerical methods and round-off error.

9 Local and global solutions
(figure: plot of f(x) marking a local minimum and the global minimum)

10 Problems with multimodal objective function
(figure: plot of a multimodal f(x) with a marked start point)

11 Discontinuous objective function
(figure: plot of a discontinuous f(x))

12 Minimum or maximum
(figure: plots of f(x) and –f(x) sharing the same optimizer x*; maximizing f is equivalent to minimizing –f)

13 General optimization flowchart
Start: set starting point x(0), i = 0.
Calculate f(x(i)).
If the stop condition is satisfied: Stop.
Otherwise: x(i+1) = x(i) + Δx(i), i = i + 1, and repeat.
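This loop can be written down compactly; below is a minimal Python sketch, where the step rule (producing Δx) and the stop test are hypothetical placeholders to be supplied by a concrete method:

```python
def optimize(f, x0, step, stop, maxit=1000):
    """Generic iterative improvement loop from the flowchart.

    f    -- objective function
    x0   -- starting point x(0)
    step -- rule producing the update dx(i) for the current point (placeholder)
    stop -- predicate deciding when to terminate (placeholder)
    """
    x = x0
    for i in range(maxit):
        fx = f(x)                 # calculate f(x(i))
        if stop(x, fx, i):        # stop condition
            break
        x = x + step(x)           # x(i+1) = x(i) + dx(i)
    return x
```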

14 Stop conditions
Commonly used stop conditions are as follows:
– a sufficiently good solution has been obtained,
– lack of progress,
– the maximum number of iterations has been reached.

15 Classification of optimization methods
Optimization methods can be classified by:
The type of problem solved:
– linear programming,
– nonlinear optimization.
Constraints:
– optimization with constraints,
– optimization without constraints.
Size of the problem:
– one-dimensional methods,
– multidimensional methods.
The optimization criteria:
– one-objective methods,
– multiobjective methods.

16 Optimization methods
There are several types of optimization algorithms:
gradientless methods:
– line search methods,
– multidimensional methods,
gradient methods,
linear programming methods,
non-deterministic methods.

17 Gradientless methods
Line search methods:
– Expansion method
– Golden ratio method
– Fibonacci method
– Method based on Lagrange interpolation
Multidimensional methods:
– Hooke-Jeeves method
– Rosenbrock method
– Nelder-Mead simplex method
– Powell method
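As an illustration of the golden ratio method listed above, here is a minimal Python sketch of golden-section line search, assuming f is unimodal on [a, b]:

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Golden-section search for the minimum of a unimodal f on [a, b]."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0   # 1/phi, about 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):          # the minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                    # the minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2.0

# example: the minimum of (x - 2)^2 on [0, 5] is x = 2
print(golden_section(lambda x: (x - 2.0) ** 2, 0.0, 5.0))
```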

18 Features of gradientless methods
Advantages:
– simplicity,
– they do not require computing derivatives of the objective function.
Disadvantages:
– they find the first minimum encountered,
– they demand unimodality and continuity of the objective function.

19 Gradient methods
Method of steepest descent
Conjugate gradients method
Newton method
Davidon-Fletcher-Powell method
Broyden-Fletcher-Goldfarb-Shanno method

20 Features of gradient methods
Advantages:
– simplicity,
– greater efficiency in comparison with gradientless methods.
Disadvantages:
– they find the first minimum encountered,
– they demand unimodality, continuity and differentiability of the objective function.

21 Linear programming
If both the objective function and the constraints are linear we can use one of the linear programming methods:
– Graphical method
– Simplex method
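For instance, such problems can be handed to SciPy's linear programming solver; the problem data below is purely illustrative:

```python
from scipy.optimize import linprog

# illustrative problem: minimize -x1 - 2*x2
# subject to x1 + x2 <= 4, x1 + 3*x2 <= 6, x1 >= 0, x2 >= 0
c = [-1.0, -2.0]                   # objective coefficients
A_ub = [[1.0, 1.0], [1.0, 3.0]]    # inequality constraint matrix
b_ub = [4.0, 6.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print(res.x, res.fun)              # optimum at x = [3, 1], f = -5
```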

22 Non-deterministic methods
Monte Carlo method
Genetic algorithms
Evolutionary algorithms:
– strategy (1 + 1)
– strategy (μ + λ)
– strategy (μ, λ)
Particle swarm optimization
Simulated annealing method
Ant colony optimization
Artificial immune system

23 Features of non-deterministic methods
Advantages:
– they can handle an objective function of any nature,
– they do not require computing derivatives of the objective function.
Disadvantages:
– a high number of objective function calls.

24 Optimization with constraints
Ways of incorporating constraints:
– External penalty function method
– Internal penalty function method
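A minimal sketch of the external penalty function method, assuming inequality constraints written as g(x) <= 0 and using SciPy's general-purpose minimizer for the inner unconstrained problems (the parameter values are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

def external_penalty(f, g_list, x0, r0=1.0, growth=10.0, iters=6):
    """External penalty method: approach the feasible set from outside.

    g_list -- inequality constraints, each feasible when g(x) <= 0.
    """
    x, r = np.asarray(x0, dtype=float), r0
    for _ in range(iters):
        def penalized(u, r=r):
            # quadratic penalty, charged only where constraints are violated
            violation = sum(max(0.0, g(u)) ** 2 for g in g_list)
            return f(u) + r * violation
        x = minimize(penalized, x).x   # unconstrained inner minimization
        r *= growth                    # tighten the penalty each outer pass
    return x

# example: minimize (x - 3)^2 subject to x <= 1, i.e. g(x) = x - 1 <= 0
print(external_penalty(lambda u: (u[0] - 3.0) ** 2,
                       [lambda u: u[0] - 1.0], x0=[0.0]))
```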

25 Multicriteria optimization
In some cases the problem being solved is defined by several objective functions. Usually, when we improve one of them, the others get worse. Two common approaches:
– weighted criteria method,
– ideal point method.

26 Weighted criteria method
The method transforms the multicriteria problem into a single-criterion problem by combining the particular objective functions into a weighted sum.
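In symbols (a standard formulation; the weights w_i are chosen by the user, are non-negative, and typically sum to one):

```latex
F(x) = \sum_{i=1}^{k} w_i f_i(x), \qquad w_i \ge 0, \qquad \sum_{i=1}^{k} w_i = 1
```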

27 Ideal point method
In this method we choose an ideal point, which lies outside the set of allowed solutions, and then search for the solution inside the set of allowed solutions which is closest to the ideal point. The distance can be measured using various metrics.
(figure: the ideal point and the set of allowed solutions)
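A common way of writing the method, assuming an L^p metric, with z denoting the ideal point in objective space:

```latex
x^{*} = \operatorname*{arg\,min}_{x \in X_d}
\left( \sum_{i=1}^{k} \bigl| f_i(x) - z_i \bigr|^{p} \right)^{1/p},
\qquad p \ge 1
```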

28 Method of steepest descent
The algorithm consists of the following steps:
1. Set the input data:
– u0 – starting point,
– maxit – maximum number of iterations,
– e – required accuracy of the solution,
– i = 0 – iteration number.
2. Compute the gradient at u_i.

29 Method of steepest descent
3. Choose the search direction (the negative of the gradient).
4. Find the optimal solution along the chosen direction (using any line search method).
5. If the stop conditions are not satisfied, increase i and go to step 2.
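A minimal Python sketch of steps 1–5, with the gradient approximated by central differences and a simple grid scan standing in for the line search of step 4 (any line search method, e.g. the golden-section search from slide 17, could be used instead):

```python
import numpy as np

def num_grad(f, u, h=1e-6):
    """Central-difference approximation of the gradient of f at u."""
    g = np.zeros_like(u)
    for j in range(u.size):
        e = np.zeros_like(u)
        e[j] = h
        g[j] = (f(u + e) - f(u - e)) / (2.0 * h)
    return g

def steepest_descent(f, u0, e=1e-6, maxit=100):
    u = np.asarray(u0, dtype=float)          # step 1: starting point, i = 0
    for i in range(maxit):
        grad = num_grad(f, u)                # step 2: gradient at u_i
        if np.linalg.norm(grad) < e:         # stop condition
            break
        d = -grad                            # step 3: search direction
        # step 4: line search along d (coarse grid scan over alpha)
        alpha = min(np.linspace(0.0, 1.0, 1001), key=lambda a: f(u + a * d))
        u = u + alpha * d                    # step 5: next point, increase i
    return u
```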

30 Zigzag effect
Let's consider the problem of finding the minimum of the function f(u) = u1^2 + u2^2 with starting point u0 = [-2, 3].
(figure: isolines of f and the zigzag path of the method)

31 Genetic algorithm
The algorithm consists of the following steps:
1. Creation of a baseline population.
2. Computation of the fitness of the whole population.
3. Selection.
4. Crossing.
5. Mutation.
6. If the stop conditions are not satisfied, go to step 2.

32 Creation of a baseline population
(table: randomly generated genotypes, i.e. bit strings, and their objective function values, f(x) = x^2)

33 Selection
(table: the baseline population and the parents' population selected from it)

34 Roulette wheel method
(figure: roulette wheel selection, with sectors sized according to fitness)

35 Crossing
(figure: two parent individuals, a randomly chosen crossing point, and the two descendant individuals obtained by exchanging the substrings after that point)

36 Mutation
(figure: parent individual, a bit string)

37–39 Mutation
(figures: for each gene a random number r is drawn and compared with the mutation probability pm; when r < pm the gene is flipped, when r > pm it is left unchanged)

40 Mutation
(figure: parent individual and the resulting descendant individual)

41 Genetic algorithm
After mutation, the resulting individuals are recorded in the descendant population, which becomes the baseline population for the next iteration of the algorithm. If the obtained solution satisfies the stop condition, the procedure is terminated. Otherwise selection, crossing and mutation are repeated.
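Putting the steps together, a compact Python sketch of the whole loop; the bit-string length, population size and probabilities are hypothetical, and f(x) = x^2 from slide 32 is used as the objective:

```python
import random

BITS, POP, PC, PM, GENS = 5, 6, 0.9, 0.05, 30   # hypothetical parameters

def fitness(ind):                 # objective from slide 32: f(x) = x^2
    return int("".join(map(str, ind)), 2) ** 2

def select(pop):                  # roulette wheel selection (slide 34)
    weights = [fitness(i) + 1 for i in pop]     # +1 avoids an all-zero wheel
    return random.choices(pop, weights=weights, k=POP)

def cross(a, b):                  # single-point crossing (slide 35)
    if random.random() < PC:
        p = random.randrange(1, BITS)           # crossing point
        return a[:p] + b[p:], b[:p] + a[p:]
    return a[:], b[:]

def mutate(ind):                  # flip each gene when r < pm (slides 36-40)
    return [1 - g if random.random() < PM else g for g in ind]

pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
for _ in range(GENS):             # steps 2-6 repeated until the budget runs out
    parents = select(pop)
    pop = []
    for a, b in zip(parents[::2], parents[1::2]):
        c1, c2 = cross(a, b)
        pop += [mutate(c1), mutate(c2)]
print(max(pop, key=fitness))      # best individual found
```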

42 Thank you for your attention!

