Presentation on theme: "Optimization methods Review"— Presentation transcript:
1 Optimization methods Review Mateusz Sztangret, Faculty of Metal Engineering and Industrial Computer Science, Department of Applied Computer Science and Modelling, Krakow
2 Outline of the presentation
Basic concepts of optimization
Review of optimization methods: gradientless methods, gradient methods, linear programming methods, non-deterministic methods
Characteristics of selected methods: method of steepest descent, genetic algorithm
3 Basic concepts of optimization Man’s longing for perfection finds expression in the theory of optimization. It studies how to describe and attain what is Best, once one knows how to measure and alter what is Good and Bad… Optimization theory encompasses the quantitative study of optima and methods for finding them. Beightler, Phillips, Wilde Foundations of Optimization
4 Basic concepts of optimization
Optimization /optimum/ – the process of finding the best solution. Usually the aim of optimization is to find a better solution than the one previously obtained.
5 Basic concepts of optimization
Specification of the optimization problem:
definition of the objective function,
selection of optimization variables,
identification of constraints.
6 Mathematical definition
minimize f(x), x ∈ Rⁿ, subject to gᵢ(x) ≤ 0, i = 1,…,m, and hⱼ(x) = 0, j = 1,…,p
where:
x is the vector of variables, also called unknowns or parameters;
f is the objective function, a (scalar) function of x that we want to maximize or minimize;
gᵢ and hⱼ are constraint functions, scalar functions of x that define certain equations and inequalities that the unknown vector x must satisfy.
7 Set of allowed solutions
Constraint functions define the set of allowed solutions, that is, the set of points which we consider in the optimization process.
[Figure: search space X and the allowed subset Xd]
8 Obtained solution
A solution x* is called a global minimum if f(x*) ≤ f(x) for all x in the allowed set.
A solution x* is called a local minimum if there is a neighbourhood N of x* such that f(x*) ≤ f(x) for all x in N.
Neither the global nor a local minimum is ever exact, due to the limited accuracy of numerical methods and round-off errors.
9 Local and global solutions
[Figure: plot of f(x) showing a local minimum and the global minimum]
10 Problems with multimodal objective function
[Figure: plot of a multimodal f(x) with two starting points leading to different minima]
11 Discontinuous objective function
[Figure: plot of a discontinuous function f(x)]
13 General optimization flowchart
Start → set starting point x(0), i = 0 → calculate f(x(i)) → check the stop condition. If it is not satisfied: x(i+1) = x(i) + Δx(i), i = i + 1, and f(x(i)) is calculated again; if it is satisfied: Stop.
14 Commonly used stop conditions are as follows:
a sufficiently good solution has been obtained,
lack of progress,
the maximum number of iterations has been reached.
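The flowchart and stop conditions can be sketched as a generic loop (a Python sketch under assumed interfaces; `step` is a hypothetical rule producing the increment Δx(i), not something defined in the lecture):

```python
def optimize(f, step, x0, eps=1e-8, maxit=1000):
    """Generic iterative optimization loop from the flowchart.

    f    -- objective function
    step -- hypothetical rule producing the increment dx(i)
    x0   -- starting point
    """
    x = x0
    fx = f(x)
    for i in range(maxit):                # stop: maximum number of iterations
        x_new = x + step(f, x)            # x(i+1) = x(i) + dx(i)
        fx_new = f(x_new)
        if abs(fx - fx_new) < eps:        # stop: lack of progress
            return x_new
        x, fx = x_new, fx_new
    return x

# Example: minimize f(x) = (x - 3)^2 with a fixed damped-gradient step rule
result = optimize(lambda x: (x - 3) ** 2, lambda f, x: -0.8 * (x - 3), 0.0)
```

The "sufficiently good solution" condition would be a third test inside the loop, but it needs a problem-specific target value, so it is omitted here.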
15 Classification of optimization methods
Optimization methods can be classified by:
the type of problem solved: linear programming, nonlinear optimization;
constraints: optimization with constraints, optimization without constraints;
the size of the problem: one-dimensional methods, multidimensional methods;
the optimization criteria: one-objective methods, multiobjective methods.
16 Optimization methods
There are several types of optimization algorithms:
gradientless methods (line search methods, multidimensional methods),
gradient methods,
linear programming methods,
non-deterministic methods.
17 Gradientless methods
Line search methods: expansion method, golden ratio method, Fibonacci method, method based on Lagrange interpolation.
Multidimensional methods: Hooke-Jeeves method, Rosenbrock method, Nelder-Mead simplex method, Powell method.
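Of the line search methods listed, the golden ratio (golden section) method is easy to sketch (a Python sketch, not the lecture's own code; the bracket and the test function are made up for illustration):

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Minimize a unimodal function f on [a, b] by golden section search."""
    inv_phi = (math.sqrt(5) - 1) / 2      # 1/phi, about 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):                   # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                             # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# Example: minimum of a parabola with known minimizer x = 2
x_min = golden_section(lambda x: (x - 2) ** 2 + 1, 0.0, 5.0)
```

Each iteration shrinks the bracket by the constant factor 1/φ, which is why the method needs no derivatives, only unimodality.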
18 Features of gradientless methods
Advantages: simplicity; they do not require computing derivatives of the objective function.
Disadvantages: they stop at the first minimum found; they require unimodality and continuity of the objective function.
19 Gradient methods
Method of steepest descent, conjugate gradients method, Newton method, Davidon-Fletcher-Powell method, Broyden-Fletcher-Goldfarb-Shanno method.
20 Features of gradient methods
Advantages: simplicity; greater efficiency in comparison with gradientless methods.
Disadvantages: they stop at the first minimum found; they require unimodality, continuity and differentiability of the objective function.
21 Linear programming
If both the objective function and the constraints are linear we can use one of the linear programming methods:
graphical method,
simplex method.
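The graphical method rests on the fact that a linear program attains its optimum at a vertex of the feasible polygon; for two variables this can be sketched by enumerating constraint intersections (a Python sketch; the example problem is made up, not from the lecture):

```python
from itertools import combinations

# Constraints a*x + b*y <= c, stored as (a, b, c); x >= 0 and y >= 0 included.
cons = [(1, 1, 4), (1, 3, 6), (-1, 0, 0), (0, -1, 0)]

def obj(x, y):
    return 3 * x + 2 * y                   # objective to maximize

def feasible(x, y, eps=1e-9):
    return all(a * x + b * y <= c + eps for a, b, c in cons)

vertices = []
for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        continue                           # parallel lines, no intersection
    x = (c1 * b2 - c2 * b1) / det          # solve the 2x2 linear system
    y = (a1 * c2 - a2 * c1) / det
    if feasible(x, y):
        vertices.append((x, y))

best = max(vertices, key=lambda v: obj(*v))
```

This brute-force vertex enumeration only illustrates the geometric idea; the simplex method walks between vertices instead of listing them all.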
22 Non-deterministic methods
Monte Carlo method, genetic algorithms, evolutionary algorithms (strategy (1 + 1), strategy (μ + λ), strategy (μ, λ)), particle swarm optimization, simulated annealing method, ant colony optimization, artificial immune systems.
23 Features of non-deterministic methods
Advantages: the objective function can be of any nature; they do not require computing derivatives of the objective function.
Disadvantages: a high number of objective function calls.
24 Optimization with constraints
Ways of incorporating constraints:
external penalty function method,
internal penalty function method.
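The external penalty function method can be sketched as follows (a Python sketch; the quadratic penalty form, the coefficient schedule and the crude gradient-descent inner solver are assumed textbook choices, not the lecture's):

```python
def num_grad(f, x, h=1e-6):
    """Central-difference gradient approximation."""
    g = []
    for j in range(len(x)):
        xp, xm = list(x), list(x)
        xp[j] += h
        xm[j] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def penalty_method(f, g_list, x0, r=1.0, growth=10.0, outer=6):
    """External penalty: minimize f(x) + r * sum(max(0, g_i(x))^2),
    increasing the penalty coefficient r after each unconstrained solve."""
    x = list(x0)
    for _ in range(outer):
        P = lambda u: f(u) + r * sum(max(0.0, g(u)) ** 2 for g in g_list)
        lr = 0.1 / (1.0 + r)              # step size scaled to penalty stiffness
        for _ in range(2000):             # crude gradient-descent inner solver
            grad = num_grad(P, x)
            x = [xi - lr * gi for xi, gi in zip(x, grad)]
        r *= growth
    return x

# Example: minimize (x - 2)^2 subject to x <= 1, i.e. g(x) = x - 1 <= 0
sol = penalty_method(lambda u: (u[0] - 2) ** 2, [lambda u: u[0] - 1], [0.0])
```

Because the penalty is added outside the allowed set, the iterates approach the constrained optimum x = 1 from the infeasible side, which is the characteristic behaviour of the external variant.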
25 Multicriteria optimization
In some cases the problem is defined by several objective functions. Usually when we improve one, the others get worse. Common approaches:
weighted criteria method,
ideal point method.
26 Weighted criteria method
The method transforms a multicriteria problem into a single-criterion problem by combining the particular objective functions into a weighted sum.
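A minimal sketch of the idea (Python; the two conflicting objectives and the equal weights are made up for illustration):

```python
# Two conflicting objectives: f1 favours x near 0, f2 favours x near 4.
def f1(x):
    return x ** 2

def f2(x):
    return (x - 4) ** 2

def weighted_objective(x, w1=0.5, w2=0.5):
    """Scalarized objective: weighted sum of the individual criteria."""
    return w1 * f1(x) + w2 * f2(x)

# Crude grid scan of the scalarized problem (any single-criterion method works)
xs = [i / 1000 for i in range(-2000, 6001)]
best = min(xs, key=weighted_objective)
```

Changing the weights w1, w2 trades one criterion against the other and picks out different compromise solutions.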
27 Ideal point method
In this method we choose an ideal solution which lies outside the set of allowed solutions, and then search for the optimal solution inside the allowed set that is closest to the ideal point. The distance can be measured using various metrics.
[Figure: ideal point outside the set of allowed solutions]
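A minimal sketch of the idea (Python; the two objectives are made up, the ideal point is taken as the vector of individually optimal values, and the metric is an L^p norm with p as a parameter):

```python
def f1(x):
    return x ** 2

def f2(x):
    return (x - 4) ** 2

xs = [i / 1000 for i in range(-2000, 6001)]

# Ideal point: the (usually unattainable) vector of individual minima
ideal = (min(f1(x) for x in xs), min(f2(x) for x in xs))

def distance(x, p=2):
    """Distance from (f1(x), f2(x)) to the ideal point in the L^p metric."""
    return (abs(f1(x) - ideal[0]) ** p + abs(f2(x) - ideal[1]) ** p) ** (1 / p)

best = min(xs, key=distance)
```

Swapping the metric (p = 1, p = 2, or a max-norm) generally selects a different compromise point, which is the "various metrics" remark on the slide.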
28 Method of steepest descent
The algorithm consists of the following steps:
1. Substitute the data: u0 – starting point, maxit – maximum number of iterations, e – required accuracy of the solution, i = 0 – iteration number.
2. Compute the gradient at ui.
29 Method of steepest descent
3. Choose the search direction.
4. Find the optimal solution along the chosen direction (using any line search method).
5. If the stop conditions are not satisfied, increase i and go to step 2.
30 Zigzag effect
Consider the problem of finding the minimum of the function f(u) = u1² + 3u2², starting from u0 = [-2, 3].
[Figure: isolines of f(u) with the zigzag path of steepest descent]
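The zigzag path can be reproduced with a short sketch (Python; the exact line search step for this quadratic, α = (gᵀg)/(gᵀAg) with A = diag(2, 6), is used here in place of a numerical line search):

```python
def f(u):
    return u[0] ** 2 + 3 * u[1] ** 2

def grad(u):
    return (2 * u[0], 6 * u[1])

def steepest_descent(u, steps=30):
    """Steepest descent with exact line search for f(u) = u1^2 + 3*u2^2.

    For this quadratic the optimal step along d = -grad(u) is
    alpha = (g.g) / (g.A.g) with A = diag(2, 6)."""
    path = [u]
    for _ in range(steps):
        g1, g2 = grad(u)
        if g1 == 0 and g2 == 0:
            break
        alpha = (g1 * g1 + g2 * g2) / (2 * g1 * g1 + 6 * g2 * g2)
        u = (u[0] - alpha * g1, u[1] - alpha * g2)
        path.append(u)
    return path

path = steepest_descent((-2.0, 3.0))
```

With exact line search, consecutive gradients are orthogonal, so the iterates turn by 90 degrees at every step: that is exactly the zigzag visible on the isoline plot.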
31 Genetic algorithm
The algorithm consists of the following steps:
1. Creation of a baseline population.
2. Compute the fitness of the whole population.
3. Selection.
4. Crossover.
5. Mutation.
6. If the stop conditions are not satisfied, go to step 2.
32 Creation of a baseline population
[Table: genotypes of the baseline population with their objective function values, f(x) = x²]
33 Selection
[Figure: parents' population selected from the baseline population]
41 Genetic algorithm
After mutation, the resulting individuals are recorded in the descendant population, which becomes the baseline population for the next iteration of the algorithm. If the obtained solution satisfies the stop condition, the procedure is terminated; otherwise selection, crossover and mutation are repeated.
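The loop described in the slides above can be sketched as a minimal genetic algorithm (a Python sketch, not the lecture's own code; the 5-bit genotypes, fitness f(x) = x², roulette-wheel selection, one-point crossover and the bit-flip mutation rate are all assumed textbook choices):

```python
import random

random.seed(1)
BITS = 5                                   # genotype length; x in [0, 31]

def decode(g):
    return int(g, 2)                       # binary genotype -> integer x

def fitness(g):
    return decode(g) ** 2                  # f(x) = x^2, to be maximized

def select(pop):
    """Roulette-wheel selection proportional to fitness."""
    total = sum(fitness(g) for g in pop)
    r = random.uniform(0, total)
    acc = 0.0
    for g in pop:
        acc += fitness(g)
        if acc >= r:
            return g
    return pop[-1]

def crossover(a, b):
    cut = random.randrange(1, BITS)        # one-point crossover
    return a[:cut] + b[cut:]

def mutate(g, rate=0.02):
    """Flip each bit independently with the given probability."""
    return "".join(c if random.random() > rate else "10"[int(c)] for c in g)

# Baseline population, then the selection/crossover/mutation loop
pop = ["".join(random.choice("01") for _ in range(BITS)) for _ in range(20)]
for _ in range(40):                        # generations
    pop = [mutate(crossover(select(pop), select(pop))) for _ in pop]

best = max(pop, key=fitness)
```

Because selection pressure favours large f(x), the population tends to concentrate near the genotype 11111 (x = 31), though as a stochastic method the run-to-run outcome varies with the seed.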