1 Optimization of thermal processes (2007/2008)
Maciej Marek
Czestochowa University of Technology, Institute of Thermal Machinery
Lecture 12

2 Overview of the lecture
Overview of some modern methods of optimization:
- Genetic algorithms (GA)
- Simulated annealing
- Neural-network-based optimization

3 Genetic algorithms (introduction)
If a design problem is characterised by:
- mixed continuous and discrete variables,
- discontinuous or nonconvex design spaces (feasible regions),
then standard techniques may be inefficient. It is also possible that only the relative (local) optimum closest to the starting point will be found. Genetic algorithms (GA) can in many cases find the global optimum with high probability.

4 Genetic algorithms (introduction)
Genetic algorithms are based on Darwin's theory of survival of the fittest (natural selection); Charles Darwin (1809-1882).
- Population of solutions: a set of candidate solutions encoded as binary strings, e.g. 001000100101, 101011100101, 101001000101, 101000100111, 101110100101, 101000101101.
- Reproduction: only "good" solutions may reproduce, so "good" solutions are reproduced in the next generations.
- Cross-over: two parent strings exchange substrings to produce offspring.
- Mutation: a random change in the solution, which may turn it into a better solution.

5 Genetic algorithms (introduction)
Characteristics of GA:
- A population of trial design vectors is used as the starting point (less likely to get trapped in a local optimum).
- GA use only the value of the objective function (direct method).
- Design variables are represented as strings of binary digits, which makes GA naturally applicable to integer programming; continuous variables have to be approximated by discrete ones.
- The objective function value plays the role of fitness.
- In every new generation (iteration):
  - parents are selected at random (from sufficiently good solutions),
  - crossover occurs and new solutions are obtained.
- GA is not just a random search technique: solutions with a better value of the objective function (better fitness) are favoured.
One full GA generation is sketched in the code below.
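A minimal Python sketch of one such generation, under stated assumptions: the toy "one-max" fitness (count of ones), the population size and the 1% mutation rate are illustrative choices, not part of the lecture.

```python
import random

STRING_LEN = 20   # length of each binary string (design vector)

def fitness(bits):
    # Toy "one-max" fitness: the number of ones in the string
    # (an assumption, standing in for a problem-specific fitness).
    return sum(bits)

def select_parents(population):
    # Selection probability proportional to fitness (roulette wheel).
    weights = [fitness(ind) for ind in population]
    return random.choices(population, weights=weights, k=2)

def crossover(p1, p2):
    site = random.randint(1, STRING_LEN - 1)  # crossover site, chosen at random
    return p1[:site] + p2[site:], p2[:site] + p1[site:]

def mutate(bits, rate=0.01):
    # Occasional random alteration of a binary digit.
    return [b ^ 1 if random.random() < rate else b for b in bits]

def next_generation(population):
    new_pop = []
    while len(new_pop) < len(population):
        c1, c2 = crossover(*select_parents(population))
        new_pop += [mutate(c1), mutate(c2)]
    return new_pop[:len(population)]

population = [[random.randint(0, 1) for _ in range(STRING_LEN)]
              for _ in range(30)]
for _ in range(50):
    population = next_generation(population)
print(max(fitness(ind) for ind in population))  # best fitness found
```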

6 Genetic algorithms – representation of design variables
In GA the design variables are represented as strings of binary digits, 0 and 1 (every decimal number is encoded as a binary number), e.g.

1 0 0 1 0 0 0 0 1 1 0 0 0 0 1 0 0 1 0 0   (a string of length 20)

A continuous variable is represented by q binary digits (the resolution depends on q), mapped onto its interval $[x^{(l)}, x^{(u)}]$ by

$$x = x^{(l)} + \frac{x^{(u)} - x^{(l)}}{2^q - 1} \sum_{i=0}^{q-1} b_i 2^i$$

where $b_i$ are the binary digits of the string.
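A minimal sketch of this decoding in Python; the function name and the example interval are illustrative assumptions.

```python
def decode(bits, x_lo, x_hi):
    # Map a q-digit binary string onto a continuous variable in [x_lo, x_hi];
    # the resolution is (x_hi - x_lo) / (2**q - 1).
    q = len(bits)
    integer = sum(b << i for i, b in enumerate(bits))  # integer value of the string
    return x_lo + (x_hi - x_lo) * integer / (2**q - 1)

# Example: a 10-digit string spanning the interval [0, 5]
print(decode([1, 0, 0, 1, 0, 0, 0, 0, 1, 1], 0.0, 5.0))
```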

7 Genetic algorithms – representation of objective function and constraints
GA finds the solution of an unconstrained problem. To solve a constrained minimization problem (minimize $f(\mathbf{X})$ subject to $g_j(\mathbf{X}) \le 0$), two transformations have to be made:
- transformation into an unconstrained problem, i.e. with the use of the penalty function method:
  $\varphi(\mathbf{X}) = f(\mathbf{X}) + R \sum_j \langle g_j(\mathbf{X}) \rangle^2$, where $R$ is the penalty parameter and $\langle g \rangle = \max(0, g)$,
- transformation into maximization of the fitness function:
  $F(\mathbf{X}) = \varphi_{\max} - \varphi(\mathbf{X})$, where $\varphi_{\max}$ is the largest value of $\varphi$ in the population.
Both transformations are sketched in the code below.
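A minimal sketch of the two transformations; the penalty parameter value and the tiny example problem are arbitrary assumptions.

```python
def penalized_objective(f, constraints, x, R=1000.0):
    # phi(x) = f(x) + R * sum_j <g_j(x)>^2, with <g> = max(0, g).
    penalty = sum(max(0.0, g(x)) ** 2 for g in constraints)
    return f(x) + R * penalty

def fitness_values(phis):
    # Turn "smaller phi is better" into "larger fitness is better":
    # F_i = phi_max - phi_i, with phi_max the largest value in the population.
    phi_max = max(phis)
    return [phi_max - phi for phi in phis]

f = lambda x: (x - 2.0) ** 2   # objective to minimize
g = lambda x: 1.0 - x          # constraint g(x) <= 0, i.e. x >= 1
phis = [penalized_objective(f, [g], x) for x in (0.5, 1.5, 2.5)]
print(fitness_values(phis))
```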

8 Genetic algorithms – genetic operators
Reproduction. Take a population of K solutions, e.g. 001000100101, 101011100101, 101001000101, 101000100111, 101110100101, 101000101101. Every solution has a value of the fitness function: $f_1, f_2, \ldots, f_K$. Strings are selected for reproduction with the probability

$$P_i = \frac{f_i}{\sum_{j=1}^{K} f_j}$$

The larger the fitness function, the larger the probability of selection for reproduction. Note:
- highly fit individuals live and reproduce,
- less fit individuals "die".
Now, crossover occurs between the selected parents, e.g.
Parent 1: 1 0 0 1 0 0 0 0 1 1 0 0 0 0 1 0 0 1 0 0
Parent 2: 1 1 0 1 0 1 0 1 0 1 0 1 1 0 0 1 0 1 1 0
A sketch of this selection rule follows.
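A minimal sketch of this selection rule (a cumulative "roulette wheel"; the names and the example fitness values are illustrative):

```python
import random

def roulette_select(population, fitnesses):
    # Pick one string with probability P_i = f_i / sum_j f_j.
    total = sum(fitnesses)
    r = random.uniform(0.0, total)
    acc = 0.0
    for individual, f in zip(population, fitnesses):
        acc += f
        if r <= acc:
            return individual
    return population[-1]  # guard against floating-point round-off

pop = ["001000100101", "101011100101", "101001000101"]
print(roulette_select(pop, [3.0, 5.0, 2.0]))
```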

9 Genetic algorithms – genetic operators
Crossover. The crossover site is selected at random (here after the 7th digit, marked with |) and the parents exchange substrings:

Parent 1:    1 0 0 1 0 0 0 | 0 1 1 0 0 0 0 1 0 0 1 0 0
Parent 2:    1 1 0 1 0 1 0 | 1 0 1 0 1 1 0 0 1 0 1 1 0

Offspring 1: 1 0 0 1 0 0 0 | 1 0 1 0 1 1 0 0 1 0 1 1 0
Offspring 2: 1 1 0 1 0 1 0 | 0 1 1 0 0 0 0 1 0 0 1 0 0

The new strings are placed in the new population. The process is continued.
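A minimal sketch of single-point crossover; the fixed site 7 merely reproduces the example above, in practice the site is drawn at random.

```python
import random

def single_point_crossover(p1, p2, site=None):
    if site is None:
        site = random.randint(1, len(p1) - 1)  # crossover site, selected at random
    return p1[:site] + p2[site:], p2[:site] + p1[site:]

p1 = [1,0,0,1,0,0,0,0,1,1,0,0,0,0,1,0,0,1,0,0]
p2 = [1,1,0,1,0,1,0,1,0,1,0,1,1,0,0,1,0,1,1,0]
o1, o2 = single_point_crossover(p1, p2, site=7)
print(o1)  # [1,0,0,1,0,0,0,1,0,1,0,1,1,0,0,1,0,1,1,0]
print(o2)  # [1,1,0,1,0,1,0,0,1,1,0,0,0,0,1,0,0,1,0,0]
```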

10 Genetic algorithms – genetic operators
Mutation. An occasional random alteration of a binary digit (0 becomes 1, 1 becomes 0) at a random location:

Some design vector: 1 0 0 1 0 0 0 1 0 1 0 1 1 0 0 1 0 1 1 0
New design vector:  1 0 0 1 0 0 1 1 0 1 0 1 1 0 0 1 0 1 1 0   (the 7th digit flipped)

Mutation introduces a random change into the genetic material. It helps to find the global optimum.
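A minimal sketch of this single-digit mutation (the function name is an assumption):

```python
import random

def mutate_one(bits):
    # Occasional random alteration: flip one binary digit (0 <-> 1)
    # at a random location in the string.
    new = list(bits)
    k = random.randrange(len(new))
    new[k] ^= 1
    return new

print(mutate_one([1,0,0,1,0,0,0,1,0,1,0,1,1,0,0,1,0,1,1,0]))
```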

11 Simulated annealing (introduction)
Simulated annealing belongs to the random search methods. However, it is designed to move toward the global minimum of the objective function. To see the drawbacks of a "naive" random search method, let's consider the following algorithm (sketched in code below):
1. Choose (at random) an initial starting point X1.
2. Make random moves along each coordinate direction to reach a point X*.
3. If f(X*) > f(X1), reject the point X* and find a new one. Otherwise, accept the point X* as the new starting point X1 and go to step 2.
4. Repeat until the objective function can't be reduced further.
The problem with such an algorithm is that it may get stuck in a local optimum.
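A minimal sketch of the naive method, with an illustrative test function and arbitrary step size and stopping rule:

```python
import random

def naive_random_search(f, x0, step=0.1, max_rejections=1000):
    # Accept a random move only if the objective does not increase.
    x = x0
    rejections = 0
    while rejections < max_rejections:
        x_new = [xi + random.uniform(-step, step) for xi in x]
        if f(x_new) > f(x):
            rejections += 1            # reject the point, find a new one
        else:
            x, rejections = x_new, 0   # accept as the new starting point
    return x

# A one-dimensional function with a local and a global minimum:
f = lambda x: x[0]**4 - 3.0 * x[0]**2 + x[0]
print(naive_random_search(f, [2.0]))   # may get stuck near the local minimum x ~ 1.1
```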

12 Simulated annealing – naive random search method
[Figure: a function with a local and a global optimum. First step: the objective function increases, so the point is rejected. Second step: point accepted. Third step: point accepted. The search then stops at the local optimum and cannot leave it, although the global optimum lies elsewhere.]
Thus, in this version of the random search method, if we find the local optimum there is no way to leave this point.

13 Simulated annealing – the main concept
With the use of the simulated annealing technique, transitions out of a local minimum are possible. A move that reduces the objective function is accepted unconditionally. A move that increases the objective function by $\Delta f > 0$ is accepted with a probability given by the Metropolis criterion:

$$P = e^{-\Delta f / kT}$$

where $T$ is a parameter called the temperature and $k$ is a scaling constant (the analogue of the Boltzmann constant). As $T \to \infty$, $P \to 1$. So, the larger the temperature, the less constrained are the movements. A sketch of this acceptance rule follows.
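A minimal sketch of the Metropolis acceptance rule (k = 1 is an arbitrary choice):

```python
import math
import random

def metropolis_accept(delta_f, T, k=1.0):
    # Downhill moves (delta_f <= 0) are accepted unconditionally;
    # uphill moves are accepted with probability exp(-delta_f / (k*T)).
    if delta_f <= 0.0:
        return True
    return random.random() < math.exp(-delta_f / (k * T))

print(metropolis_accept(0.5, T=10.0))  # high T: almost always accepted
print(metropolis_accept(0.5, T=0.01))  # low T: almost always rejected
```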

14 Simulated annealing – the main concept
The algorithm starts with a high temperature (a large value of T) and in the subsequent steps the temperature is reduced slowly. The global optimum is then found with high probability, even for objective functions with many local minima. The change of T is defined by the so-called cooling schedule.
The name of the method is derived from the simulation of thermal annealing of solids (metals). Slow cooling of a heated solid ensures proper solidification with a highly ordered crystalline structure; rapid cooling causes defects inside the material. The analogy:
- simulated annealing – slow cooling – lowest internal energy – global minimum,
- naive random search – rapid cooling – high internal energy – local optimum.
A complete SA loop with a simple cooling schedule is sketched below.
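A minimal sketch of the whole method with a geometric cooling schedule; all parameter values (T0, the cooling factor, the step size) are illustrative assumptions.

```python
import math
import random

def simulated_annealing(f, x0, T0=10.0, cooling=0.95, steps_per_T=50,
                        T_min=1e-3, step=0.5):
    x, fx, T = x0, f(x0), T0
    best, f_best = x, fx
    while T > T_min:
        for _ in range(steps_per_T):
            x_new = [xi + random.uniform(-step, step) for xi in x]
            delta = f(x_new) - fx
            # Metropolis criterion: uphill moves possible while T is high.
            if delta <= 0.0 or random.random() < math.exp(-delta / T):
                x, fx = x_new, fx + delta
                if fx < f_best:
                    best, f_best = x, fx
        T *= cooling   # geometric cooling schedule: reduce T slowly
    return best, f_best

f = lambda x: x[0]**4 - 3.0 * x[0]**2 + x[0]   # local and global minima
print(simulated_annealing(f, [2.0]))           # should approach the global one
```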

15 Simulated annealing – some of the features
- The quality of the final solution is not affected by the initial guess (however, the computational time may increase with a worse starting point).
- The objective function doesn't have to be regular (continuous, differentiable).
- The feasible region doesn't have to be convex (the convergence is not influenced by convexity).
- The method can be used to solve mixed-integer, discrete or continuous problems.
- For problems with constraints, a modified objective function may be formulated, just as in the case of genetic algorithms (i.e. the penalty function approach).

16 Neural-network-based optimization
A neural network is a parallel network of interconnected simple processors (neurons). A single neuron accepts a set of inputs $x_i$ from other neurons and computes an output $a$, typically as an activation function applied to the weighted sum of the inputs:

$$a = \varphi\left(\sum_i w_i x_i\right)$$

The weights $w_i$ are not specified in advance; they are determined in the learning process. A sketch of a single neuron follows.
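A minimal sketch of a single neuron; the sigmoid activation is an assumption, since the lecture does not fix a particular activation function.

```python
import math

def neuron(inputs, weights):
    # Weighted sum of the inputs passed through a sigmoid activation.
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-s))

print(neuron([0.5, -1.0, 2.0], [0.1, 0.4, -0.2]))
```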

17 Neural-network-based optimization
Neurons may be connected to form multilayer networks:
- input layer,
- hidden layer,
- output layer.
Such a network may be trained to "solve" specific problems. A sketch of a forward pass through such a network is given below.
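A minimal sketch of a forward pass through a small network (2 inputs, 3 hidden neurons, 1 output; the weights are arbitrary):

```python
import math

def neuron(inputs, weights):
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-s))   # sigmoid activation (an assumption)

def layer(inputs, weight_matrix):
    # One layer: every neuron sees all of the layer's inputs.
    return [neuron(inputs, weights) for weights in weight_matrix]

def forward(x, hidden_weights, output_weights):
    return layer(layer(x, hidden_weights), output_weights)

# 2 inputs -> 3 hidden neurons -> 1 output; weights chosen arbitrarily.
hidden = [[0.5, -0.3], [0.8, 0.1], [-0.6, 0.9]]
output = [[0.2, -0.5, 0.7]]
print(forward([1.0, 0.5], hidden, output))
```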

18 Neural-network-based optimization
- The strengths of the various interconnections (weights) may be considered as the representation of the knowledge contained in the network.
- The network is trained to minimize the error between the actual output of the output layer and the target output for all the input patterns.
- The training is just selecting the weights w_i; the learning schemes govern how the weights are to be varied to minimize the error.
Possible usage:
- Train the network on a specific set of input patterns (supply input parameters and solutions of the given problems).
- Supply input parameters different from those of the training set.
- The network should return the solution of the problem (approximate, at least).
A toy training loop is sketched below.
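A minimal sketch of such error minimization; a finite-difference weight update is used here purely for illustration, as the lecture does not name a specific learning scheme.

```python
import math

def neuron(inputs, weights):
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-s))

def error(weights, patterns):
    # Squared error between actual and target outputs over all input patterns.
    return sum((neuron(x, weights) - t) ** 2 for x, t in patterns)

def train_step(weights, patterns, lr=0.5, h=1e-5):
    # Vary each weight downhill on the error (finite-difference gradient).
    grads = []
    for i in range(len(weights)):
        bumped = list(weights)
        bumped[i] += h
        grads.append((error(bumped, patterns) - error(weights, patterns)) / h)
    return [w - lr * g for w, g in zip(weights, grads)]

patterns = [([0.0, 1.0], 0.0), ([1.0, 0.0], 1.0)]   # (inputs, target output)
w = [0.1, 0.1]
for _ in range(200):
    w = train_step(w, patterns)
print(error(w, patterns))   # should decrease toward zero after training
```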

19 Thank you for your attention

