2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

1 2005MEE Software Engineering Lecture 11 – Optimisation Techniques

2 Topics
Optimisation
–what is it?
–finite and infinite search spaces
–deterministic and stochastic approaches
Optimisation techniques
–exhaustive searches
–gradient ascent
–Monte-Carlo (random) searches
–genetic algorithms

3 What is Optimisation?
The process of finding an optimal or sub-optimal set of parameters to solve a problem
For example:
–finding values of m and c to form the line which best fits a given set of points
–determining the best setup for a racing car (springs, dampers, wings, ride height, etc.)
–maximising profits from mining or agricultural activities
–the travelling salesman problem
–data mining applications

4 Types of Optimisation
Mathematical optimisation
–optimal parameters can be calculated mathematically in fixed time
–e.g. finding the maximum of a parabola
–generally trivial problems, almost never applicable to real-world situations
Numerical optimisation
–using algorithms to find a near-optimal solution
–quality of the solution can depend upon the algorithm used, and on chance
–the result is rarely guaranteed to be the global maximum

5 Parameter Constraints
It is usually necessary to place limits on parameter values
–to reduce the search space (make it finite)
–to reflect physical properties
This knowledge of the problem is often called a priori information
Parameters without limits create an infinite search space
–a solution is possible only if the function's behaviour is well defined at the extrema (strictly monotonic, etc.)
Common sense is usually sufficient to constrain parameters in 'real world' applications

6 Goal of Optimisation
To find the 'best' solution to the problem
–how is 'the best' defined?
Evaluation criteria
–simple for mathematical functions – the highest value is best (or worst)
–very difficult for many real-world problems
Often, near enough is good enough
–finding 'the best' may be too difficult or take far too long
–a solution which is near-optimal may be far simpler and faster to compute

7 Optimisation Example
[Figure: plot of f(x) against x, marking the calculated maximum value and the actual maximum value]
In this example, the optimisation technique used has not found the global maximum, but rather a local maximum which is nearly as good

8 Optimisation Approaches
Deterministic optimisation:
–the algorithm is not random in any way
–given the same search space and start conditions, the result will always be the same
Stochastic optimisation:
–the algorithm is partially or totally random
–each run of the algorithm may give different results, even with the same input
Stochastic methods are generally superior, as they are less likely to become stuck in a local maximum

9 Optimisation Algorithms
General approach:
–pick starting point(s) within the search space
–evaluate the function at these points
–refine the parameter estimate(s)
–continue until criteria are met
Known as 'iterative refinement'
Most optimisation algorithms use some version of this
Requires a method of evaluation
–can be mathematical or practical
–often relies on a problem model
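The iterative-refinement loop above can be sketched in a few lines. The lecture gives no code, so this is a minimal Python sketch under assumed names (`iterative_refinement`, a toy parabola as the function to maximise): pick a starting point, evaluate, and keep a refined estimate only when it improves.

```python
import random

def iterative_refinement(f, lo, hi, steps=1000, step_size=0.1, seed=0):
    """Generic iterative refinement: start somewhere in the search
    space, evaluate, and repeatedly refine the estimate."""
    rng = random.Random(seed)
    best_x = rng.uniform(lo, hi)      # pick a starting point
    best_f = f(best_x)                # evaluate the function there
    for _ in range(steps):
        # propose a refined estimate near the current best, kept in range
        cand = min(hi, max(lo, best_x + rng.uniform(-step_size, step_size)))
        fc = f(cand)
        if fc > best_f:               # keep the refinement only if it improves
            best_x, best_f = cand, fc
    return best_x, best_f

# Toy evaluation function: a parabola with its maximum at x = 2
x, fx = iterative_refinement(lambda x: -(x - 2.0) ** 2, 0.0, 5.0)
```

The stopping criterion here is simply a fixed iteration count; a tolerance on the improvement per step would work equally well.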

10 Optimisation Algorithms
Exhaustive search:
–every possible combination of parameter values is evaluated
–only possible for finite spaces
–generally infeasible for most problems
–accuracy is determined by granularity
Monte-Carlo algorithm:
–random points are evaluated
–the best after a specified time is chosen as optimal
–more time produces better results

11 Gradient Ascent
Estimates are refined by determining the gradient and following it upwards
–starting points are still required (random?)
–high probability of finding local maxima
–can find good solutions in a short time
–the best result is taken as the optimum parameters
Parameters required:
–distance to move at each step
–starting locations
–number of starting points
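A minimal Python sketch of the idea, assuming a numerically estimated gradient (central difference) since no derivative is given. The three parameters from the slide appear directly: the step distance, the starting locations, and the number of starts; the best result over all starts is taken as the optimum.

```python
def gradient_ascent(f, x0, step=0.1, iters=200, eps=1e-6):
    """Follow the numerically estimated gradient upward from x0."""
    x = x0
    for _ in range(iters):
        grad = (f(x + eps) - f(x - eps)) / (2 * eps)  # central difference
        x = x + step * grad                            # move uphill
    return x

# Multi-start: run from several starting points, keep the best result.
f = lambda x: -(x - 3.0) ** 2      # toy function with its maximum at 3
starts = [0.0, 2.0, 4.0]
results = [gradient_ascent(f, s) for s in starts]
best = max(results, key=f)
```

On this smooth, unimodal toy function every start converges to the maximum; on a multi-modal function, each start would climb only its local peak, which is exactly the weakness the next slides discuss.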

12 Gradient Ascent
[Figure: plot of f(x) against x, showing the range of starting points which will give the global maximum]

13 Gradient Ascent
Only a small range of starting values will give the global maximum
–other starting points will give local maxima only
–unsuitable for many functions and problems
'Smoothing' the function may lead to better results
–requires knowledge of the problem – how much smoothing should be performed?
–small peaks are removed, leaving only large peaks
–the range of 'good' starting values is increased
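One simple way to smooth is to average the function over a window, a hypothetical sketch (the lecture does not prescribe a method): a narrow spike that could trap gradient ascent is averaged away, while broad peaks survive.

```python
def smooth(f, width=1.0, samples=11):
    """Return a smoothed version of f: the average of f over a window
    of the given width centred on x. Narrow peaks are averaged away."""
    def smoothed(x):
        offsets = [width * (i / (samples - 1) - 0.5) for i in range(samples)]
        return sum(f(x + d) for d in offsets) / samples
    return smoothed

# A narrow spike: a small local peak that gradient ascent could get stuck on.
spike = lambda x: 1.0 if abs(x) < 0.05 else 0.0
smoothed_spike = smooth(spike)
```

The window width is exactly the "how much smoothing?" knob from the slide: too wide and genuine peaks are flattened too.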

14 Smoothing Function
[Figure: plot of f(x) against x, comparing the original starting range for the global maximum with the wider starting range of the smoothed function]

15 Genetic Algorithms
Relatively new approach to optimisation
–a biological evolution model is used
–can lead to much better solutions
–able to 'escape' local maxima
A set of points is used at each iteration
–the next set is created using combinations of the previous set
–the chance of a point being used is dependent upon its fitness
–a 'survival of the fittest' algorithm

16 Genetic Algorithms
The algorithm:
–choose a starting set of points
–repeat:
  evaluate the fitness of each point
  choose two parents based on fitness
  combine the parents using a combination function
  possibly add a random mutation
  repeat until a new set is created
–until the iteration limit is reached or a suitable solution is found
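The steps above map directly onto code. This is a sketch in Python with assumed choices (real-valued single parameter, blend crossover as the combination function, Gaussian mutation, fitness-proportional parent selection); the lecture leaves all of these open.

```python
import random

def genetic_algorithm(fitness, lo, hi, pop_size=30, generations=60,
                      mutation_rate=0.2, seed=2):
    """GA loop: evaluate fitness, choose parents, combine, mutate."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]   # starting set
    for _ in range(generations):
        scores = [fitness(p) for p in pop]                 # evaluate fitness
        m = min(scores)
        weights = [s - m + 1e-9 for s in scores]           # non-negative weights
        new_pop = []
        while len(new_pop) < pop_size:                     # build the new set
            p1, p2 = rng.choices(pop, weights=weights, k=2)  # two parents
            a = rng.random()
            child = a * p1 + (1 - a) * p2                  # combination function
            if rng.random() < mutation_rate:               # random mutation
                child += rng.gauss(0, (hi - lo) * 0.05)
            new_pop.append(min(hi, max(lo, child)))
        pop = new_pop
    return max(pop, key=fitness)

# Toy problem: maximise a parabola with its peak at x = 7
best = genetic_algorithm(lambda x: -(x - 7.0) ** 2, 0.0, 10.0)
```

The termination criterion here is a fixed generation count; a fitness threshold ("suitable solution found") is the usual alternative.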

17 Combination Functions
Parameters are treated as chromosomes
–must be merged in such a way as to incorporate features of both parents
–the choice of merging function is critical to success
Implementations:
–binary: each parameter is treated as a binary string, and bits are chosen randomly from each parent to form the new bit pattern
  leads to problems, as bits are not equal in value!
–parameter-based: entire parameters from each parent are randomly used
  does not allow parameter values to change

18 Crossover Combination
Based on a biological model
Each chromosome (parameter, or subset of a parameter) pair is randomly split
–one section of parent 1 is joined with the other section of parent 2
–repeated for each chromosome
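Single-point crossover on bit-string chromosomes can be sketched as follows (a minimal illustration, with chromosomes as Python strings of '0'/'1'):

```python
import random

def crossover(parent1, parent2, rng):
    """Single-point crossover: split the chromosome pair at a random
    point and join one section of parent 1 with the other section of
    parent 2 (and vice versa for the second child)."""
    point = rng.randint(1, len(parent1) - 1)   # never split at an end
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

rng = random.Random(3)
c1, c2 = crossover("00000000", "11111111", rng)
```

With multiple parameters, this split would be repeated once per chromosome, as the slide says.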

19 Mutation
Mutations are random changes to parameters
–generally applied with a pre-defined probability
–can be large or small changes
  flip bits randomly
  small adjustments to parameter values
Allow new solutions to be discovered that may not otherwise have been found
–allows escape from local maxima
–can also cause a good solution to become worse!
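Both mutation styles from the slide, sketched in Python (names are my own): random bit flips for binary chromosomes, and a small Gaussian adjustment for real-valued parameters.

```python
import random

def mutate_bits(bits, rate, rng):
    """Flip each bit independently with the given probability."""
    return "".join(("1" if b == "0" else "0") if rng.random() < rate else b
                   for b in bits)

def mutate_value(x, sigma, rng):
    """Small random adjustment to a real-valued parameter."""
    return x + rng.gauss(0.0, sigma)

rng = random.Random(4)
mutated = mutate_bits("0000000000", 0.1, rng)
adjusted = mutate_value(5.0, 0.1, rng)
```

Because a mutation can just as easily make a good solution worse, the rate is normally kept low; elitism (slide 21) is one way to make sure the best solution survives regardless.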

20 Choosing Parents
Use the fitness score to determine the probability of each point being a parent
–can lead to many points being completely ignored
–an entire generation could be spawned by only two parents
Alternative approach:
–rank each parent, and choose based on rank
–allows even very unfit parents a small chance to reproduce
–can help avoid stagnant gene pools
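The two selection schemes side by side, as a Python sketch (function names are my own): fitness-proportional ("roulette-wheel") selection weights by raw score, while rank selection weights by position, so even the least fit individual keeps a small chance.

```python
import random

def roulette_select(pop, scores, rng):
    """Fitness-proportional (roulette-wheel) selection."""
    m = min(scores)
    weights = [s - m + 1e-9 for s in scores]   # shift to non-negative
    return rng.choices(pop, weights=weights, k=1)[0]

def rank_select(pop, scores, rng):
    """Rank-based selection: weight by rank, not raw fitness, so even
    very unfit parents get a small chance to reproduce."""
    ranked = sorted(zip(scores, pop))           # worst first
    weights = list(range(1, len(pop) + 1))      # rank 1 .. n
    return rng.choices([p for _, p in ranked], weights=weights, k=1)[0]

rng = random.Random(5)
pop = ["a", "b", "c", "d"]
scores = [1.0, 2.0, 3.0, 10.0]
picks = [rank_select(pop, scores, rng) for _ in range(1000)]
```

Under roulette selection here, "d" would dominate almost completely (weight 10 of 16); under rank selection it is merely four times as likely as "a", which keeps the gene pool from stagnating.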

21 Elitism
Alternative form of reproduction where the 'best' parents move directly into the next generation
–helps prevent 'bad' generations
–ensures the solution never becomes worse in successive generations
–can lead to inbreeding (local maxima)
Requires a careful choice of parental selection method
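Elitism is a small change to the generation step, shown here as a Python sketch with a toy (hypothetical) breed function: the top individuals are copied through untouched, so the best solution can never get worse between generations.

```python
import random

def next_generation_with_elitism(pop, fitness, breed, elite_count=2):
    """Copy the best individuals straight into the next generation,
    then fill the remaining places by breeding."""
    ranked = sorted(pop, key=fitness, reverse=True)
    elites = ranked[:elite_count]                # best move straight through
    children = [breed(pop) for _ in range(len(pop) - elite_count)]
    return elites + children

rng = random.Random(6)

def breed(pop):
    """Toy breed function for illustration: average of two random members."""
    return (rng.choice(pop) + rng.choice(pop)) / 2.0

pop = [1.0, 4.0, 2.0, 9.0, 5.0]
new_pop = next_generation_with_elitism(pop, fitness=lambda x: x, breed=breed)
```

The inbreeding risk from the slide is visible here: if the elites dominate breeding as well, the population converges on their neighbourhood, which may be only a local maximum.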

22 Examples
http://math.hws.edu/xJava/GA/
http://www.rennard.org/alife/english/gavgb.html

