2005MEE Software Engineering Lecture 11 – Optimisation Techniques.


Topics
Optimisation
–what is it?
–finite and infinite search spaces
–deterministic and stochastic approaches
Optimisation techniques
–exhaustive searches
–gradient ascent
–Monte-Carlo (random) searches
–genetic algorithms

What is Optimisation?
The process of finding an optimal, or acceptably near-optimal, set of parameters to solve a problem. For example:
–finding the values of m and c for the line that best fits a given set of points
–determining the best setup for a racing car (springs, dampers, wings, ride height, etc.)
–maximising profits from mining or agricultural activities
–the travelling salesman problem
–data mining applications

Types of Optimisation
Mathematical optimisation
–optimal parameters can be calculated mathematically in fixed time
–e.g. finding the maximum of a parabola
–generally limited to trivial problems; almost never applicable to real-world situations
Numerical optimisation
–uses algorithms to find a near-optimal solution
–quality of the solution can depend on the algorithm used, and on chance
–the result is rarely guaranteed to be the global maximum

Parameter Constraints
It is usually necessary to place limits on parameter values:
–to reduce the search space (make it finite)
–to reflect physical properties
This knowledge of the problem is often called a priori information. Parameters without limits create an infinite search space; a solution is then possible only if the function's behaviour is well defined at the extrema (strictly monotonic, etc.). Common sense is usually sufficient to constrain parameters in 'real world' applications.

Goal of Optimisation
To find the 'best' solution to the problem: but how is 'the best' defined?
Evaluation criteria
–simple for mathematical functions: the highest (or lowest) value is best
–very difficult for many real-world problems
Often, near enough is good enough:
–finding 'the best' may be too difficult or take far too long
–a near-optimal solution may be far simpler and faster to compute

Optimisation Example
[Figure: plot of f(x) against x, marking the calculated maximum value and the actual maximum value.]
In this example, the optimisation technique used has not found the global maximum, but rather a local maximum which is nearly as good.

Optimisation Approaches
Deterministic optimisation:
–the algorithm is not random in any way
–given the same search space and start conditions, the result will always be the same
Stochastic optimisation:
–the algorithm is partially or totally random
–each run of the algorithm may give different results, even with the same input
Stochastic methods are generally superior, as they are less likely to become stuck in a local maximum.

Optimisation Algorithms
The general approach:
–pick starting point(s) within the search space
–evaluate the function at this point
–refine the parameter estimate(s)
–continue until the stopping criteria are met
This is known as 'iterative refinement', and most optimisation algorithms use some version of it. It requires a method of evaluation, which can be mathematical or practical, and often relies on a model of the problem. A minimal sketch of the loop follows.
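A minimal Python sketch of iterative refinement, assuming a maximisation problem; the names (`evaluate`, `refine`) are illustrative, not from the lecture:

```python
import random

def iterative_refinement(evaluate, initial, refine, max_iters=1000):
    # Keep a best-so-far estimate and repeatedly propose refinements,
    # accepting a candidate only when it scores better.
    best = initial
    best_score = evaluate(best)
    for _ in range(max_iters):
        candidate = refine(best)
        score = evaluate(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score

# Example use: maximise f(x) = -(x - 3)^2 by nudging x at random.
x, score = iterative_refinement(
    evaluate=lambda x: -(x - 3) ** 2,
    initial=0.0,
    refine=lambda x: x + random.uniform(-0.1, 0.1),
    max_iters=5000)
```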

Optimisation Algorithms
Exhaustive search:
–every possible combination of parameter values is evaluated
–only possible for finite spaces
–generally infeasible for most problems
–accuracy is determined by the granularity of the search
Monte-Carlo algorithm:
–random points are evaluated
–the best point found after a specified time is chosen as the optimum
–more time produces better results
Both searches are sketched below.
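Minimal sketches of both searches over a one-dimensional space; the objective `f` is an assumed toy function with several local maxima, not one from the lecture:

```python
import math
import random

def f(x):
    # Assumed example objective with several local maxima.
    return math.sin(3 * x) + 0.5 * math.sin(x)

def exhaustive_search(f, lo, hi, step):
    # Evaluate every grid point; accuracy is set by the granularity (step).
    best_x, best_y = lo, f(lo)
    x = lo + step
    while x <= hi:
        y = f(x)
        if y > best_y:
            best_x, best_y = x, y
        x += step
    return best_x

def monte_carlo_search(f, lo, hi, n_samples):
    # Evaluate random points and keep the best; more samples
    # (i.e. more time) produce better results.
    xs = (random.uniform(lo, hi) for _ in range(n_samples))
    return max(xs, key=f)
```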

Gradient Ascent
Estimates are refined by determining the gradient and following it upwards:
–starting points are still required (chosen at random?)
–high probability of finding only a local maximum
–can find good solutions in a short time
–the best result found is taken as the optimum parameters
Parameters required:
–the distance to move at each step
–the starting locations
–the number of starting points
A sketch follows.
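A minimal sketch of gradient ascent with a numerically estimated gradient and multiple random starting points; the step size, iteration count and point count are arbitrary illustrative defaults:

```python
import random

def gradient_ascent(f, x0, step=0.01, iters=500, h=1e-6):
    # Follow a numerically estimated slope uphill from x0.
    x = x0
    for _ in range(iters):
        grad = (f(x + h) - f(x - h)) / (2 * h)  # central-difference slope
        x += step * grad                        # distance moved per step
    return x

def multi_start_ascent(f, lo, hi, n_points=20):
    # Run the ascent from several random starting locations and keep the
    # best end point, reducing the risk of a poor local maximum.
    starts = [random.uniform(lo, hi) for _ in range(n_points)]
    return max((gradient_ascent(f, x0) for x0 in starts), key=f)
```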

Gradient Ascent
[Figure: plot of f(x) against x, showing the range of starting points which will give the global maximum.]

Gradient Ascent
Only a small range of starting values will give the global maximum:
–other starting points will give local maxima only
–this makes the method unsuitable for many functions and problems
'Smoothing' the function may lead to better results:
–requires knowledge of the problem: how much smoothing should be performed?
–small peaks are removed, leaving only the large peaks
–the range of 'good' starting values is increased
One simple smoothing approach is sketched below.
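One simple way to smooth a function is to average it over a window, as in this illustrative sketch; the window width is the "how much smoothing?" knob and must come from knowledge of the problem:

```python
def smoothed(f, width, n_samples=11):
    # Return a smoothed version of f: each value is the average of f over
    # a window of the given width, so narrow peaks are flattened while
    # broad peaks survive.
    def g(x):
        step = width / (n_samples - 1)
        lo = x - width / 2
        return sum(f(lo + i * step) for i in range(n_samples)) / n_samples
    return g
```

A typical use is to run gradient ascent on `smoothed(f, width)` to locate a broad peak, then refine the result on the original `f`.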

Smoothing Function
[Figure: plot of f(x) against x, comparing the original starting range for the global maximum with the wider starting range of the smoothed function.]

Genetic Algorithms
A relatively new approach to optimisation:
–a model of biological evolution is used
–can lead to much better solutions
–able to 'escape' local maxima
A set of points is used at each iteration:
–the next set is created from combinations of the previous set
–the chance of a point being used depends upon its fitness
–a 'survival of the fittest' algorithm

Genetic Algorithms
The algorithm (sketched below):
–choose a starting set of points
–repeat:
 –evaluate the fitness of each point
 –repeat until a new set is created:
  –choose two parents based on fitness
  –combine the parents using a combination function
  –possibly add a random mutation
–until the iteration limit is reached or a suitable solution is found
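A compact Python sketch of this loop; `random_point`, `combine` and `mutate` are assumed problem-specific callables, and fitness scores are assumed non-negative so they can serve directly as selection weights:

```python
import random

def genetic_algorithm(fitness, random_point, combine, mutate,
                      pop_size=50, max_generations=100):
    # Starting set of points.
    population = [random_point() for _ in range(pop_size)]
    for _ in range(max_generations):
        # Evaluate the fitness of each point.
        scores = [fitness(p) for p in population]
        new_set = []
        while len(new_set) < pop_size:
            # Choose two parents based on fitness, combine them,
            # then possibly add a random mutation.
            p1, p2 = random.choices(population, weights=scores, k=2)
            new_set.append(mutate(combine(p1, p2)))
        population = new_set
    return max(population, key=fitness)
```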

Combination Functions
The parameters are treated as chromosomes:
–they must be merged in such a way as to incorporate features of both parents
–the choice of merging function is critical to success
Implementations (both sketched below):
–binary: each parameter is treated as a binary string, and bits are chosen randomly from each parent to form the new bit pattern
 –this leads to problems, as the bits are not equal in value!
–parameter-based: entire parameters from each parent are used at random
 –this does not allow parameter values themselves to change
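Illustrative sketches of both merging schemes, one for integer-encoded parameters and one for lists of real-valued parameters:

```python
import random

def binary_combine(a, b, n_bits=16):
    # Bit-level merge: each bit of the child is taken from one parent at
    # random. As noted above, bits are not equal in value: a high-order
    # bit changes the parameter far more than a low-order one.
    child = 0
    for i in range(n_bits):
        parent = a if random.random() < 0.5 else b
        child |= parent & (1 << i)
    return child

def parameter_combine(a, b):
    # Parameter-level merge: each whole parameter comes from one parent,
    # so no new parameter values can ever appear.
    return [pa if random.random() < 0.5 else pb for pa, pb in zip(a, b)]
```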

Crossover Combination
Based on a biological model. Each chromosome (a parameter, or subset of a parameter) pair is randomly split:
–one section of parent 1 is joined with the other section of parent 2
–this is repeated for each chromosome
A sketch follows.
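A minimal sketch of single-point crossover on chromosomes represented as Python sequences (lists of bits or genes):

```python
import random

def single_point_crossover(a, b):
    # Split both parents at the same random point and join the head of
    # one to the tail of the other, producing two children.
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:], b[:point] + a[point:]
```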

Mutation
Mutations are random changes to parameters:
–applied with a (generally pre-defined) probability
–can be large or small changes: flipping bits at random, or making small adjustments to parameter values
Mutations allow new solutions to be discovered that might otherwise never be found:
–they allow escape from local maxima
–they can also cause a good solution to become worse!
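Illustrative sketches of both kinds of mutation, with arbitrary example rates:

```python
import random

def mutate_bits(bits, rate=0.01):
    # Flip each bit (0/1 integers) independently with a small,
    # pre-defined probability.
    return [b ^ 1 if random.random() < rate else b for b in bits]

def mutate_params(params, rate=0.1, scale=0.05):
    # Make a small random adjustment to each real-valued parameter.
    return [p + random.gauss(0, scale) if random.random() < rate else p
            for p in params]
```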

Choosing Parents
Use the fitness score to determine the probability of each point being a parent:
–this can lead to many points being completely ignored
–an entire generation could be spawned by only two parents
Alternative approach:
–rank each candidate, and choose based on rank
–this allows even very unfit parents a small chance to reproduce
–can help avoid stagnant gene pools
Both selection schemes are sketched below.
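Sketches of the two selection schemes; the rank-based version gives every candidate a non-zero weight:

```python
import random

def fitness_proportionate_select(population, scores):
    # Probability of parenthood proportional to raw fitness; very unfit
    # points may be ignored entirely (scores assumed non-negative).
    return random.choices(population, weights=scores, k=1)[0]

def rank_select(population, fitness):
    # Weight by rank rather than raw fitness, so even the weakest
    # individual keeps a small chance to reproduce.
    ranked = sorted(population, key=fitness)            # worst first
    weights = list(range(1, len(ranked) + 1))           # rank 1..N
    return random.choices(ranked, weights=weights, k=1)[0]
```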

Elitism
An alternative form of reproduction where the 'best' parents move directly into the next generation:
–helps prevent 'bad' generations
–ensures the solution never becomes worse in successive generations
–can lead to inbreeding (local maxima)
Requires a careful choice of parental selection method. A sketch follows.
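A minimal sketch of an elitist generation step; `breed` stands in for the normal select-combine-mutate pipeline and is an assumed helper, not from the lecture:

```python
def next_generation(population, fitness, breed, n_elite=2):
    # Copy the best n_elite points unchanged into the next generation,
    # then fill the remainder by normal breeding. The best score can
    # then never fall between generations.
    elite = sorted(population, key=fitness, reverse=True)[:n_elite]
    children = [breed(population) for _ in range(len(population) - n_elite)]
    return elite + children
```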

Examples
gavgb.html