
1 Iterative Improvement Algorithm 2012/03/20

2 Outline
– Local Search Algorithms
– Hill-Climbing Search
– Simulated Annealing Search
– Local Beam Search
– Genetic Algorithms

3 Iterative Improvement Algorithm
General idea of local search
– start with a complete configuration
– make modifications to improve its quality
Domains: e.g., 8-queens, VLSI layout, etc.
– state description = solution
– the path to a solution is irrelevant
Approaches
– Hill-Climbing (f = quality, to be maximized)
– Gradient Descent (f = cost, to be minimized)
– Simulated Annealing

4 Local Search Algorithms and Optimization Problems
Uninformed search
– looking for a solution, where the solution is a path from start to goal
– at each intermediate point along a path, we have no prediction of the value of the path
Informed search
– again, looking for a path from start to goal
– this time, we have insight regarding the value of intermediate solutions

5 Local Search Algorithms and Optimization Problems (cont.)
What if the path is not important, just the goal?
– the goal state itself is the solution
– the path to the goal need not be recorded
– state space = set of complete configurations
– find a configuration satisfying the constraints
Examples
– What quantities of quarters, nickels, and dimes add up to $17.45 while minimizing the total number of coins?
– the 8-queens problem

6 Local Search Algorithms
Local search does not keep track of previous solutions
– it operates using a single current state (rather than multiple paths) and generally moves only to neighbors of that state
Advantages
– uses a small amount of memory (usually a constant amount)
– can often find reasonable (note: not necessarily optimal) solutions in large or infinite state spaces

7 Optimization Problems
Goal: find the best state according to an objective function
Example: minimize
f(q, d, n) = 1,000,000 if 0.25q + 0.10d + 0.05n ≠ 17.45
f(q, d, n) = q + d + n otherwise
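A minimal Python sketch of this objective (the function name and the float-tolerance check are my additions; the slide only gives the piecewise definition):

def f(q: int, d: int, n: int) -> int:
    """1,000,000 if the coins do not total $17.45; otherwise the coin count."""
    total = 0.25 * q + 0.10 * d + 0.05 * n
    if abs(total - 17.45) > 1e-9:   # tolerance guards against float rounding
        return 1_000_000            # large penalty for infeasible states
    return q + d + n                # number of coins, to be minimized

print(f(69, 2, 0))   # 69 quarters + 2 dimes = $17.45 -> 71
print(f(1, 1, 1))    # $0.40, infeasible -> 1000000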

8 Looking for a Global Maximum (or Minimum)
(Figure: a one-dimensional state-space landscape plotting the objective function over the state space, labeling the current state, a local maximum, a "flat" local maximum, a shoulder, and the global maximum.)

9 Hill-Climbing Search
"Like climbing Everest in thick fog with amnesia"
Only record the current state and its evaluation instead of maintaining a search tree

function HILL-CLIMBING(problem) returns a state that is a local maximum
  inputs: problem, a problem
  local variables: current, a node; neighbor, a node
  current ← MAKE-NODE(INITIAL-STATE[problem])
  loop do
    neighbor ← a highest-valued successor of current
    if VALUE[neighbor] ≤ VALUE[current] then return STATE[current]
    current ← neighbor
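The same loop as a runnable Python sketch; `neighbors` and `value` are assumed problem-specific callables (hypothetical names, not from any library):

def hill_climbing(start, neighbors, value):
    current = start
    while True:
        successors = list(neighbors(current))
        if not successors:
            return current
        best = max(successors, key=value)   # highest-valued successor
        if value(best) <= value(current):   # no uphill move left
            return current                  # local maximum reached
        current = best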

10 Hill-Climbing Search (cont.-1)
Variations
– choose any successor with a higher value than the current state, rather than the best one
– allow sideways moves: accept a neighbor with value[neighbor] = value[current]
Problems
– local maxima: search halts prematurely
– plateaux: search conducts a random walk
– ridges: search oscillates with slow progress
Solution: random-restart hill climbing (sketched below)
– start from randomly generated initial states
– save the best result so far
– finds the optimal solution eventually if enough iterations are allowed
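A sketch of random-restart hill climbing, assuming `random_state()` yields a fresh random configuration and `hill_climbing` is the routine sketched above:

def random_restart_hill_climbing(random_state, neighbors, value, restarts=100):
    best = None
    for _ in range(restarts):
        result = hill_climbing(random_state(), neighbors, value)
        if best is None or value(result) > value(best):
            best = result   # save the best local maximum found so far
    return best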

11 Hill-Climbing Search (cont.-2)
Ridges create a sequence of local maxima that are not directly connected to each other
From each local maximum, all the available actions point downhill

12 Hill-Climbing Search (cont.-3)
8-queens problem
– successor function: all states generated by moving a single queen to another square in the same column (8 × 7 = 56 successors)
– h = number of pairs of queens attacking each other, either directly or indirectly
– h = 17 for the example state pictured on the slide
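A possible Python rendering of this heuristic and successor function (the representation is an assumption: a tuple where state[col] is the row of the queen in column col):

from itertools import combinations

def h(state):
    """Number of pairs of queens attacking each other, directly or indirectly."""
    attacks = 0
    for (c1, r1), (c2, r2) in combinations(enumerate(state), 2):
        if r1 == r2 or abs(r1 - r2) == abs(c1 - c2):  # same row or same diagonal
            attacks += 1
    return attacks

def successors(state):
    """All states reached by moving one queen within its column: 8 * 7 = 56."""
    for col in range(8):
        for row in range(8):
            if row != state[col]:
                yield state[:col] + (row,) + state[col + 1:]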

13 Hill-Climbing Search (cont.-4)
8-queens problem
– a local minimum with h = 1
– every successor has a higher cost
(Figure: an 8 × 8 board, files a-h and ranks 1-8, showing such a position.)

14 The K-Means Algorithm
1. Choose a value for K, the total number of clusters.
2. Randomly choose K points as cluster centers.
3. Assign the remaining instances to their closest cluster center.
4. Calculate a new cluster center for each cluster.
5. Repeat steps 3-4 until the cluster centers do not change.
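A minimal plain-Python sketch of these steps for 2-D points (function names are illustrative; a production version would use numpy or scikit-learn):

import random

def k_means(points, k, iterations=100):
    centers = random.sample(points, k)                 # step 2: random centers
    for _ in range(iterations):
        # Step 3: assign each point to its closest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: dist2(p, centers[i]))
            clusters[nearest].append(p)
        # Step 4: recompute each center as the mean of its cluster.
        new_centers = [mean(c) if c else centers[i]
                       for i, c in enumerate(clusters)]
        if new_centers == centers:                     # step 5: converged
            break
        centers = new_centers
    return centers, clusters

def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def mean(cluster):
    n = len(cluster)
    return (sum(p[0] for p in cluster) / n, sum(p[1] for p in cluster) / n)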

15-18 (Figures: a worked K-Means clustering example; the slide images are not reproduced in the transcript.)

19 General Considerations
– Requires real-valued data.
– We must select K, the number of clusters present in the data, in advance.
– Works best when the clusters in the data are of approximately equal size.
– Attribute significance cannot be determined.
– Lacks explanation capabilities.

20 Simulated Annealing
Idea: escape local maxima by allowing some bad moves but gradually decrease their frequency

function SIMULATED-ANNEALING(problem, schedule) returns a solution state
  inputs: problem, a problem; schedule, a mapping from time to "temperature"
  local variables: current, next, a node; T, a "temperature" controlling the probability of downward steps
  current ← MAKE-NODE(INITIAL-STATE[problem])
  for t ← 1 to ∞ do
    T ← schedule[t]
    if T = 0 then return current
    next ← a randomly selected successor of current
    ΔE ← VALUE[next] - VALUE[current]
    if ΔE > 0 then current ← next
    else current ← next only with probability e^(ΔE/T)
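A runnable Python sketch of the same loop; the exponential cooling schedule and the `neighbors`/`value` callables are my assumptions, not part of the slide:

import math
import random

def simulated_annealing(start, neighbors, value,
                        t0=1.0, cooling=0.995, t_min=1e-4):
    current = start
    t = t0
    while t > t_min:                       # schedule effectively reaches zero
        nxt = random.choice(neighbors(current))
        delta = value(nxt) - value(current)
        # Always accept improvements; accept bad moves with prob e^(dE/T).
        if delta > 0 or random.random() < math.exp(delta / t):
            current = nxt
        t *= cooling                       # gradually lower the temperature
    return current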

21 Simulated Annealing (cont.-1)
A term borrowed from metalworking
– we want metal molecules to find a stable location relative to their neighbors
– heating causes metal molecules to move around and take on undesirable locations
– during cooling, molecules reduce their movement and settle into more stable positions
– annealing is the process of heating metal and letting it cool slowly to lock in the stable locations of the molecules

22 Simulated Annealing (cont.-2)
– Select a random move at each iteration
– Move to the selected node if it is better than the current node
– The probability of moving to a worse node decreases exponentially with the badness of the move, i.e., e^(ΔE/T) with ΔE < 0
– The temperature T changes according to a schedule

23 Property of Simulated Annealing
One can prove: if T decreases slowly enough, then simulated annealing will find a global optimum with probability approaching 1
Widely used in VLSI layout, airline scheduling, etc.

24 Local Beam Search
– Keep track of k states rather than just one
– Begin with k randomly generated states
– At each iteration, all the successors of all k states are generated
– If any one is a goal state, halt; else select the k best successors from the complete list and repeat
– Useful information is passed among the k parallel search threads
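A compact Python sketch under the same assumptions as the earlier ones (`random_state`, `neighbors`, `value`, and `is_goal` are hypothetical problem-specific callables):

import heapq

def local_beam_search(random_state, neighbors, value, is_goal,
                      k=10, iterations=1000):
    states = [random_state() for _ in range(k)]   # k random initial states
    for _ in range(iterations):
        # Generate all successors of all k current states.
        pool = [s for state in states for s in neighbors(state)]
        if not pool:
            break
        for s in pool:
            if is_goal(s):
                return s
        # Keep only the k best successors from the combined pool.
        states = heapq.nlargest(k, pool, key=value)
    return max(states, key=value)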

25 Genetic Algorithms (GA)
– A successor state is generated by combining two parent states
– Start with k randomly generated states (the population)
– A state is represented as a string over a finite alphabet (often a string of 0s and 1s)
– An evaluation function (fitness function) assigns higher values to better states
– Produce the next generation of states by selection, crossover, and mutation

26 Genetic Algorithms (cont.)
– Fitness function: number of non-attacking pairs of queens (min = 0, max = 8 × 7/2 = 28)
– Probability of being selected for reproduction is proportional to fitness, e.g., 24/(24+23+20+11) ≈ 31%, 23/(24+23+20+11) ≈ 29%, etc.
– Pipeline: initial population → fitness function → selection → crossover → mutation
(Figure: 8-digit board states such as 32752411 and 24748552 passing through these stages.)
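A compact Python sketch of this 8-queens GA; the digit-string encoding and fitness-proportionate selection follow the slide, while the population size, mutation rate, and loop bounds are illustrative assumptions:

import random

def fitness(state):
    """Non-attacking pairs of queens; state[i] is the row (1-8) in column i."""
    pairs = 28  # 8 * 7 / 2 total pairs
    for i in range(8):
        for j in range(i + 1, 8):
            same_row = state[i] == state[j]
            diagonal = abs(int(state[i]) - int(state[j])) == j - i
            if same_row or diagonal:
                pairs -= 1
    return pairs

def genetic_algorithm(pop_size=20, generations=1000, mutation_rate=0.1):
    pop = [''.join(random.choice('12345678') for _ in range(8))
           for _ in range(pop_size)]
    for _ in range(generations):
        weights = [fitness(s) for s in pop]
        if max(weights) == 28:                        # a solution was found
            return max(pop, key=fitness)
        nxt = []
        for _ in range(pop_size):
            # Selection: probability proportional to fitness.
            x, y = random.choices(pop, weights=weights, k=2)
            c = random.randrange(1, 8)                # crossover point
            child = x[:c] + y[c:]
            if random.random() < mutation_rate:       # mutate one random digit
                i = random.randrange(8)
                child = child[:i] + random.choice('12345678') + child[i + 1:]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)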

