
1 Local Search Algorithms CMPT 463

2 When: Tuesday, April 5, 3:30 PM
Where: RLC 105
Team based: one, two, or three people per team
Languages: Python, C++, and Java
IDEs: Python IDLE, Visual Studio, Eclipse, NetBeans
Event Schedule:
3:30 – 5:30 pm – Contest
5:30 pm – Award Ceremony
5:30 pm – Pizza Party
Register your team online at http://goo.gl/forms/Ub65Df7pAe or in RLC 203.
Contact Dr. Tina Tian for questions.

3 Outline
Introduction to local search
Hill climbing search
Simulated annealing
Local beam search

4 Local search algorithms
In many optimization problems, the path to the goal is irrelevant; the goal state itself is the solution.
o e.g., n-queens
o e.g., integrated-circuit design
o Job scheduling
o Telecommunication network optimization
o …

5 8-Queens Problem
Put 8 queens on an 8 × 8 board with no two queens attacking each other: no two queens share the same row, column, or diagonal.

6 8-Queens Problem
Incremental formulation: start with an empty board and add one queen at a time.
Complete-state formulation: start with all 8 queens on the board and move them around.

7 Local search algorithms
In many optimization problems, the path to the goal is irrelevant; the goal state itself is the solution.
o e.g., n-queens
We can use local search algorithms: keep a single "current" state and try to improve it.
o Generally move to neighbors
o The paths are not retained

8 Advantages of local search
Use very little memory
Can often find reasonable solutions in large state spaces
Useful for solving pure optimization problems
o Maximize a goodness measure
o Many do not fit the "standard" search model: Darwinian evolution (Goal test? Path cost?)
Local search algorithms can't backtrack

9 Example: n-queens
Move a queen to reduce the number of conflicts.

10 Hill-climbing search (steepest-ascent version)
A simple loop that continually moves in the direction of increasing value – uphill.
Terminates when it reaches a "peak."
Does not look ahead beyond the immediate neighbors and does not maintain a search tree.
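The loop described above can be sketched in Python (one of the contest languages). This is a minimal illustration, not the slides' own code; `neighbors` and `value` are hypothetical problem-specific helpers supplied by the caller:

```python
def hill_climbing(start, neighbors, value):
    """Steepest-ascent hill climbing: repeatedly move to the best
    neighbor, stopping when no neighbor improves on the current state."""
    current = start
    while True:
        best = max(neighbors(current), key=value, default=current)
        if value(best) <= value(current):
            return current  # reached a peak (or a plateau)
        current = best

# Toy usage: maximize -(x - 3)^2 over the integers, neighbors x - 1 and x + 1.
peak = hill_climbing(0, lambda x: [x - 1, x + 1], lambda x: -(x - 3) ** 2)
```

Note that the loop halts as soon as no successor is strictly better, which is exactly why it can get stranded on the local maxima and plateaux discussed in the following slides.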

11 8-queens problem
How many successors can we derive from one state?
Each state has 8 × 7 = 56 successors (each of the 8 queens can move to any of the 7 other squares in its column).
Complete-state formulation vs. incremental formulation

12 8-queens problem
h = number of pairs of queens that are attacking each other, either directly or indirectly (a solution has h = 0).
h = 17 for the above state.
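The heuristic h can be computed directly. A minimal sketch, assuming the common representation where `board[c]` gives the row of the queen in column c (an encoding the slides do not specify):

```python
from itertools import combinations

def attacking_pairs(board):
    """h = number of pairs of queens attacking each other.
    board[c] is the row of the queen in column c, so no two queens
    share a column; only shared rows and diagonals need checking."""
    h = 0
    for c1, c2 in combinations(range(len(board)), 2):
        same_row = board[c1] == board[c2]
        same_diag = abs(board[c1] - board[c2]) == abs(c1 - c2)
        if same_row or same_diag:
            h += 1
    return h
```

A solved board gives h = 0, and a board with all queens in one row gives the maximum of 8 × 7 / 2 = 28 attacking pairs.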

13 Hill-climbing search
"Greedy local search"
o Grabs a good neighbor state without thinking ahead about where to go next
o Makes rapid progress

14 Hill climbing search: 8-queens problem
Only 5 steps from h = 17 to h = 1.

15 What we think hill climbing looks like vs. what hill climbing is usually like.

16 Hill-climbing search Problem: depending on initial state, can get stuck in local maxima.

17 Problems for hill climbing
A local maximum with h = 1

18 Problems for hill climbing
Plateau: a flat area of the state-space landscape

19 Hill climbing search
Starting from a randomly generated 8-queens state, steepest-ascent hill climbing gets stuck 86% of the time. It takes 4 steps on average when it succeeds and 3 when it gets stuck.
The steepest-ascent version halts if the best successor has the same value as the current state.

20 Some solutions
Allow a sideways move, hoping the flat area is a shoulder from which uphill progress is still possible.
o Danger: an infinite loop on a flat local maximum that is not a shoulder

21 Some solutions
Solution: put a limit on the number of consecutive sideways moves.
o E.g., 100 consecutive sideways moves in the 8-queens problem
o Success rate rises from 14% to 94%
o Cost: 21 steps on average for each successful instance, 64 for each failure

22 Some more solutions (variants of hill climbing)
Stochastic hill climbing
o Chooses at random from among the uphill moves
o Converges more slowly, but often finds better solutions
First-choice hill climbing
o Generates successors randomly until one is better than the current state
o A good strategy when a state has many (e.g., thousands of) successors
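The first-choice variant can be sketched as a single step function. This is an illustrative sketch, not the slides' code; `random_neighbor` is a hypothetical helper that samples one random successor, and the `max_tries` cutoff is an assumed practical detail:

```python
import random

def first_choice_step(current, random_neighbor, value, max_tries=1000):
    """First-choice hill climbing step: sample random successors until
    one is better than the current state (or give up after max_tries)."""
    for _ in range(max_tries):
        candidate = random_neighbor(current)
        if value(candidate) > value(current):
            return candidate
    return current  # no improving successor found

# Toy usage: maximizing -x^2 from x = 5, the only improving neighbor is 4.
step = first_choice_step(5, lambda x: x + random.choice([-1, 1]),
                         lambda x: -x * x)
```

The point of the variant is that it never enumerates all successors, which is what makes it attractive when each state has thousands of them.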

23 Some more solutions (variants of hill climbing)
Random-restart hill climbing
o "If at first you don't succeed, try, try again."
o Keep restarting from randomly generated initial states, stopping when a goal is found
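Random restart is just a wrapper around any local search. A minimal sketch, assuming caller-supplied `search`, `random_state`, and `is_goal` functions (none of which are named on the slides):

```python
def random_restart(search, random_state, is_goal, max_restarts=100):
    """Random-restart hill climbing: run the local search from fresh
    random initial states until a goal state is found."""
    for _ in range(max_restarts):
        result = search(random_state())
        if is_goal(result):
            return result
    return None  # no goal found within the restart budget
```

Because each restart draws an independent initial state, the probability of missing a goal shrinks exponentially with the number of restarts, which is why this simple trick makes hill climbing complete in the limit.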

24 Simulated Annealing
A hill-climbing algorithm that never makes "downhill" moves is guaranteed to be incomplete.
Idea: escape local maxima by allowing some "bad" moves.

25 Simulated Annealing
Picks a random move (instead of the best).
If the move improves the situation, it is accepted; otherwise it is accepted with some probability less than 1.
The probability decreases exponentially with the "badness" of the move.
It also decreases as the temperature T goes down.
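These rules can be sketched directly, using the standard acceptance probability e^(ΔE/T) for a move that worsens the value by ΔE. The geometric cooling schedule and its parameters below are illustrative assumptions; the slides do not prescribe a schedule:

```python
import math
import random

def simulated_annealing(start, random_neighbor, value,
                        t0=1.0, cooling=0.995, t_min=1e-3):
    """Pick a random move; accept it outright if it improves the value,
    otherwise accept with probability exp(delta / T), which shrinks both
    as the move gets worse and as the temperature T decreases."""
    current, t = start, t0
    while t > t_min:
        candidate = random_neighbor(current)
        delta = value(candidate) - value(current)
        if delta > 0 or random.random() < math.exp(delta / t):
            current = candidate
        t *= cooling  # geometric cooling (an illustrative choice)
    return current
```

Early on (large T) even quite bad moves are accepted, letting the search escape local maxima; as T falls the behavior approaches pure hill climbing.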

26 Simulated Annealing Simulated annealing was first used extensively to solve VLSI (Very-Large-Scale Integration) layout problems. It has been applied widely to factory scheduling and other large-scale optimization tasks.

27 Local Beam Search
Idea: keep k states instead of 1; choose the top k of all their successors.
Not the same as k searches run in parallel!
Searches that find good states recruit other searches to join them.
o Moves resources to where the most progress is being made
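The "top k of all successors" step is the heart of the algorithm. A minimal sketch, assuming hashable states and caller-supplied `neighbors`, `value`, and `is_goal` helpers (the iteration cap is an added practical detail):

```python
import heapq

def local_beam_search(starts, neighbors, value, is_goal, max_iters=100):
    """Keep the k best states; each round, pool the successors of all
    k states and keep the top k of the pool (k = len(starts))."""
    k = len(starts)
    beam = list(starts)
    for _ in range(max_iters):
        for state in beam:
            if is_goal(state):
                return state
        # Pool successors of every beam state, then keep the best k.
        pool = {n for s in beam for n in neighbors(s)}
        beam = heapq.nlargest(k, pool, key=value)
    return max(beam, key=value)
```

Because the top k are taken from one shared pool, a single promising state can contribute several of the next round's states; that is the "recruiting" behavior, and also the reason the beam can collapse onto one hill.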

28 Local Beam Search
Problem: quite often, all k states end up on the same local hill (concentrated in a small region of the state space).
Idea: choose the k successors at random, with probability proportional to their value (stochastic beam search).

29 Genetic Algorithms (GA) A successor state is generated by combining two parent states

30 Genetic Algorithms
Start with k randomly generated states (the population).
An evaluation function (fitness function) returns higher values for better states.
Produce the next generation of states by selection, crossover, and mutation.

31 Genetic algorithms
Fitness function: number of non-attacking pairs of queens (min = 0, max = 8 × 7/2 = 28).
Probability of being chosen for reproduction is proportional to fitness:
24/(24+23+20+11) = 31%
23/(24+23+20+11) = 29%
etc.
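The fitness function and the selection probabilities above can be reproduced in a few lines. As before, the board encoding (`board[c]` = row of the queen in column c) is an assumption not stated on the slide:

```python
from itertools import combinations

def fitness(board):
    """Number of non-attacking pairs of queens (max = n*(n-1)/2 = 28 for n = 8)."""
    n = len(board)
    attacking = sum(
        1 for c1, c2 in combinations(range(n), 2)
        if board[c1] == board[c2] or abs(board[c1] - board[c2]) == abs(c1 - c2)
    )
    return n * (n - 1) // 2 - attacking

def selection_probabilities(fitnesses):
    """Probability of being chosen for reproduction, proportional to fitness."""
    total = sum(fitnesses)
    return [f / total for f in fitnesses]

# The slide's population: fitnesses 24, 23, 20, 11 give roughly 31%, 29%, ...
probs = selection_probabilities([24, 23, 20, 11])
```

A solved 8-queens board scores the maximum fitness of 28, and the slide's percentages fall out of the proportional-selection rule directly.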

32 Crossover can produce a state that is a long way from either parent state.

