
Local Search

Systematic versus local search
- Systematic search
  - Breadth-first, depth-first, IDDFS, A*, IDA*, etc.
  - Keeps one or more paths in memory and records which alternatives have been explored
  - The path from the initial state to the goal state is a solution
- Local search
  - Does not maintain history information
  - Does not need the path to the solution

Example on the next slide.

Neighbor = move any queen to another position in the same column. For 4-queens, how many neighbors does a state have? (Each of the 4 queens can move to 3 other positions in its column, so 12.)

What is needed:
- A neighborhood function
- A "goodness" function, which must assign a value to non-solution configurations too
  - For 8-queens: the number of pair-wise conflicts
  - Maximize value = minimize cost
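As a concrete illustration of such a goodness function, here is a minimal sketch of the pair-wise conflict count for n-queens. It assumes (as a representation choice not fixed by the slides) that a state is a tuple in which `state[i]` is the row of the queen in column `i`, matching the neighborhood above where each queen moves within its own column.

```python
def conflicts(state):
    """Cost to minimize: number of pair-wise attacking queen pairs
    (two queens attack if they share a row or a diagonal)."""
    n = len(state)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            same_row = state[i] == state[j]
            same_diag = abs(state[i] - state[j]) == j - i
            if same_row or same_diag:
                count += 1
    return count

print(conflicts((0, 1, 2, 3)))  # all queens on one diagonal -> 6 conflicts
print(conflicts((1, 3, 0, 2)))  # a 4-queens solution -> 0 conflicts
```

A solution is exactly a state with cost 0, so "maximize goodness" and "minimize conflicts" coincide.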

Hill-climbing and systematic search
- Hill-climbing has a lot of freedom in deciding which node to expand next, but it is incomplete even for finite search spaces.
  - It may re-visit the same state multiple times.
  - Good for problems that have solutions.
- Systematic search is complete (its search tree keeps track of the parts of the space that have been visited).
  - Good for problems where solutions may not exist, or where the whole point is to show that there are no solutions,
  - or where the state space is densely connected (making repeated exploration of states a big issue).

When the state-space landscape has local minima, any search that moves only in the greedy direction cannot be complete.

Making hill-climbing asymptotically complete
- Random-restart hill-climbing
  - Keep some bound B. After making more than B moves, reset the search with a new random initial seed and start again.
- "Biased random walk": avoid being greedy when choosing the seed for the next iteration
  - With probability p, choose the best child; with probability (1 - p), choose one of the children randomly.
- Simulated annealing
  - Similar to the previous idea, except that the probability p itself is increased asymptotically to one (so a non-greedy move is more likely to be tolerated in the beginning than towards the end).
- With the random-restart or biased-random-walk strategies, we can solve very large problems (e.g., million-queen problems) in minutes!
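The random-restart idea above can be sketched as follows. This is an illustrative implementation, not the slides' exact formulation: it restarts whenever greedy descent gets stuck at a local minimum (rather than after a fixed move bound B), and it reuses the pair-wise conflict cost for n-queens with a column-to-row tuple representation.

```python
import random

def conflicts(state):
    """Pair-wise attacking queen pairs (same row or same diagonal)."""
    n = len(state)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if state[i] == state[j] or abs(state[i] - state[j]) == j - i)

def hill_climb(state):
    """Greedy descent: move to the best neighbor until no neighbor improves."""
    n = len(state)
    while True:
        best, best_cost = state, conflicts(state)
        for col in range(n):
            for row in range(n):
                if row != state[col]:
                    neighbor = state[:col] + (row,) + state[col + 1:]
                    c = conflicts(neighbor)
                    if c < best_cost:
                        best, best_cost = neighbor, c
        if best == state:          # local minimum: greedy search is stuck
            return state
        state = best

def random_restart(n, max_restarts=100):
    """Restart greedy descent from a fresh random seed whenever it gets stuck."""
    for _ in range(max_restarts):
        state = hill_climb(tuple(random.randrange(n) for _ in range(n)))
        if conflicts(state) == 0:
            return state
    return None

print(random_restart(8))  # some 8-queens solution, e.g. a conflict-free tuple
```

Each individual climb succeeds only some fraction of the time, but the probability that every restart fails decays exponentially with the number of restarts, which is what makes the strategy asymptotically complete.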

Simulated annealing
- For p = 1 - ε to 1:
  - With probability p, choose the best child; with probability (1 - p), choose one of the children randomly.
- A non-greedy move is more likely to be tolerated in the beginning than towards the end.
- Minimize cost.
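The slide gives a simplified probability-schedule view; the classic temperature-based formulation makes the same point and can be sketched as below. All parameter values (initial temperature, cooling rate, step budget) are illustrative assumptions, and the n-queens conflict cost with a column-to-row tuple representation is reused as the cost to minimize.

```python
import math
import random

def conflicts(state):
    """Pair-wise attacking queen pairs (same row or same diagonal)."""
    n = len(state)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if state[i] == state[j] or abs(state[i] - state[j]) == j - i)

def simulated_annealing(n, t0=2.0, cooling=0.999, steps=50000):
    """Random-walk descent that sometimes accepts worse moves.

    An improving (or equal) move is always taken; a worsening move is
    taken with probability exp(-delta / T).  As T cools toward zero the
    search tolerates non-greedy moves less and less, mirroring the
    slide's schedule of p rising toward 1."""
    state = tuple(random.randrange(n) for _ in range(n))
    cost = conflicts(state)
    t = t0
    for _ in range(steps):
        if cost == 0:
            return state
        col, row = random.randrange(n), random.randrange(n)
        neighbor = state[:col] + (row,) + state[col + 1:]
        delta = conflicts(neighbor) - cost
        if delta <= 0 or random.random() < math.exp(-delta / t):
            state, cost = neighbor, cost + delta
        t *= cooling
    return state  # best effort if the step budget runs out
```

With a sufficiently slow cooling schedule, the probability of accepting uphill moves early lets the search escape the local minima that defeat pure greedy descent.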

Local beam search
- Keep track of k states rather than just one.
- Start with k randomly generated states.
- At each iteration, generate all the successors of all k states.
- If any one is a goal state, stop; else select the k best successors from the complete list and repeat.
- Is local beam search just running k random restarts in parallel? No: useful information is passed among the parallel search threads.
- With biased random selection of survivors, this becomes stochastic beam search.
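The four steps above can be sketched directly. This is an illustrative n-queens version (column-to-row tuple states, pair-wise conflict cost, beam width k chosen arbitrarily); note how the pooled successor list is what lets a promising state "pull" the whole beam toward its region, unlike k independent restarts.

```python
import random

def conflicts(state):
    """Pair-wise attacking queen pairs (same row or same diagonal)."""
    n = len(state)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if state[i] == state[j] or abs(state[i] - state[j]) == j - i)

def local_beam_search(n, k=10, iters=200):
    """Keep the k best states; each round, pool ALL successors of ALL k
    states and keep the k best of the pooled list."""
    beam = [tuple(random.randrange(n) for _ in range(n)) for _ in range(k)]
    for _ in range(iters):
        for s in beam:
            if conflicts(s) == 0:   # goal test: a conflict-free placement
                return s
        pool = set()                # pooled successors of the whole beam
        for s in beam:
            for col in range(n):
                for row in range(n):
                    if row != s[col]:
                        pool.add(s[:col] + (row,) + s[col + 1:])
        beam = sorted(pool, key=conflicts)[:k]
    return min(beam, key=conflicts)  # best state found if no goal reached
```

Replacing the deterministic `sorted(...)[:k]` selection with a fitness-biased random sample of the pool gives the stochastic beam search mentioned above.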

Genetic algorithm
- A variant of stochastic beam search in which a successor state is generated by combining two parent states rather than by modifying a single state.
- Motivated by evolutionary biology, i.e., sexual reproduction.
- Start with k randomly generated states (the population).
- A state is represented as a string over a finite alphabet (often a string of 0s and 1s).
- An evaluation function (fitness function) assigns higher values to better states.
- Produce the next generation of states by selection, crossover, and mutation.

Fitness function: number of non-attacking pairs of queens.
Normalized fitness function.
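A minimal sketch of the selection/crossover/mutation loop for 8-queens, using the fitness above (number of non-attacking pairs, maximum 28). The population size, mutation rate, generation budget, and the +1 smoothing of the selection weights are all illustrative assumptions, not values from the slides.

```python
import random

N = 8
MAX_PAIRS = N * (N - 1) // 2   # 28 non-attacking pairs == a solution

def fitness(state):
    """Number of non-attacking pairs of queens (higher is better)."""
    attacking = sum(1 for i in range(N) for j in range(i + 1, N)
                    if state[i] == state[j]
                    or abs(state[i] - state[j]) == j - i)
    return MAX_PAIRS - attacking

def crossover(a, b):
    """Splice two parents at a random cut point."""
    cut = random.randrange(1, N)
    return a[:cut] + b[cut:]

def mutate(state, rate=0.1):
    """With small probability, move one queen to a random row."""
    if random.random() < rate:
        col = random.randrange(N)
        state = state[:col] + (random.randrange(N),) + state[col + 1:]
    return state

def genetic_algorithm(pop_size=100, generations=200):
    pop = [tuple(random.randrange(N) for _ in range(N))
           for _ in range(pop_size)]
    for _ in range(generations):
        best = max(pop, key=fitness)
        if fitness(best) == MAX_PAIRS:
            return best
        # Fitness-proportional selection; +1 avoids an all-zero-weight
        # population (an edge case random.choices cannot handle).
        weights = [fitness(s) + 1 for s in pop]
        pop = [mutate(crossover(*random.choices(pop, weights, k=2)))
               for _ in range(pop_size)]
    return max(pop, key=fitness)   # best individual found
```

Normalizing the fitness values into a probability distribution, as the slide's "normalized fitness function" suggests, is exactly what the weighted `random.choices` selection does here.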

Summary
- Local search avoids the memory problem of systematic search.
- It can be improved by introducing randomness:
  - Random restarts
  - Biased random walk
  - Simulated annealing
  - (Stochastic) beam search
  - Genetic algorithms