Computer Science CPSC 322 Lecture 6 Analysis of A*, Search Refinements (Ch 3.7.1 - 3.7.4) Slide 1.

Slide 2. Lecture Overview: recap of previous lecture; analysis of A*; Branch-and-Bound; cycle checking and multiple path pruning.

Slide 3. How to Make Search More Informed? Def.: A search heuristic h(n) is an estimate of the cost of the optimal (cheapest) path from node n to a goal node. [Figure: three nodes n1, n2, n3 with estimates h(n1), h(n2), h(n3).] h can be extended to paths: h(⟨n0, …, nk⟩) = h(nk). h(n) should leverage readily obtainable (easy to compute) information about a node.

Slide 4. Best First Search (BestFS): always choose the path on the frontier with the smallest h value; BestFS treats the frontier as a priority queue ordered by h. It can reach the goal quickly if it has a good h, but it is neither complete nor optimal, and it still has worst-case time and space complexity O(b^m).

Slide 5. Learning Goal for Search: apply basic properties of search algorithms: completeness, optimality, time and space complexity.

                                    Complete            Optimal          Time     Space
DFS (uninformed)                    N (Y if no cycles)  N                O(b^m)   O(mb)
BFS (uninformed)                    Y                   Y                O(b^m)   O(b^m)
IDS (uninformed)                    Y                   Y                O(b^m)   O(mb)
LCFS (uninformed, uses arc costs)   Y (costs > 0)       Y (costs >= 0)   O(b^m)   O(b^m)
Best First (informed, uses h)       N                   N                O(b^m)   O(b^m)

Slide 6. A* Search takes into account both the cost of the path to a node, c(p), and the heuristic value of that path, h(p). Let f(p) = c(p) + h(p); f(p) is an estimate of the cost of a path from the start to a goal via p. A* always chooses the path on the frontier with the lowest estimated distance from the start to a goal node constrained to go via that path. Compare A* and LCFS on the Vancouver graph.
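The selection rule above can be sketched in code. This is a minimal Python illustration, not from the lecture: the names `neighbors` (assumed to yield (successor, arc_cost) pairs), `h`, and `is_goal` are hypothetical, and ties on f are broken arbitrarily.

```python
import heapq

def a_star(start, neighbors, h, is_goal):
    """A* sketch: the frontier is a priority queue ordered by f(p) = c(p) + h(p).

    `neighbors(n)` yields (successor, arc_cost) pairs; `h(n)` is the heuristic;
    `is_goal(n)` tests for a goal node. Returns (path, cost) or None.
    """
    frontier = [(h(start), 0, [start])]  # entries: (f-value, path cost, path)
    while frontier:
        f, cost, path = heapq.heappop(frontier)  # lowest f-value first
        node = path[-1]
        if is_goal(node):
            return path, cost
        for succ, arc_cost in neighbors(node):
            new_cost = cost + arc_cost
            heapq.heappush(frontier, (new_cost + h(succ), new_cost, path + [succ]))
    return None
```

Note that without multiple path pruning (covered later in this lecture) the same node may appear on many frontier paths, which is why the space cost matches BFS.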

Slide 7. Computing f-values: what is the f-value of the path ubc → kd → jb? (10)

Slide 8. Optimality of A*: A* is complete (finds a solution, if one exists) and optimal (finds the optimal path to a goal) if: the branching factor is finite; arc costs are > 0; h(n) is admissible.

Slide 9. Admissibility of a heuristic. Def.: Let c(n) denote the cost of the optimal path from node n to any goal node. A search heuristic h(n) is called admissible if h(n) ≤ c(n) for all nodes n, i.e., if for all nodes it is an underestimate of the cost to any goal. Example: is the straight-line distance admissible? YES: the shortest distance between two points is a straight line.

Slide 10. Admissibility of a heuristic (cont.): same definition. Example: the goal is Urziceni (red box), but all we know is the straight-line distance to Bucharest (green box). Possible h(n) = sld(n, Bucharest) + cost(Bucharest, Urziceni). Admissible? NO: the cost of going from Vaslui to Urziceni is shorter than this estimate.

Slide 11. Optimality of A* (repeated): A* is complete (finds a solution, if one exists) and optimal (finds the optimal path to a goal) if: the branching factor is finite; arc costs are > 0; h(n) is admissible.

Slide 12. Proof. [Figure: start S, goal G, optimal path p* with subpath p.] Let p be a subpath of an optimal path p*; let c(·) be the actual cost of a path (> 0), f(·) the f-value of a path, and h(·) the heuristic value of a path (≥ 0). We can show that if h(·) is admissible, then f(p) ≤ f(p*) for every subpath p of an optimal path p*. 1. f(p*) = c(p*) + h(p*) = c(p*), because h is admissible and so cannot be > 0 at the goal: h(p*) = 0. 2. f(p) = c(p) + h(p) ≤ c(p*) = f(p*), because h(p) does not overestimate the cost of getting from p to the goal. Thus f(p) ≤ f(p*).
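The two steps of the proof can be restated compactly. This is a LaTeX restatement, not from the slides; cost(p, goal) denotes the remaining cost from the end of p to the goal along p*:

```latex
\begin{align*}
f(p^*) &= c(p^*) + h(p^*) = c(p^*)
  && \text{since } h = 0 \text{ at a goal node,}\\
f(p)   &= c(p) + h(p) \le c(p) + \mathrm{cost}(p, \mathit{goal}) = c(p^*) = f(p^*)
  && \text{by admissibility of } h.
\end{align*}
```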

Slide 13. Lecture Overview: recap of previous lecture; analysis of A*; Branch-and-Bound; cycle checking and multiple path pruning.

Slide 14. Why A* is optimal. Let p* be the optimal solution path, with cost c*. Let p' be a suboptimal solution path, that is, c(p') > c*. We are going to show that any subpath p" of p* on the frontier will be expanded before p'; therefore, A* will find p* before p'.

Slide 15. Why A* is optimal. Let p* be the optimal solution path, with cost c*. Let p' be a suboptimal solution path, that is, c(p') > c*. Let p" be a subpath of p* on the frontier. We know that: A. f* < f(p')  B. f* ≤ f(p')  C. f* > f(p')  D. None of the above.

Slide 16. Why A* is optimal (cont.). We know that f(p') = c(p') > c* = f(p*), because f(goal) = c(goal) at a goal node; and f(p") ≤ f(p*), because h is admissible (see previous proof). Thus f* < f(p') and f(p") < f(p'): any subpath of the optimal solution path will be expanded before p'.

Slide 17. Run A* on this example to see how A* starts off going down the suboptimal path (through N5) but then recovers and never expands it, because there are always subpaths of the optimal path through N2 on the frontier.

Slide 18. Why is A* complete? It does not get caught in cycles. Let f* be the cost of the (an) optimal solution path p* (unknown, but finite if a solution exists). Each subpath p of p* will be expanded before p* (see previous proof). With positive (> ε) arc costs, the cost of any other path p on the frontier eventually exceeds f*; this happens at a depth no greater than f*/c_min, where c_min is the minimal arc cost in the search graph. See how it works on the "misleading heuristic" problem in AIspace.

Slide 19. Why is A* complete (cont.): A* does not get caught in the cycle because the f values of subpaths in the cycle eventually exceed f* (e.g., after N6 → N7 → N8).

Slide 20. Analysis of A*. If the heuristic is completely uninformative and the edge costs are all the same, A* is equivalent to: A. BFS  B. LCFS  C. DFS  D. None of the above.

Slide 21. Analysis of A* (answer): if the heuristic is completely uninformative and the edge costs are all the same, A* is equivalent to BFS (A).

Slide 22. Time and Space Complexity of A*. Time complexity is O(b^m): the heuristic could be completely uninformative and the edge costs could all be the same, in which case A* does the same thing as BFS. Space complexity is O(b^m): like BFS, A* maintains a frontier which grows with the size of the tree.

Slide 23. Effect of Search Heuristic. A search heuristic that is a better approximation of the actual cost reduces the number of nodes expanded by A*. Example, 8-puzzle: (1) if tiles could move anywhere, h1 = number of tiles that are out of place; (2) if tiles can move only to an adjacent square, h2 = sum of the number of squares that separate each tile from its correct position. Average number of paths expanded (d = depth of the solution): d = 12: IDS = 3,644,035 paths, A*(h1) = 227 paths, A*(h2) = 73 paths; d = 24: IDS = too many paths, A*(h1) = 39,135 paths, A*(h2) = 1,641 paths. h2 dominates h1 because h2(n) ≥ h1(n) for every n. More on this in assignment 1.
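The two 8-puzzle heuristics can be sketched as follows. This is a hedged Python illustration; the tuple encoding of boards (length-9 tuples read row by row, with 0 as the blank) is an assumption, not the lecture's notation.

```python
def h1(state, goal):
    """Number of misplaced tiles (the blank, 0, is not counted)."""
    return sum(1 for s, g in zip(state, goal) if s != 0 and s != g)

def h2(state, goal):
    """Sum of Manhattan distances of each tile from its goal square.

    States are length-9 tuples read row by row on a 3x3 board; 0 is the blank.
    """
    # Map each tile to its (row, col) position in the current state.
    pos = {tile: divmod(i, 3) for i, tile in enumerate(state)}
    total = 0
    for i, tile in enumerate(goal):
        if tile != 0:
            gr, gc = divmod(i, 3)      # where the tile should be
            sr, sc = pos[tile]         # where the tile is
            total += abs(gr - sr) + abs(gc - sc)
    return total
```

Both are admissible (each move changes at most one tile's distance by one), and h2(n) ≥ h1(n) for every state, which is the domination relation mentioned above.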

Slide 24. Learning Goal for Search: apply basic properties of search algorithms: completeness, optimality, time and space complexity.

                                    Complete            Optimal          Time     Space
DFS (uninformed)                    N (Y if no cycles)  N                O(b^m)   O(mb)
BFS (uninformed)                    Y                   Y                O(b^m)   O(b^m)
IDS (uninformed)                    Y                   Y                O(b^m)   O(mb)
LCFS (uninformed, uses arc costs)   Y (costs > 0)       Y (costs >= 0)   O(b^m)   O(b^m)
Best First (informed, uses h)       N                   N                O(b^m)   O(b^m)
A* (arc costs > 0, h admissible)    ?                   ?                ?        ?

Slide 25. Learning Goal for Search: apply basic properties of search algorithms: completeness, optimality, time and space complexity.

                                    Complete            Optimal          Time     Space
DFS (uninformed)                    N (Y if no cycles)  N                O(b^m)   O(mb)
BFS (uninformed)                    Y                   Y                O(b^m)   O(b^m)
IDS (uninformed)                    Y                   Y                O(b^m)   O(mb)
LCFS (uninformed, uses arc costs)   Y (costs > 0)       Y (costs >= 0)   O(b^m)   O(b^m)
Best First (informed, uses h)       N                   N                O(b^m)   O(b^m)
A* (arc costs > 0, h admissible)    Y                   Y                O(b^m)   O(b^m)

Slide 26. Lecture Overview: recap of previous lecture; analysis of A*; Branch-and-Bound; cycle checking and multiple path pruning.

Slide 27. Branch-and-Bound Search. What allows A* to do better than the other search algorithms we have seen? Arc cost and h(n) combined in f(n). What is the biggest problem with A*? Space. Possible solution: combine DFS with f.

Slide 28. Branch-and-Bound Search: one way to combine DFS with heuristic guidance. It follows exactly the same search path as depth-first search, but to ensure optimality it does not stop at the first solution found: it continues after recording an upper bound on the solution cost. Upper bound: UB = cost of the best solution found so far, initialized to ∞ or any overestimate of the optimal solution cost. When a path p is selected for expansion, compute the lower bound LB(p) = f(p) = cost(p) + h(p). If LB(p) ≥ UB, remove p from the frontier without expanding it (pruning); else expand p, adding all of its neighbors to the frontier.
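The steps above can be sketched as a recursive DFS. This is a minimal Python illustration, not the textbook's pseudocode; `neighbors(n)` yielding (successor, arc_cost) pairs, `h`, and `is_goal` are assumed interfaces.

```python
import math

def branch_and_bound(start, neighbors, h, is_goal, ub=math.inf):
    """Depth-first branch and bound sketch.

    Prunes any path p with f(p) = cost(p) + h(p) >= UB, where UB is the cost
    of the best solution found so far. Returns (best_path, best_cost).
    """
    best = None

    def dfs(path, cost):
        nonlocal best, ub
        node = path[-1]
        if cost + h(node) >= ub:       # lower bound meets upper bound: prune
            return
        if is_goal(node):              # strictly better solution: record it
            ub, best = cost, list(path)
            return
        for succ, arc_cost in neighbors(node):
            path.append(succ)
            dfs(path, cost + arc_cost)
            path.pop()                 # backtrack: DFS keeps one path in memory

    dfs([start], 0)
    return best, ub
```

Because only the current path (plus its siblings) is stored, the space cost is DFS-like, which is the whole point of the method.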

Slide 29. Example: arc cost = 1; h(n) = 0 for every n (just to simplify the example); UB = ∞. Before expanding a path p, check its f-value f(p); expand only if f(p) < UB. Solution found! UB = ?

Slide 30. Example (cont.): arc cost = 1; h(n) = 0 for every n; UB = 5. Before expanding a path p, check its f-value; expand only if f(p) < UB. f = 5: prune!

Slide 31. Example (cont.): UB = 5. f = 5: prune! f = 5: prune! Another solution found! UB = ?

Slide 32. Example (cont.): UB = 3. f = 3: prune!

Slide 33. Example (cont.): UB = 3. f = 3: prune! f = 3: prune!

Slide 34. Branch-and-Bound Analysis: Complete? Optimal? Space complexity? Time complexity?

Slide 35. Is Branch-and-Bound optimal? A. YES, with no further conditions. B. NO. C. Only if h(n) is admissible. D. Only if there are no cycles.

Slide 36. Is Branch-and-Bound optimal? Answer: C, only if h(n) is admissible. Otherwise, when checking LB(p) ≥ UB, if the answer is yes but h(p) is an overestimate of the actual cost of p, we may remove a possibly optimal solution.

Slide 37. Branch-and-Bound Analysis: Complete? (even when there are cycles) A. YES  B. NO  C. It depends.

Slide 38. Branch-and-Bound Analysis: Complete? (even when there are cycles) IT DEPENDS on whether we can initialize UB to a finite value, i.e., whether we have a reliable overestimate of the solution cost. If we don't, we need to use ∞, and B&B can be caught in a cycle.

Slide 39. Branch-and-Bound Analysis. Complete? Same as DFS: can't handle cycles/infinite graphs, but complete if initialized with some finite UB. Optimal? YES, if h(n) is admissible. Space complexity: Branch & Bound has the same space complexity as DFS; this is a big improvement over A*! Time complexity: O(b^m).

Slide 40. Search Methods so Far

                                      Complete                       Optimal            Time                           Space
DFS (uninformed)                      N                              N                  O(b^m)                         O(mb)
BFS (uninformed)                      Y                              Y                  O(b^m)                         O(b^m)
IDS (uninformed)                      Y                              Y                  O(b^m)                         O(mb)
LCFS (uninformed, uses arc costs)     Y (costs > 0)                  Y (costs >= 0)     O(b^m)                         O(b^m)
Best First (informed, uses h)         N                              N                  O(b^m)                         O(b^m)
A* (arc costs > 0, h admissible)      Y                              Y                  O(b^m), optimally efficient    O(b^m)
Branch-and-Bound                      N (Y with finite initial bound) Y (h admissible)  O(b^m)                         O(mb)

Slide 41. Lecture Overview: recap of previous lecture; analysis of A*; Branch-and-Bound; cycle checking and multiple path pruning.

Slide 42. Clarification: state space graph vs. search tree. [Graph figure with nodes k, c, b, z, h, a, d, f.] The state space graph represents the states in a search problem and how they are connected by the available operators. The search tree shows how the search space is traversed by a given search algorithm: it explicitly "unfolds" the paths that are expanded. If there are no cycles or multiple paths, the two look the same.

Slide 43. Clarification: state space graph vs. search tree (cont.). [Same graph figure.] The state space graph represents the states in a given search problem and how they are connected by the available operators; the search tree shows how the search space is traversed by a given search algorithm, explicitly "unfolding" the paths that are expanded. If there are no cycles or multiple paths, the two look the same; we can use numbers on the nodes to show the actual progression.

Slide 44. Clarification: state space graph vs. search tree (cont.). If there are cycles or multiple paths, the two look very different. [State space graph with nodes k, c, b, z, h, a, d, f; search tree shown up to depth 3, with nodes repeated along different paths.]

Slide 45. Size of state space vs. search tree. If there are cycles or multiple paths, the two look very different. E.g., a state space with d + 1 states and 2 actions from each state to the next: the search tree has depth d and 2^d possible paths through the search space, i.e., an exponentially larger search tree! With cycles or multiple parents, the search tree can be exponential in the size of the state space.

Slide 46. Cycle Checking and Multiple Path Pruning. Cycle checking: good when we want to avoid infinite loops but also want to find more than one solution, if they exist. Multiple path pruning: good when we only care about finding one solution; it subsumes cycle checking.

Slide 47. Cycle Checking. You can prune a path that ends in a node already on the path; this pruning cannot remove an optimal solution (the cycle check). Good when we want to avoid infinite loops but also want to find more than one solution, if they exist. What is the computational cost of cycle checking?

Slide 48. Cycle Checking. See how DFS and BFS behave on the Cyclic Graph Example in AIspace, when Search Options → Pruning → Loop detection is selected. Set N1 to be a normal node so that there is only one start node. Check, for each algorithm, what happens during the first expansion from node N3 to node N2.

Slide 49. Depth First Search: since DFS looks at one path at a time, when a node is encountered for the second time (e.g., node N2 while expanding N0 → N2 → N5 → N3), it is guaranteed to be part of a cycle.

Slide 50. Breadth First Search: since BFS keeps multiple subpaths going, when a node is encountered for the second time it could be as part of expanding a different path (e.g., node N2 while expanding N0 → N3), not necessarily a cycle.

Slide 51. Breadth First Search: the cycle for BFS happens when N2 is encountered for the second time while expanding the path N0 → N2 → N5 → N3.

Slide 52. Computational cost of cycle checking? A. As low as constant time (not dependent on path length): e.g., set a bit to 1 when a node is selected for expansion, and never expand a node with a bit set to 1. B. Linear time in the path length: before adding a new node to the currently selected path, check that the node is not already part of the path. C. It depends on the algorithm. D. None of the above.

Slide 53. Computational cost of cycle checking? Answer: C, it depends on the algorithm.

Slide 54. Cycle Checking: what is the computational cost? You can prune a path that ends in a node already on the path; this pruning cannot remove an optimal solution. Using depth-first methods, with the graph explicitly stored, this can be done in as low as constant time (i.e., not dependent on path length), since only one path is being explored at a time. For other methods, the cost is linear in the path length: we must check each node in the current path, since a node could have been expanded as part of a different path on the frontier.
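The linear-time variant amounts to one membership test per successor. A minimal Python sketch, with the same assumed interface as before: paths are lists of nodes, and `neighbors(n)` yields (successor, arc_cost) pairs.

```python
def expand_without_cycles(path, neighbors):
    """Linear-time cycle check: extend `path` only with successors that
    do not already appear on it (the `succ not in path` test scans the path)."""
    node = path[-1]
    return [path + [succ] for succ, _ in neighbors(node) if succ not in path]
```

For DFS-type algorithms the same effect can be had in constant time by marking nodes on the current path with a bit, since only one path is live at a time.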

Slide 55. Multiple Path Pruning. If we only want one path to the solution, we can prune any path to a node n that has already been reached via a previous path. This subsumes the cycle check.
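Multiple path pruning can be sketched on top of BFS, assuming the same hypothetical `neighbors`/`is_goal` interface; the `reached` set ensures at most one path per node ever enters the frontier.

```python
from collections import deque

def bfs_multiple_path_pruning(start, neighbors, is_goal):
    """BFS sketch with multiple path pruning: remember every node already
    reached and never add a second path to it (subsumes cycle checking)."""
    frontier = deque([[start]])
    reached = {start}            # nodes to which some path has been found
    while frontier:
        path = frontier.popleft()
        if is_goal(path[-1]):
            return path
        for succ, _ in neighbors(path[-1]):
            if succ not in reached:      # prune any second path to succ
                reached.add(succ)
                frontier.append(path + [succ])
    return None
```

As the next slide discusses, keeping only the first path to each node is safe for finding *a* solution, but it can discard a cheaper later path unless the algorithm guarantees the first path found to any node is the best.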

Slide 56. You can see how it works by running BFS on the Cyclic Graph Example in AIspace: see how it handles the multiple paths from N0 to N2. You can erase start node N1 to simplify things.

Slide 57. Example with BFS: handling the multiple paths from N0 to N2.

Slide 58. Multiple-Path Pruning & Optimal Solutions. Problem: what if a subsequent path p2 to a node n is better (shorter or less costly) than the first path p1 to n, and we want an optimal solution? Options: remove all paths from the frontier that use the longer path (these can't be optimal); change the initial segment of the paths on the frontier to use the shorter path; or prove that this problem can't happen for a given algorithm, i.e., that the first path to any node is the best.

Slide 59. Can we prove that search algorithm X always finds the optimal path to any node n in the search space first? "Whenever search algorithm X expands the first path p ending in node n, this is the lowest-cost path from the start node to n (if all costs ≥ 0)." This is true for: A. Lowest-Cost-First Search. B. A*. C. Both of the above. D. None of the above.

Slide 62. Only LCFS, which by construction always expands the path with the lowest cost. Below is the counter-example for A*: it expands the upper path to n first, so if we prune the second path at the bottom, we miss the optimal solution. Special conditions on the heuristic can recover the guarantee of LCFS for A*: the monotone restriction (see P&M text, Section 3.7.2).

Slide 64. Summary: pruning. Sometimes we don't want multiple path pruning, just cycle checking, because we want multiple solutions (including non-optimal ones). The search tree can be exponentially larger than the search space, so pruning is often important. In DFS-type search algorithms we can do cheap cycle checks: constant time (i.e., not dependent on path length). BFS-type search algorithms are memory-heavy already, so we can store the path to each expanded node and do multiple path pruning.

Slide 65. Learning Goals for Search (up to today): apply basic properties of search algorithms: completeness, optimality, time and space complexity; select the most appropriate search algorithm for specific problems (Depth-First Search vs. Breadth-First Search vs. Iterative Deepening vs. Lowest-Cost-First Search vs. Best First Search, A*, Branch and Bound); define/read/write/trace/debug different search algorithms; construct heuristic functions and discuss their admissibility for specific search problems; prove the optimality of A*; prune cycles and multiple paths.

Slide 66. To Do for Next Class: read Chp (Intro to Constraint Satisfaction Problems); keep working on assignment 1; do Practice Exercise 3E.