
1 Artificial Intelligence 15-381: Heuristic Search Methods
Jaime Carbonell, jgc@cs.cmu.edu
30 August 2001
Today's Agenda:
Search complexity analysis
Heuristics and evaluation functions
Heuristic search methods
Admissibility and A* search
B* search (time permitting)
Macrooperators in search (time permitting)

2 Complexity of Search: Definitions
Let depth d = length(min(s-Path(S_0, S_G))) - 1
Let branching factor b = Ave(|Succ(S_i)|)
Let backward branching factor B = Ave(|Succ^-1(S_i)|); usually b = B, but not always
Let C(search-method, b, d) = max number of states S_i visited
C(search-method, b, d) = worst-case time complexity
C(search-method, b, d) >= worst-case space complexity

3 Complexity of Search: Breadth-First Search
C(BFS, b, d) = Σ_{i=0..d} b^i = O(b^d)
C(BBFS, b, d) = Σ_{i=0..d} B^i = O(B^d)   [backward BFS]
C(BiBFS, b, d) = 2 Σ_{i=0..d/2} b^i = O(b^(d/2)), if b = B   [bidirectional BFS]
Suppose we have k evenly-spaced islands on s-Path(S_0, S_G); then:
C(IBFS, b, d) = (k+1) Σ_{i=0..d/(k+1)} b^i = O(b^(d/(k+1)))   [island BFS]
C(BiIBFS, b, d) = 2(k+1) Σ_{i=0..d/(2k+2)} b^i = O(b^(d/(2k+2)))   [bidirectional island BFS]
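
To get a feel for what these exponents mean in practice, here is a minimal Python sketch that evaluates the sums above; the values b = B = 10, d = 6, k = 1 are assumed for illustration and are not from the slides.

def geometric_sum(b, top):
    """Sum of b^i for i = 0..top, i.e. the node counts in the formulas above."""
    return sum(b ** i for i in range(int(top) + 1))

b, d, k = 10, 6, 1
print("BFS:              ", geometric_sum(b, d))                      # O(b^d)      -> 1,111,111
print("Bidirectional BFS:", 2 * geometric_sum(b, d / 2))              # O(b^(d/2))  -> 2,222
print("Island BFS (k=1): ", (k + 1) * geometric_sum(b, d / (k + 1)))  # O(b^(d/(k+1)))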

4 Heuristics in AI Search: Definition
A heuristic is an operationally effective nugget of information on how to direct search in a problem space. Heuristics are only approximately correct.

5 Common Types of Heuristics
"If-then" rules for state-transition selection
Macro-operator formation [discussed later]
Problem decomposition [e.g. hypothesizing islands on the search path]
Estimation of the distance between S_curr and S_G (e.g. Manhattan, Euclidean, or topological distance)
Value function on each Succ(S_curr): cost(path(S_0, S_curr)) + E[cost(path(S_curr, S_G))]
Utility: value(S) - cost(S)
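
As one concrete instance of a distance-estimation heuristic, here is a minimal Python sketch of a Manhattan-distance estimate for a sliding-tile puzzle; the dictionary representation and the example positions are assumed for illustration, not taken from the slides.

def manhattan_distance(state, goal):
    """Estimate remaining cost as the sum of |row| + |col| displacements of
    every tile; `state` and `goal` map each tile to its (row, col) position."""
    return sum(abs(r - goal[t][0]) + abs(c - goal[t][1])
               for t, (r, c) in state.items() if t != 0)   # skip the blank tile

# Tile 1 belongs at (0, 0) but sits at (2, 1), so it contributes 2 + 1 = 3.
goal  = {1: (0, 0), 2: (0, 1), 0: (2, 2)}
state = {1: (2, 1), 2: (0, 1), 0: (0, 0)}
print(manhattan_distance(state, goal))   # 3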

6 Heuristic Search
Value function: E(o-Path(S_0, S_curr), S_curr, S_G)
Since S_0 and S_G are constant, we abbreviate it as E(S_curr)
General form:
1. Quit if done (with success or failure), else:
2. s-Queue := F(Succ(S_curr), s-Queue)
3. S_next := Argmax[E(s-Queue)]
4. Go to 1, with S_curr := S_next
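
A minimal Python sketch of this general loop; the names successors, is_goal, and the queue-update function F are placeholders rather than anything defined on the slides. Each concrete method on the following slides differs only in how F builds the new s-Queue.

def heuristic_search(s_0, is_goal, successors, E, F):
    """Generic loop from the slide: F builds the new s-Queue, E scores
    states, and the best-scoring state is expanded next."""
    s_curr, s_queue = s_0, []
    while True:
        if is_goal(s_curr):
            return s_curr                          # step 1: quit with success
        s_queue = F(successors(s_curr), s_queue)   # step 2: update the queue
        if not s_queue:
            return None                            # step 1: quit with failure
        s_queue.sort(key=E, reverse=True)          # step 3: Argmax over E
        s_curr = s_queue.pop(0)                    # step 4: loop with the best state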

7 Heuristic Search: Steepest-Ascent Hill-Climbing
F(Succ(S_curr), s-Queue) = Succ(S_curr)
No history is stored in s-Queue, hence:
Space complexity = max(b) [= O(1) if b is bounded]
Quintessential greedy search
Max-Gradient Search
"Informed" depth-first search
S_next := Argmax[E(Succ(S_curr))]
But if Succ(S_next) is null, then backtrack
Alternative: backtrack if E(S_next) < E(S_curr)
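
A minimal sketch of steepest-ascent hill-climbing under these definitions; stopping when no successor improves on E is an assumed local-maximum rule, since the slide only specifies the queue-update function.

def hill_climb(s_0, successors, E):
    """Steepest-ascent hill-climbing: keep only the current state's
    successors, always move to the best one, stop at a local maximum."""
    s_curr = s_0
    while True:
        succ = successors(s_curr)
        if not succ:
            return s_curr                  # dead end
        s_next = max(succ, key=E)          # Argmax over Succ(S_curr) only
        if E(s_next) <= E(s_curr):
            return s_curr                  # local maximum: no improvement
        s_curr = s_next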

8 Beyond Greedy Search: Best-First Search
BestFS(S_curr, S_G, s-Queue):
IF S_curr = S_G, return SUCCESS
For each s_i in Succ(S_curr): Insertion-sort(s_i, s-Queue), ordered by E(s_i)
IF s-Queue = Null, return FAILURE
ELSE return BestFS(FIRST(s-Queue), S_G, TAIL(s-Queue))
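
A minimal Python rendering of this recursion as an iterative loop; it uses heapq in place of insertion sort, the names successors and E are placeholders, and duplicate-state checking is omitted just as it is on the slide.

import heapq
from itertools import count

def best_fs(s_0, is_goal, successors, E):
    """Best-first search: always expand the globally best queued state,
    ranked by the evaluation function E (Argmax, as on slide 6)."""
    tie = count()                              # tie-breaker so heapq never compares states
    frontier = [(-E(s_0), next(tie), s_0)]     # heapq is a min-heap, so negate E
    while frontier:                            # empty queue -> FAILURE
        _, _, s_curr = heapq.heappop(frontier)      # FIRST(s-Queue)
        if is_goal(s_curr):
            return s_curr                      # SUCCESS
        for s_i in successors(s_curr):
            heapq.heappush(frontier, (-E(s_i), next(tie), s_i))   # insertion-sort step
    return None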

9 Beyond Greedy Search: Best-First Search (cont.)
F(Succ(S_curr), s-Queue) = Sort(Append(Succ(S_curr), Tail(s-Queue)), E(s_i))
Full-breadth search
"Ragged" fringe expansion
Does BestFS guarantee optimality?

10 Beyond Greedy Search: Beam Search
Best-few BFS
Beam-width parameter
Uniform fringe expansion
Does Beam Search guarantee optimality?
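
A minimal sketch of beam search under these definitions; the beam width and the level-by-level expansion are the standard reading of "best-few BFS", and E is treated as a score to maximize, following the earlier slides.

def beam_search(s_0, is_goal, successors, E, beam_width=3):
    """Beam search: expand every state on the current fringe, then keep
    only the beam_width best successors (uniform fringe expansion)."""
    fringe = [s_0]
    while fringe:
        for s in fringe:
            if is_goal(s):
                return s
        candidates = [s_i for s in fringe for s_i in successors(s)]
        # Best-few: prune the new fringe to the top beam_width states by E.
        fringe = sorted(candidates, key=E, reverse=True)[:beam_width]
    return None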

11 A* Search: Cost Function Definitions
Let g(S_curr) = actual cost to reach S_curr from S_0
Let h(S_curr) = estimated cost from S_curr to S_G
Let f(S_curr) = g(S_curr) + h(S_curr)

12 A* Search Definitions
Optimality definition: A solution is optimal if it conforms to the minimal-cost path between S_0 and S_G. If operator cost is uniform, then the optimal solution is the shortest path.
Admissibility definition: A heuristic is admissible with respect to a search method if it guarantees finding the optimal solution first, even when its value is only an estimate.

13 A* Search Preliminaries: Admissible Heuristics for BestFS
"Always expand the node with min(g(S_curr)) first."
If a solution is found, keep expanding any S_i in s-Queue where g(S_i) < g(S_G)
Or: find a solution any which way, then BestFS(S_i) over the intermediate S_i in that solution, as follows:
1. If g(S1_curr) >= g(S_G) of the previous solution, quit
2. Else if g(S1_G) < g(S_G), Sol := Sol1, and redo (1)
(Here S1 and Sol1 denote the states and solution of the new search.)
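
The first strategy above is uniform-cost search. Here is a minimal Python sketch; the successors signature (yielding (next_state, step_cost) pairs) and the visited-cost table are assumptions for illustration, not details given on the slide.

import heapq
from itertools import count

def uniform_cost_search(s_0, is_goal, successors):
    """Always expand the node with minimum g first: admissible, but it
    looks only backward at the cost already paid."""
    tie = count()
    frontier = [(0, next(tie), s_0)]      # (g, tie-breaker, state)
    best_g = {s_0: 0}
    while frontier:
        g, _, s = heapq.heappop(frontier)
        if is_goal(s):
            return g                      # cost of an optimal solution
        for s_i, step in successors(s):
            g_i = g + step
            if g_i < best_g.get(s_i, float("inf")):
                best_g[s_i] = g_i
                heapq.heappush(frontier, (g_i, next(tie), s_i))
    return None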

14 A*: Better Admissible Heuristics
Observations on admissible heuristics:
Admissible heuristics based only on look-back (e.g. on g(S)) can lead to massive inefficiency!
Can we do better? Can we also look forward (e.g. beyond g(S_curr))?
Yes, we can!

15 A*: Better Admissible Heuristics
The A* criterion: If h(S_curr) always equals or underestimates the true remaining cost, then f(S_curr) is admissible with respect to Best-First Search.
A* Search: A* = BestFS with an admissible f = g + h, under the admissibility constraint above.
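
A minimal A* sketch combining the pieces above: g and h as defined on slide 11, with an admissible h supplied by the caller. The successors signature (yielding (next_state, step_cost) pairs) and the best-g table are implementation assumptions, not spelled out on the slides.

import heapq
from itertools import count

def a_star(s_0, is_goal, successors, h):
    """A* = best-first search ordered by f = g + h, where h never
    overestimates the true remaining cost (admissibility)."""
    tie = count()
    frontier = [(h(s_0), next(tie), 0, s_0)]      # (f, tie-breaker, g, state)
    best_g = {s_0: 0}
    while frontier:
        f, _, g, s = heapq.heappop(frontier)
        if is_goal(s):
            return g                              # optimal cost when h is admissible
        for s_i, step in successors(s):
            g_i = g + step
            if g_i < best_g.get(s_i, float("inf")):
                best_g[s_i] = g_i
                heapq.heappush(frontier, (g_i + h(s_i), next(tie), g_i, s_i))
    return None

With h identically 0 this reduces to the uniform-cost sketch above, which is the sense in which A* generalizes the look-back-only strategy.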

16 A* Optimality Proof: Goal and Path Proofs
Let S_G be the optimal goal state, and s-Path(S_0, S_G) the optimal solution.
Consider an A* search tree rooted at S_0 with a second goal state S_G2 on the fringe.
Must prove that f(S_G2) >= f(S_G) and that g(path(S_0, S_G)) is minimal (optimal).
The text proves optimality by contradiction.

17 A* Optimality Proof: Simpler Optimality Proof for A*
Assume s-Queue is sorted by f.
Pick a sub-optimal goal S_G2: g(S_G2) > g(S_G)
Since h(S_G2) = h(S_G) = 0, f(S_G2) > f(S_G)
Since s-Queue is sorted by f, S_G is selected before S_G2

18 B* Search: Ideas
Admissible heuristics for mono- and bi-polar search
"Eliminates" the horizon problem in game trees [more later]
Definitions:
Let Best(S) = an always-optimistic evaluation function
Let Worst(S) = an always-pessimistic evaluation function
Hence: Worst(S) < True-eval(S) < Best(S)

19 Basic B* Search
B*(S) is defined as:
If there is an S_i in Succ(S_curr) such that, for all other S_j in Succ(S_curr), W(S_i) > B(S_j),
Then select S_i
Else ProveBest(Succ(S_curr)) OR DisproveRest(Succ(S_curr))
Difficulties in B*:
Guaranteeing eternal pessimism in W(S) (eternal optimism is somewhat easier)
Switching between ProveBest and DisproveRest
Usually W(S_i) < B(S_j), so the selection condition is rarely satisfied outright
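
A minimal sketch of the selection test at the heart of B*; the bound functions W and B are supplied by the caller, and the ProveBest/DisproveRest refinement steps are left to the caller since the slide does not detail them.

def b_star_select(succ, W, B):
    """Return the successor S_i whose pessimistic bound W(S_i) beats the
    optimistic bound B(S_j) of every other successor, if such an S_i exists."""
    for s_i in succ:
        if all(W(s_i) > B(s_j) for s_j in succ if s_j is not s_i):
            return s_i
    return None   # no separation yet: caller must ProveBest or DisproveRest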

20 Macrooperators in Search: Linear Macros
Cached sequence of instantiated operators:
If: S_0 --op_i--> S_1 --op_j--> S_2
Then: S_0 --op_i,j--> S_2
Alternative notation: if op_j(op_i(S_0)) = S_2, then op_i,j(S_0) = S_2
Macros can have any length, e.g. op_i,j,k,l,m,n
Key question: do linear macros reduce search?
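
A minimal sketch of forming a linear macro by composing operators as functions; the toy integer state and the operator definitions are purely illustrative, not from the slides.

def make_macro(*ops):
    """Compose instantiated operators into one macro-operator:
    make_macro(op_i, op_j)(s) == op_j(op_i(s))."""
    def macro(state):
        for op in ops:
            state = op(state)
        return state
    return macro

# Illustrative operators on a toy integer "state".
op_i = lambda s: s + 1
op_j = lambda s: s * 2
op_ij = make_macro(op_i, op_j)
print(op_ij(3))   # 8, i.e. op_j(op_i(3))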

21 Macrooperators in Search: Disjunctive Macros and Iterative Macros
[Diagram omitted: a disjunctive macro branching over operators op_1 through op_7, and an iterative macro that applies op_i,j, op_k,l,m,n, and op_o,p,q around a Cond(s-Hist, S_G) test with YES/NO branches.]

