
1 CS 2710, ISSP 2610 Chapter 4, Part 1 Heuristic Search

2 Take advantage of information about the problem

3 Best-First Search
More general use of the term than in the 1st edition of the text
An evaluation function f is used to determine the ordering of nodes on the fringe (there are variations, depending on the search algorithm)

4 Best-First Search
In our framework:
– treesearch or graphsearch, with nodes ordered on the fringe in increasing order by an evaluation function, f(n).

5
def treesearch(qfun, fringe):
    # fringe: list of nodes, kept in the order determined by qfun
    while len(fringe) > 0:
        cur = fringe[0]          # take the first node off the fringe
        fringe = fringe[1:]
        if goalp(cur):
            return cur
        # expand cur and let qfun merge its successors into the fringe
        fringe = qfun(makeNodes(successors(cur)), fringe)
    return []

Best-first search: qfun appends the lists together and sorts them in increasing order by f-value. [In the more efficient version, a heap is used to maintain the queue in increasing order by f-value.]
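As a concrete sketch (not from the slides, and with hypothetical names), a list-based qfun for best-first search and the heap-based alternative mentioned above might look like this; f is assumed to map a node to its evaluation value, and the heap version would require treesearch to pop from the heap rather than slice a list:

import heapq
from itertools import count

def make_bestfirst_qfun(f):
    # Build a qfun for treesearch: merge the new nodes into the fringe and
    # keep everything sorted in increasing order of f-value.
    def qfun(newNodes, fringe):
        return sorted(fringe + newNodes, key=f)
    return qfun

_tiebreak = count()   # breaks ties so Node objects are never compared directly

def push_all(heap, newNodes, f):
    # Heap-based alternative: O(log n) insertion instead of re-sorting the list.
    for n in newNodes:
        heapq.heappush(heap, (f(n), next(_tiebreak), n))

def pop_best(heap):
    return heapq.heappop(heap)[2]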

6 Heuristic Evaluation Function, h(n)
There is a family of best-first search algorithms with different evaluation functions, f(n)
A key component is the “heuristic evaluation function”, h(n)

7 h(n)
Metric on states. Estimate of shortest distance to some goal.
h : state → estimate of distance to goal
h(goal) = 0 for all goal nodes

8 Greedy Best-First Search
f(n) = h(n)
Greedy best-first search may switch its strategy mid-search. For example, it may go depth-first for a while, but then return to the shallow parts of the tree.

9 Greedy Example
In the map domain, h(n) could be the straight-line distance from a city to Bucharest
Greedy search expands the node that currently appears to be closest to the goal
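For instance, a minimal sketch of the greedy ordering for this example, reusing the hypothetical make_bestfirst_qfun above (it assumes each Node has a .state attribute holding the city name, and lists only a few of the straight-line distances from the map slide):

SLD = {'Arad': 366, 'Bucharest': 0, 'Fagaras': 176, 'Oradea': 380,
       'Rimnicu Vilcea': 193, 'Sibiu': 253, 'Timisoara': 329, 'Zerind': 374}

def h_sld(node):
    # greedy best-first: f(n) = h(n), the straight-line distance to Bucharest
    return SLD[node.state]

greedy_qfun = make_bestfirst_qfun(h_sld)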

10 Go from Arad to Bucharest
[Map of Romania: cities connected by roads labeled with driving distances, plus the straight-line distance from each city to Bucharest (e.g., Arad 366, Sibiu 253, Bucharest 0).]

11 Greedy Example
[Greedy search tree: Arad (366) expands to Sibiu (253), Timisoara (329), Zerind (374); Sibiu expands to Arad (366), Oradea (380), Fagaras (176), Rimnicu Vilcea (193); Fagaras expands to Bucharest (0), Sibiu (253). The solution found is Arad → Sibiu → Fagaras → Bucharest.]

12 Greedy Search
Complete? – Nope
Optimal? – Nope
Time and Space? – It depends

13 Best of Both
In an A* search we combine the best parts of Uniform-Cost and Best-First. We want to use the cost so far to allow optimality and completeness, while at the same time using a heuristic to draw us toward a goal.

14 A*: f(n) = g(n) + h(n)
g(n): actual cost from start to n
h(n): estimated distance from n.state to a goal
Even if h continuously returns good values for states along a path, if no goal is reached, g will eventually dominate h and force backtracking to shallower nodes.
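In the same hypothetical qfun framework, an A* ordering might be sketched as follows; like the IDA* code later in the slides, it assumes each node stores its path cost in node.gval:

def make_astar_qfun(h):
    # f(n) = g(n) + h(n): path cost so far plus estimated distance to a goal
    return make_bestfirst_qfun(lambda n: n.gval + h(n))

astar_qfun = make_astar_qfun(h_sld)   # A* ordering for the Romania example above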

15 [A* search tree from Arad, each node labeled with f = g + h (e.g., Sibiu 393, Rimnicu 413, Bucharest 450 via Fagaras, Bucharest 418 via Pitesti). A* expands the cheaper contours first and returns the optimal path Arad → Sibiu → Rimnicu Vilcea → Pitesti → Bucharest with cost 418.]

16 A*: f(n) = g(n) + h(n)
If h(n) does not overestimate the true cost of reaching a goal from n, then the search is optimal.
An h function that does not overestimate is called admissible.

17 A* with an admissible heuristic is optimal
Let:
– G2 be a suboptimal goal on the fringe, and GO be an optimal goal, g(GO) = C*
– C* < g(G2) (since G2 is suboptimal)
– h(G2) = 0 (since G2 is a goal)
So f(G2) = g(G2), and C* < f(G2)

18 Proof continued
Let n be a node on the fringe that is on an optimal solution path
Since h is admissible: f(n) = g(n) + h(n) <= C*
For G2 to be the first goal found, it would need to be first on the fringe
But f(n) <= C* < f(G2), so n is ordered ahead of G2 and G2 cannot be selected first

19 Proof continued
Is it possible that G2 is first on the fringe but there is no such node n on the fringe?
No: by virtue of how it is generated, the search tree is a connected graph, and start is an ancestor of both n and G2.
Let p be the first node on the path from start to G2 such that gval(p) > C* (this could be G2 itself).
The ancestors of n all have f-values <= C* (since h is admissible).
So it isn’t possible for p to be ordered before those ancestors on the fringe.

20 A* with an admissible heuristic is complete
If it is guaranteed to find the optimal solution, it is guaranteed to find a solution

21 A* and Memory
Does A* solve the memory problems with BFS and Uniform Cost?
– A* has the same or smaller memory requirement than BFS or Uniform Cost
– How is A* related to BFS and UC?
– BFS = A* with edgecost(n) = 1, h(n) = 0
– UC = A* with h(n) = 0
– But it might not be sufficiently better to make A* practically feasible
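To make that relationship concrete with the hypothetical make_astar_qfun sketch from above: setting h to zero gives Uniform Cost, and unit edge costs on top of that give BFS.

def h_zero(node):
    return 0

uniform_cost_qfun = make_astar_qfun(h_zero)   # f(n) = g(n): Uniform Cost
# If, in addition, every edge cost is 1, then g(n) is just the depth of n,
# so the same ordering expands nodes breadth-first (BFS).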

22 Note
Placement of the goalp test (and return if successful) in the algorithm is critical. The optimality guarantee is lost if nodes are tested when they are generated
– The only specification the successor function must meet is that it returns all legal successors of its input

23 Note for A*
Assuming f-costs are nondecreasing along any path:
– Can draw contours in the state space
– Inside a contour labeled 300 are all nodes with f(n) less than or equal to 300
– A* fans out from start, expanding nodes in bands of increasing f-cost
– h(n) = 0: contours are round
– With better heuristics, the bands narrow and stretch toward the goal node

24 Example: Admissible Heuristics
The 8-puzzle (a small version of the 15-puzzle). Sample heuristics:
– Number of misplaced tiles
– Manhattan distance

25 8-Puzzle Example
For the sample start state S shown on the slide:
h1(S) = 7 (misplaced tiles)
h2(S) = 2+3+3+2+4+2+0+2 = 18 (Manhattan distance)
Which heuristic is better?
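The two heuristics can be sketched directly in Python (not from the slides; the state representation, a dict mapping each tile to its (row, column) position, and the goal layout are assumptions):

GOAL = {1: (0, 0), 2: (0, 1), 3: (0, 2),
        4: (1, 0), 5: (1, 1), 6: (1, 2),
        7: (2, 0), 8: (2, 1), 0: (2, 2)}   # assumed goal layout; 0 is the blank

def h1(state):
    # number of misplaced tiles (the blank is not counted)
    return sum(1 for tile, pos in state.items()
               if tile != 0 and pos != GOAL[tile])

def h2(state):
    # Manhattan distance: horizontal plus vertical offset of each tile
    return sum(abs(r - GOAL[tile][0]) + abs(c - GOAL[tile][1])
               for tile, (r, c) in state.items() if tile != 0)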

26 Informedness
Let h1 and h2 be admissible heuristics. If h1(n) <= h2(n) for all n, then h2 is more informed than h1, and fewer nodes will be expanded, on average, with h2 than with h1
The larger the values, the better (without going over)

27 A* is often not feasible
Still a memory hog
What can we do? Use an iterative-deepening-style strategy!

28 IDA*
Like iterative deepening, but search to f-contours rather than fixed depths. Each iteration expands all nodes within a particular f-value range.

29
def fLimSearch(fringe, fLim):
    nextF = INFINITY
    while fringe:
        cur = fringe[0]
        fringe = fringe[1:]
        curF = cur.gval + h(cur)
        if curF <= fLim:
            if goalp(cur):
                return (cur, curF)
            succNodes = makeNodes(cur, successors(cur))
            for s in succNodes:
                fVal = s.gval + h(s)
                if fVal > fLim and fVal < nextF:
                    nextF = fVal   # smallest f-value seen beyond the current limit
            fringe = succNodes + fringe
    return ([], nextF)

30
def IDAstar(start):
    result = []
    startNode = Node(start)
    fLim = h(startNode)
    while not result:
        result, fLim = fLimSearch([startNode], fLim)
    return result
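The helpers Node, makeNodes, successors, goalp, h, and INFINITY are assumed to be defined elsewhere in the course code. A minimal, hypothetical set of definitions that makes the two functions above runnable on a toy problem (counting up from 0 to a target, with unit step costs) might look like:

INFINITY = float('inf')
TARGET = 5   # toy goal state

class Node:
    def __init__(self, state, gval=0):
        self.state = state   # here, just an integer
        self.gval = gval     # path cost from the start node

def successors(node):
    return [node.state + 1]                      # one action: add 1, cost 1

def makeNodes(parent, states):
    return [Node(s, parent.gval + 1) for s in states]

def goalp(node):
    return node.state == TARGET

def h(node):
    return max(0, TARGET - node.state)           # exact remaining cost, so admissible

With these definitions, IDAstar(0) finds the goal in a single f-limited iteration, since the initial limit h(startNode) already equals the optimal cost.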

31 IDA*
Worst case, space is O(bd)
Optimal, if h is admissible
The number of iterations grows as the number of possible f-values grows. Let x = average # of nodes with the same f-value. The lower x is, the fewer new nodes, on average, are expanded on each iteration.

32 General Notes before Continuing

33 Search strategies differ along many dimensions
Basic strategy: depth-first, breadth-first, least-actual-cost (g(n)), best-first (h(n)), or a mixture?
Is the algorithm iterative, starting by looking at a small part of the state space and then successively looking at larger parts of it? (e.g., iterative deepening and IDA*)

34 Search strategies differ along many dimensions
Does it pay attention to cycles? (i.e., our treesearch vs. graphsearch)
Can it backtrack? Or are parts of the search tree/graph irrevocably pruned? (e.g., beam search)
Does it only look ahead toward the goal, or does it also consider how far it has come so far?

35 A note on optimality
It might be desirable to be greedy (e.g., greedy best-first vs. A*)
Simon: people are often “satisficers”; they stop as soon as they find a satisfactory solution
Consider choosing a line at the grocery store, or finding a parking space

36 Another note on optimality
Distinguish between correctness of h(n) and the optimality of the search. An optimal search may use an incorrect h(n)! In fact, entirely correct h(n) functions are rare (otherwise, why perform heuristic search?)

37 What do we hope to gain by using h(n)?
Now that you have seen a few types of best-first search, we can ask: what do we hope to gain by using a heuristic evaluation function?
Ans: reduce the number of nodes explored before finding a solution

