Trees
Trees Definition: A tree is a connected undirected graph with no simple circuits. Since a tree cannot have a simple circuit, a tree cannot contain multiple edges or loops. Therefore, any tree must be a simple graph. Theorem: An undirected graph is a tree if and only if there is a unique simple path between any two of its vertices.
Trees Example: Are the following graphs trees? Yes. No. No. Yes.
Trees Definition: An undirected graph that does not contain simple circuits and is not necessarily connected is called a forest. In general, we use trees to represent hierarchical structures. We often designate a particular vertex of a tree as the root. Since there is a unique path from the root to each vertex of the graph, we direct each edge away from the root. Thus, a tree together with its root produces a directed graph called a rooted tree.
Tree Terminology If v is a vertex in a rooted tree other than the root, the parent of v is the unique vertex u such that there is a directed edge from u to v. When u is the parent of v, v is called the child of u. Vertices with the same parent are called siblings. The ancestors of a vertex other than the root are the vertices in the path from the root to this vertex, excluding the vertex itself and including the root.
Tree Terminology The descendants of a vertex v are those vertices that have v as an ancestor. A vertex of a tree is called a leaf if it has no children. Vertices that have children are called internal vertices. If a is a vertex in a tree, then the subtree with a as its root is the subgraph of the tree consisting of a and its descendants and all edges incident to these descendants.
Tree Terminology The level of a vertex v in a rooted tree is the length of the unique path from the root to this vertex. The level of the root is defined to be zero. The height of a rooted tree is the maximum of the levels of vertices.
Trees Example I: A family tree with members James, Christine, Bob, Frank, Joyce, and Petra.
Trees Example II: The Linux file system, with the root directory / and descendant directories and files such as usr, bin, temp, spool, and ls.
Trees Example III: Arithmetic expressions. This tree represents the expression (y + z)*(x - y): the root is *, its left subtree is + with children y and z, and its right subtree is - with children x and y.
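The same idea in runnable form: a minimal sketch (the Node class and evaluate function are illustrative, not from the slides) that builds the expression tree for (y + z)*(x - y) and evaluates it recursively.

class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def evaluate(node, env):
    # A leaf holds a variable; look up its value in env
    if node.left is None and node.right is None:
        return env[node.value]
    left, right = evaluate(node.left, env), evaluate(node.right, env)
    if node.value == '+': return left + right
    if node.value == '-': return left - right
    if node.value == '*': return left * right

# The tree for (y + z)*(x - y): * at the root, + and - as its children
expr = Node('*', Node('+', Node('y'), Node('z')), Node('-', Node('x'), Node('y')))
print(evaluate(expr, {'x': 5, 'y': 2, 'z': 3}))   # (2 + 3)*(5 - 2) = 15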
Properties of Trees Theorem: A tree with n vertices has n-1 edges.
(Can be proven using induction.) Basis step: Let n = 1. A tree with one vertex has no edges, so the basis step is true. Inductive step: Suppose every tree with k vertices has k-1 edges. Let T be a tree with k+1 vertices, and let v be a leaf of T (which must exist because the tree is finite). Let w be the parent of v.
Properties of Trees Removing the vertex v and the edge connecting w to v produces a graph T' with k vertices. T' is a tree because it is still connected and has no simple circuits. By the inductive hypothesis, T' has k-1 edges. It follows that T has k edges, because it has one more edge than T' (the edge connecting w and v). This completes the inductive step.
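To make the n - 1 edge count concrete, here is a small illustrative sketch (not from the slides; the adjacency-list representation is an assumption) that checks whether an undirected graph is a tree by testing that it is connected and has exactly n - 1 edges.

from collections import deque

def is_tree(adj):
    # adj: dict mapping each vertex to a list of its neighbours (undirected graph)
    n = len(adj)
    edges = sum(len(nbrs) for nbrs in adj.values()) // 2   # each edge is counted twice
    if edges != n - 1:
        return False
    # check connectivity with a breadth-first search from an arbitrary vertex
    start = next(iter(adj))
    seen, queue = {start}, deque([start])
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return len(seen) == n

print(is_tree({'a': ['b', 'c'], 'b': ['a'], 'c': ['a']}))             # True
print(is_tree({'a': ['b', 'c'], 'b': ['a', 'c'], 'c': ['a', 'b']}))   # False: a cycle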
Properties of Trees Definition: A forest is a graph that has no cycles and that is not necessarily connected. Example: This is a forest with three connected components
Trees Definition: A rooted tree is called an m-ary tree if every internal vertex has no more than m children. The tree is called a full m-ary tree if every internal vertex has exactly m children. An m-ary tree with m = 2 is called a binary tree. Theorem: A full m-ary tree with k internal vertices contains n = mk + 1 vertices.
Trees Theorem: There are at most m^h leaves in an m-ary tree of height h. (Proof is by induction.) Corollary: If an m-ary tree of height h has l leaves, then h ≥ ⌈log_m l⌉. If the m-ary tree is full and balanced, then h = ⌈log_m l⌉. (Recall that ⌈ ⌉ is the ceiling function.) (The first part follows from the theorem; the second part uses the definition of a balanced tree.)
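For example, a full 3-ary tree with k = 4 internal vertices has n = 3·4 + 1 = 13 vertices and therefore l = 13 - 4 = 9 leaves, so its height satisfies h ≥ ⌈log_3 9⌉ = 2; if it is also balanced, its height is exactly 2.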
Binary Search Trees If we want to perform a large number of searches in a particular list of items, it can be worthwhile to arrange these items in a binary search tree to facilitate the subsequent searches.
Binary Search Trees A binary search tree is a binary tree in which each child of a vertex is designated as a right or left child, and each vertex is labeled with a key, which is one of the items. When we construct the tree, vertices are assigned keys so that the key of a vertex is both larger than the keys of all vertices in its left subtree and smaller than the keys of all vertices in its right subtree.
Binary Search Trees Example: Construct a binary search tree for the strings math, computer, power, north, zoo, dentist, book (inserted in that order). math becomes the root; computer becomes the left child of math; power becomes the right child of math; north becomes the left child of power; zoo becomes the right child of power; dentist becomes the right child of computer; and book becomes the left child of computer.
Binary Search Trees To perform a search in such a tree for an item x, we can start at the root and compare its key to x. If x is less than the key, we proceed to the left child of the current vertex, and if x is greater than the key, we proceed to the right one. This procedure is repeated until we either find the item we were looking for or we cannot proceed any further. In a balanced tree representing a list of n items, a search can be performed with a maximum of ⌈log_2(n + 1)⌉ steps (compare with binary search).
Locating or adding an item to a Binary Search Tree
procedure insertion(T: binary search tree, x: item)
v := root of T {a vertex not present in T has the value null}
while v != null and label(v) != x
    if x < label(v) then
        if left child of v != null then v := left child of v
        else add new vertex as a left child of v and set v := null
    else
        if right child of v != null then v := right child of v
        else add new vertex as a right child of v and set v := null
if root of T = null then add a vertex v to the tree and label it with x
else if v is null or label(v) != x then label the new vertex with x and let v be this new vertex
return v {v = location of x}
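A runnable Python sketch of the same idea (illustrative; the Node class and function names are my own, not from the slides):

class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, x):
    # Insert x into the BST rooted at root (if not already present) and return the root
    if root is None:
        return Node(x)
    if x < root.key:
        root.left = insert(root.left, x)
    elif x > root.key:
        root.right = insert(root.right, x)
    return root

def search(root, x):
    # Return the vertex labeled x, or None if x is not in the tree
    while root is not None and root.key != x:
        root = root.left if x < root.key else root.right
    return root

root = None
for word in ["math", "computer", "power", "north", "zoo", "dentist", "book"]:
    root = insert(root, word)
print(search(root, "dentist") is not None)   # True
print(search(root, "apple") is not None)     # False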
Decision Trees Definition: A decision tree represents a decision-making process. Each possible “decision point” or situation is represented by a node. Each possible choice that could be made at that decision point is represented by an edge to a child node. In the extended decision trees used in decision analysis, we also include nodes that represent random events and their outcomes.
Counterfeit Coin Problem
Imagine you have 8 coins, one of which is a lighter counterfeit, and a free-beam balance. No scale or weight markings are needed for this problem! How many weighings are needed to guarantee that the counterfeit coin will be found?
As a Decision-Tree Problem
In each situation, we pick two disjoint, equal-size subsets of coins to put on the scale. The balance then “decides” whether to tip left, tip right, or stay balanced, so a given sequence of weighings yields a decision tree with branching factor 3.
Applying the Tree Height Theorem
The decision tree must have at least 8 leaf nodes, since there are 8 possible outcomes (in terms of which coin is the counterfeit one). Recall the tree-height theorem, h ≥ ⌈log_m l⌉. Thus the decision tree must have height h ≥ ⌈log_3 8⌉ = ⌈1.893…⌉ = 2. Let’s see if we can solve the problem with only 2 weighings…
General Solution Strategy
The problem is an example of searching for one particular item among a list of n otherwise identical items, somewhat analogous to the adage of “searching for a needle in a haystack.” Armed with our balance, we can attack the problem using a divide-and-conquer strategy, like what’s done in binary search. We want to narrow down the set of possible locations of the desired item (coin) from n to just 1, in a logarithmic fashion.
General Solution Strategy
Each weighing has 3 possible outcomes. Thus, we should use it to partition the search space into 3 pieces that are as close to equal-sized as possible. This strategy will lead to the minimum possible worst-case number of weighings required.
General Balance Strategy
On each step, put ⌊n/3⌋ of the n coins to be searched on each side of the scale. If the scale tips to the left, then: the lightweight fake is in the right set of ⌊n/3⌋ ≈ n/3 coins. If the scale tips to the right, then: the lightweight fake is in the left set of ⌊n/3⌋ ≈ n/3 coins. If the scale stays balanced, then: the fake is in the remaining set of n − 2⌊n/3⌋ ≈ n/3 coins that were not weighed! Except if n mod 3 = 1, then we can do a little better by weighing ⌈n/3⌉ of the coins on each side. You can prove that this strategy always leads to a balanced 3-ary tree.
Coin Balancing Decision Tree
Here’s what the tree looks like in our case:
First weighing: 1,2,3 vs. 4,5,6 (left: fake among 1,2,3; balanced: fake among 7,8; right: fake among 4,5,6).
If left, weigh 1 vs. 2: L: 1, R: 2, B: 3.
If balanced, weigh 7 vs. 8: L: 7, R: 8.
If right, weigh 4 vs. 5: L: 4, R: 5, B: 6.
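A small simulation sketch (my own illustration; the coin weights and the weigh helper are hypothetical, and coins are indexed from 0) confirming that two weighings always identify the lighter fake among 8 coins:

def find_fake(weights):
    # weights: list of 8 coin weights, exactly one of which is lighter than the rest
    def weigh(left, right):
        # report which side is lighter, or 'balanced'
        l, r = sum(weights[i] for i in left), sum(weights[i] for i in right)
        return 'balanced' if l == r else ('left' if l < r else 'right')

    # First weighing: coins 0,1,2 vs. 3,4,5; the lighter side (if any) contains the fake
    first = weigh([0, 1, 2], [3, 4, 5])
    group = {'left': [0, 1, 2], 'right': [3, 4, 5], 'balanced': [6, 7]}[first]

    # Second weighing: compare two coins from the suspect group
    second = weigh([group[0]], [group[1]])
    if second == 'left':
        return group[0]
    if second == 'right':
        return group[1]
    return group[2]   # only reachable when the group has three coins

for fake in range(8):            # check all 8 possible positions of the fake
    w = [10] * 8
    w[fake] = 9
    assert find_fake(w) == fake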
Tree Traversal Unlike linked lists, one-dimensional arrays and other linear data structures, which are canonically traversed in linear order, trees may be traversed in multiple ways. They may be traversed in depth-first or breadth-first order. There are three common ways to traverse them in depth-first order: pre-order, in-order and post-order.
Tree Traversal: Pre-Order
Check if the current node is empty/null.
Display the data part of the node.
Traverse the left subtree by recursively calling the pre-order function.
Traverse the right subtree by recursively calling the pre-order function.
Pre-order: F, B, A, D, C, E, G, I, H
Tree Traversal: In-Order
Check if the current node is empty/null.
Traverse the left subtree by recursively calling the in-order function.
Display the data part of the node.
Traverse the right subtree by recursively calling the in-order function.
In-order: A, B, C, D, E, F, G, H, I
Tree Traversal: Post-Order
Check if the current node is empty/null.
Traverse the left subtree by recursively calling the post-order function.
Traverse the right subtree by recursively calling the post-order function.
Display the data part of the node.
Post-order: A, C, E, D, B, H, I, G, F
Tree Traversal: Breadth-First (Level-Order)
Visit every node on a level before going to a lower level.
Level-order: F, B, G, A, D, I, C, E, H
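The example tree behind these orderings (reconstructed here from the sequences listed above) and all four traversals, as a runnable sketch:

from collections import deque

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

# Root F with children B and G; B has children A and D (with C and E); G has right child I, whose left child is H
root = Node('F',
            Node('B', Node('A'), Node('D', Node('C'), Node('E'))),
            Node('G', None, Node('I', Node('H'))))

def pre_order(n):  return [] if n is None else [n.key] + pre_order(n.left) + pre_order(n.right)
def in_order(n):   return [] if n is None else in_order(n.left) + [n.key] + in_order(n.right)
def post_order(n): return [] if n is None else post_order(n.left) + post_order(n.right) + [n.key]

def level_order(n):
    order, queue = [], deque([n])
    while queue:
        v = queue.popleft()
        order.append(v.key)
        if v.left:  queue.append(v.left)
        if v.right: queue.append(v.right)
    return order

print(pre_order(root))    # ['F', 'B', 'A', 'D', 'C', 'E', 'G', 'I', 'H']
print(in_order(root))     # ['A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I']
print(post_order(root))   # ['A', 'C', 'E', 'D', 'B', 'H', 'I', 'G', 'F']
print(level_order(root))  # ['F', 'B', 'G', 'A', 'D', 'I', 'C', 'E', 'H']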
Use of a stack It is very common to use a stack to keep track of:
nodes to be visited next, or nodes that we have already visited. Typically, use of a stack leads to a depth-first visit order. Depth-first visit order is “aggressive” in the sense that it examines complete paths.
Use of a queue It is very common to use a queue to keep track of:
nodes to be visited next, or nodes that we have already visited. Typically, use of a queue leads to a breadth-first visit order. Breadth-first visit order is “cautious” in the sense that it examines every path of length i before going on to paths of length i+1.
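A compact sketch (the example graph is hypothetical) showing that swapping the stack for a queue changes the visit order from depth-first to breadth-first:

from collections import deque

graph = {'A': ['B', 'C'], 'B': ['D', 'E'], 'C': ['F'], 'D': [], 'E': [], 'F': []}

def traverse(start, use_queue):
    # Generic search: a queue gives breadth-first order, a stack gives depth-first order
    frontier = deque([start])            # nodes to be visited next
    visited, order = {start}, []
    while frontier:
        v = frontier.popleft() if use_queue else frontier.pop()
        order.append(v)
        for w in graph[v]:
            if w not in visited:
                visited.add(w)
                frontier.append(w)
    return order

print(traverse('A', use_queue=False))   # depth-first: ['A', 'C', 'F', 'B', 'E', 'D']
print(traverse('A', use_queue=True))    # breadth-first: ['A', 'B', 'C', 'D', 'E', 'F']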
Spanning trees Suppose you have a connected undirected graph
Connected: every node is reachable from every other node. Undirected: edges do not have an associated direction. ...then a spanning tree of the graph is a connected subgraph that contains every node and has no cycles. (Figures: a connected, undirected graph, and four of the spanning trees of the graph.)
Finding a spanning tree
To find a spanning tree of a graph: pick an initial node and call it part of the spanning tree, then do a search from the initial node. Each time you find a node that is not in the spanning tree, add to the spanning tree both the new node and the edge you followed to get to it. (Figures: an undirected graph, one possible result of a BFS starting from the top, and one possible result of a DFS starting from the top.)
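A sketch of this procedure using a breadth-first search (illustrative; the adjacency-list input and function name are assumptions):

from collections import deque

def spanning_tree(adj, start):
    # Return the edges of a spanning tree of the connected graph adj, found by BFS from start
    tree_edges, in_tree = [], {start}
    queue = deque([start])
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if w not in in_tree:          # w is a new node: add it and the edge used to reach it
                in_tree.add(w)
                tree_edges.append((v, w))
                queue.append(w)
    return tree_edges

adj = {1: [2, 3], 2: [1, 3, 4], 3: [1, 2, 4], 4: [2, 3]}
print(spanning_tree(adj, 1))   # [(1, 2), (1, 3), (2, 4)]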
Minimizing costs Suppose you want to supply a set of houses (say, in a new subdivision) with electric power, water, sewage lines, and telephone lines. To keep costs down, you could connect these houses with a spanning tree (of, for example, power lines). However, the houses are not all equal distances apart. To reduce costs even further, you could connect the houses with a minimum-cost spanning tree.
Minimum-cost spanning trees
Suppose you have a connected undirected graph with a weight (or cost) associated with each edge. The cost of a spanning tree would be the sum of the costs of its edges. A minimum-cost spanning tree is a spanning tree that has the lowest cost. (Figures: a connected, undirected graph on vertices A-F with edge weights 5, 6, 10, 11, 14, 16, 18, 19, 21, and 33, and a minimum-cost spanning tree using the edges of weight 5, 6, 11, 16, and 18.)
Finding spanning trees
There are two basic algorithms for finding minimum-cost spanning trees, and both are greedy algorithms. Kruskal’s algorithm: Start with no nodes or edges in the spanning tree, and repeatedly add the cheapest edge that does not create a cycle. Here, we consider the spanning tree to consist of edges only. Prim’s algorithm: Start with any one node in the spanning tree, and repeatedly add the cheapest edge leading from the spanning tree to a node not already in it, together with that node. Here, we consider the spanning tree to consist of both nodes and edges.
Kruskal’s algorithm
T = empty spanning tree; E = set of edges; N = number of nodes in graph;
while T has fewer than N - 1 edges {
    remove an edge (v, w) of lowest cost from E
    if adding (v, w) to T would create a cycle
        then discard (v, w)
        else add (v, w) to T
}
Finding an edge of lowest cost can be done just by sorting the edges. Efficient testing for a cycle requires a fairly complex algorithm (UNION-FIND), which we don’t cover in this course.
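An illustrative Python sketch of Kruskal’s algorithm (the simple union-find structure here is my own addition, since the slides do not cover it):

def kruskal(nodes, edges):
    # edges: list of (cost, v, w) tuples; returns the edges of a minimum-cost spanning tree
    parent = {v: v for v in nodes}            # union-find forest: one set per node

    def find(v):                              # follow parent links to the set representative
        while parent[v] != v:
            v = parent[v]
        return v

    tree = []
    for cost, v, w in sorted(edges):          # consider edges in order of increasing cost
        rv, rw = find(v), find(w)
        if rv != rw:                          # endpoints in different components: no cycle
            parent[rv] = rw
            tree.append((v, w, cost))
        if len(tree) == len(nodes) - 1:
            break
    return tree

edges = [(7, 'A', 'B'), (5, 'A', 'C'), (9, 'B', 'C'), (6, 'B', 'D'), (8, 'C', 'D')]
print(kruskal(['A', 'B', 'C', 'D'], edges))   # [('A', 'C', 5), ('B', 'D', 6), ('A', 'B', 7)]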
Prim’s algorithm
T = a spanning tree containing a single node s; E = set of edges adjacent to s;
while T does not contain all the nodes {
    remove an edge (v, w) of lowest cost from E
    if w is already in T
        then discard edge (v, w)
        else {
            add edge (v, w) and node w to T
            add to E the edges adjacent to w
        }
}
An edge of lowest cost can be found with a priority queue. Testing for a cycle is automatic. Hence, Prim’s algorithm is far simpler to implement than Kruskal’s algorithm.
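A runnable sketch of Prim’s algorithm using a priority queue (illustrative; the adjacency-list format is an assumption):

import heapq

def prim(adj, start):
    # adj: dict mapping each node to a list of (cost, neighbour); returns MST edges
    tree, in_tree = [], {start}
    frontier = [(cost, start, w) for cost, w in adj[start]]
    heapq.heapify(frontier)                    # priority queue of candidate edges
    while frontier and len(in_tree) < len(adj):
        cost, v, w = heapq.heappop(frontier)   # cheapest edge leaving the tree so far
        if w in in_tree:
            continue                           # would create a cycle: discard it
        in_tree.add(w)
        tree.append((v, w, cost))
        for c, x in adj[w]:
            if x not in in_tree:
                heapq.heappush(frontier, (c, w, x))
    return tree

adj = {'A': [(7, 'B'), (5, 'C')], 'B': [(7, 'A'), (9, 'C'), (6, 'D')],
       'C': [(5, 'A'), (9, 'B'), (8, 'D')], 'D': [(6, 'B'), (8, 'C')]}
print(prim(adj, 'A'))   # [('A', 'C', 5), ('A', 'B', 7), ('B', 'D', 6)]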
Mazes Typically, every location in a maze is reachable from the starting location, and there is only one path from start to finish. If the cells are “vertices” and the open doors between cells are “edges,” this describes a spanning tree. Since there is exactly one path between any pair of cells, any cell can be used as “start” and any cell as “finish.”
Mazes as spanning trees
While not every maze is a spanning tree, most can be represented as such The nodes are “places” within the maze There is exactly one cycle-free path from any node to any other node
Building a maze I This algorithm requires two sets of cells: the set of cells already in the spanning tree, called IN, and the set of cells adjacent to the cells in the spanning tree (but not in it themselves), called the FRONTIER.
Start with all walls present.
Pick any cell and put it into IN (shown in red).
Put all adjacent cells that aren’t in IN into FRONTIER (shown in blue).
Building a maze II Repeatedly do the following: remove any one cell C from FRONTIER and put it in IN; erase the wall between C and one adjacent cell in IN; add to FRONTIER all the cells adjacent to C that aren’t in IN (or in FRONTIER already).
Continue until there are no more cells in FRONTIER.
When the maze is complete (or at any time), choose the start and finish cells.
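A compact sketch of this IN/FRONTIER procedure on a rectangular grid (illustrative; the grid representation and random choices are my own assumptions):

import random

def build_maze(rows, cols):
    # Return the set of "open door" edges between grid cells; they form a spanning tree
    def neighbours(cell):
        r, c = cell
        candidates = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
        return [(x, y) for x, y in candidates if 0 <= x < rows and 0 <= y < cols]

    start = (0, 0)
    IN = {start}                                  # cells already in the spanning tree
    frontier = set(neighbours(start))             # cells adjacent to IN but not in it
    doors = set()                                 # walls that have been erased
    while frontier:
        cell = random.choice(sorted(frontier))    # remove any one cell C from FRONTIER
        frontier.discard(cell)
        IN.add(cell)
        into = random.choice([n for n in neighbours(cell) if n in IN])
        doors.add(frozenset({cell, into}))        # erase the wall between C and a cell in IN
        frontier.update(n for n in neighbours(cell) if n not in IN)
    return doors

print(len(build_maze(4, 4)))   # 15: sixteen cells connected by a spanning tree of open doors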