1 CSE 326: Data Structures Lecture #19 Disjoint Sets Dynamic Equivalence Weighted Union & Path Compression David Kaplan Today we’re going to get to know the dynamic duo of union/find. But first…

2 Today’s Outline
Making a “good” maze; Disjoint Set Union/Find ADT; Up-trees; Maze revisited; Weighted Union; Path Compression; An amazing complexity analysis.
We’ll finish up extendible hashing and summarize the advantages/disadvantages of hash tables. Then, let’s look at a motivating problem. Next, we’ll hit the ADT. Then the data structure and the dynamic duo.

3 The Maze Construction Problem
Represent the maze environment as a graph {V, E}: collection of rooms: V; connections between rooms (initially all closed): E.
Construct a maze: collection of rooms: V' = V; designated rooms in, i ∈ V, and out, o ∈ V; collection of connections to knock down: E' ⊆ E, such that one unique path connects every two rooms.
Here’s a formalism which tries to approach that.

4 The Middle of the Maze So far, some walls have been knocked down while others remain. Now we consider the wall between rooms A and B. Should we knock it down? No, if A and B are otherwise connected; yes, if A and B are not otherwise connected. Let’s say we’re in the middle of knocking out walls to make a good maze. How do we decide whether to knock down the current wall?

5 Maze Construction Algorithm
While edges remain in E:
    Remove a random edge e = (u, v) from E
    If u and v have not yet been connected:
        add e to E'
        mark u and v as connected
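
As an illustration only (not the lecture's code), here is a minimal C++ sketch of this loop; the DisjointSets helper, the room indexing, and the edge list are all assumptions made for the sketch:

    #include <algorithm>
    #include <random>
    #include <utility>
    #include <vector>

    // Minimal up-tree union/find, just enough to drive the maze loop below.
    struct DisjointSets {
        std::vector<int> up;                                  // parent index; -1 marks a root
        explicit DisjointSets(int n) : up(n, -1) {}
        int find(int x) { while (up[x] != -1) x = up[x]; return x; }
        void unite(int x, int y) { up[find(y)] = find(x); }   // hang one root from the other
    };

    // Rooms are 0..numRooms-1; edges are candidate walls. Returns E', the walls to knock down.
    std::vector<std::pair<int, int>> buildMaze(int numRooms,
                                               std::vector<std::pair<int, int>> edges) {
        std::mt19937 rng(12345);
        std::shuffle(edges.begin(), edges.end(), rng);        // "remove a random edge" each iteration
        DisjointSets ds(numRooms);
        std::vector<std::pair<int, int>> knockedDown;         // E'
        for (const auto& e : edges) {
            if (ds.find(e.first) != ds.find(e.second)) {      // u and v not yet connected?
                knockedDown.push_back(e);                     // add e to E'
                ds.unite(e.first, e.second);                  // mark u and v as connected
            }
        }
        return knockedDown;
    }

Because a wall is knocked down only when its two rooms are still in different sets, the finished maze has exactly one path between any two rooms.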

6 Equivalence Relations
An equivalence relation R has three properties: reflexive: for any x, xRx is true; symmetric: for any x and y, xRy implies yRx; transitive: for any x, y, and z, xRy and yRz implies xRz.
Connection between rooms is an equivalence relation (call it C). For any rooms a, b, c: a C a (a room connects to itself!); if a C b, then b C a; if a C b and b C c, then a C c.
Examples of other equivalence relations?

7 Disjoint Set Union/Find ADT
Union/Find operations: create, destroy, union, find.
Disjoint set equivalence property: every element of a DS U/F structure belongs to exactly one set.
Dynamic equivalence property: the set of an element can change after execution of a union.
Example: given the sets {1,4,8} {7} {6} {5,9,10} {2,3}, find(4) returns 8 (the name of 4's set), and union(2,6) merges {2,3} and {6} into {2,3,6}.

8 Disjoint Set Union/Find More Formally
Given a set U = {a1, a2, … , an}, maintain a partition of U, a set of subsets of U, {S1, S2, … , Sk}, such that: each pair of subsets Si and Sj is disjoint (Si ∩ Sj = ∅ for i ≠ j); together, the subsets cover U (S1 ∪ S2 ∪ … ∪ Sk = U). The Si are mutually exclusive and collectively exhaustive.
Union(a, b) creates a new subset which is the union of a’s subset and b’s subset. Find(a) returns a unique name for a’s subset.
Note: we’ll assume that a union is actually passed the names of the subsets, so there are no hidden finds inside a union.
NOTE: set names are arbitrary! We only care that: Find(a) == Find(b) ⇔ a and b are in the same subset.
NOTE: an outside agent must decide when/what to union; the ADT is just the bookkeeper.
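
As one way to picture this ADT in code (an illustrative sketch; the class and method names are assumptions, not from the lecture):

    typedef int SetName;   // hypothetical name type for subsets

    class DisjointSetsADT {
    public:
        virtual ~DisjointSetsADT() = default;
        // Returns a unique but arbitrary name for a's subset; all that matters is
        // find(a) == find(b) exactly when a and b are in the same subset.
        virtual SetName find(int a) = 0;
        // Merges the two subsets whose *names* are x and y (no hidden finds inside union).
        virtual void unionSets(SetName x, SetName y) = 0;
    };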

9 Example: construct the maze on the right (rooms a through i; walls numbered 1 through 12). Initial state (set names underlined): {a}{b}{c}{d}{e}{f}{g}{h}{i}. The maze constructor (outside agent) traverses edges in numeric order and decides whether to union.

10 Example, First Step {a}{b}{c}{d}{e}{f}{g}{h}{i}
find(b) → b, find(e) → e; find(b) ≠ find(e), so: add edge 1 to E' and union(b, e).
Result: {a}{b,e}{c}{d}{f}{g}{h}{i}. (Order of edges shown in blue on the slide.)

11 Up-Tree Intuition Finding the representative member of a set is somewhat like the opposite of finding whether a given item exists in a set. So, instead of using trees with pointers from each node to its children, let’s use trees with a pointer from each node to its parent.

12 Up-Tree Union-Find Data Structure
Each subset is an up-tree with its root as its representative member. All members of a given set are nodes in that set’s up-tree. A hash table maps input data to the node associated with that data. Up-trees are not necessarily binary!

13 Find: find(f), find(e). Just traverse to the root! Runtime: O(depth).

14 Union: union(a, c). Just hang one root from the other! Runtime: O(1).

15 The Whole Example (1/11): union(b, e).

16 The Whole Example (2/11): union(a, d).

17 The Whole Example (3/11): union(a, b).

18 The Whole Example (4/11): find(d) = find(e), so no union! While we’re finding e, could we do anything else?

19 The Whole Example (5/11): union(h, i).

20 The Whole Example (6/11): union(c, f).

21 The Whole Example (7/11): find(e), find(f), union(a, c). Could we do a better job on this union?

22 The Whole Example (8/11): find(f), find(i), union(c, h).

23 The Whole Example (9/11): find(e) = find(h) and find(b) = find(c), so no unions for either of these.

24 The Whole Example (10/11): find(d), find(g), union(c, g).

25 The Whole Example (11/11): find(g) = find(h), so no union. And, we’re done! Ooh… scary! Such a hard maze!

26 DS/DE data structure A forest of up-trees can easily be stored in an array. Also, if the node names are integers or characters, we can use a very simple, perfect hash. Nifty storage trick! The up-index array has one slot per node, 0 (a) through 8 (i); each slot stores the index of that node’s parent, and a root stores -1.
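
For concreteness, a small runnable sketch of the array trick (the forest below is an illustrative arrangement of a through i, not necessarily the slide’s exact figure):

    #include <iostream>
    #include <vector>

    int main() {
        // Names a..i map to indices 0..8 with a trivial perfect hash: index = name - 'a'.
        // up[i] stores the parent's index; -1 marks a root.
        // Illustrative forest: b and d under a, e under b, f under c, i under h, g alone.
        std::vector<int> up = {-1, 0, -1, 0, 1, 2, -1, -1, 7};
        int x = 'e' - 'a';                      // find(e): walk parent pointers to the root
        while (up[x] != -1) x = up[x];
        std::cout << "root of e is " << char('a' + x) << "\n";   // prints: root of e is a
        return 0;
    }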

27 Implementation

typedef int ID;

ID find(Object x) {
    assert(hTable.contains(x));
    ID xID = hTable[x];
    while (up[xID] != -1) {
        xID = up[xID];
    }
    return xID;
}
runtime: O(depth)

void union(ID x, ID y) {
    assert(up[x] == -1);
    assert(up[y] == -1);
    up[y] = x;
}
runtime: O(1)

Find runs in time linear in the maximum depth of an up-tree. What is the maximum depth?

28 Room for Improvement: Weighted Union
Weighted union always makes the root of the larger tree the new root, which often cuts down on the height of the new up-tree. Could we do a better job on this union? Weighted union!
I use weighted union because it makes the analysis a bit easier; your book suggests height-based union, and that’s probably fine.

29 Weighted Union Code

typedef int ID;

void union(ID x, ID y) {
    assert(up[x] == -1);
    assert(up[y] == -1);
    if (weight[x] > weight[y]) {
        up[y] = x;
        weight[x] += weight[y];
    } else {
        up[x] = y;
        weight[y] += weight[x];
    }
}

new runtime of union: union still takes constant time, O(1).
new runtime of find: find hasn’t changed, so it still takes O(depth). But what is the maximum depth now?

30 Weighted Union Find Analysis
Finds with weighted union are O(max up-tree height). But an up-tree of height h built with weighted union must have at least 2^h nodes, so 2^(max height) ≤ n and max height ≤ log n. So find takes O(log n).
Proof that an up-tree of height h has at least 2^h nodes:
Base case: h = 0, the tree has 2^0 = 1 node.
Induction hypothesis: assume the claim holds for all heights h' < h.
A merge can only increase tree height by one over the smaller tree. So a tree of height h was formed by merging a tree of height h-1 with a larger tree. Each of the two trees then has at least 2^(h-1) nodes (the smaller by the induction hypothesis, the larger because it is at least as big), for a total of at least 2^h nodes. QED.
When a node gets deeper, it only gets one deeper, and its up-tree must have been the smaller one in the merge. That means its tree at least doubled in size. How many times can a tree double in size? log n times. So find takes O(log n).
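
As a sanity check on this bound (an illustrative sketch, not part of the lecture), the following builds a worst-case binary-merge pattern on 1024 elements with union-by-size and confirms the maximum depth is exactly log2 n = 10:

    #include <algorithm>
    #include <cassert>
    #include <iostream>
    #include <utility>
    #include <vector>

    // Union-by-size forest, with just enough extra to measure node depths.
    struct WeightedUF {
        std::vector<int> up, weight;
        explicit WeightedUF(int n) : up(n, -1), weight(n, 1) {}
        int find(int x) const { while (up[x] != -1) x = up[x]; return x; }
        int depth(int x) const { int d = 0; while (up[x] != -1) { x = up[x]; ++d; } return d; }
        void unite(int x, int y) {                      // x and y must be roots
            if (x == y) return;
            if (weight[x] < weight[y]) std::swap(x, y); // larger tree's root stays the root
            up[y] = x;
            weight[x] += weight[y];
        }
    };

    int main() {
        const int n = 1 << 10;                          // 1024 elements
        WeightedUF uf(n);
        // Repeatedly merge equal-sized trees: the height-maximizing pattern for weighted union.
        for (int span = 1; span < n; span *= 2)
            for (int i = 0; i + span < n; i += 2 * span)
                uf.unite(uf.find(i), uf.find(i + span));
        int maxDepth = 0;
        for (int i = 0; i < n; ++i) maxDepth = std::max(maxDepth, uf.depth(i));
        std::cout << "max depth = " << maxDepth << " (log2 n = 10)\n";
        assert(maxDepth <= 10);                         // matches the 2^h bound: height <= log2 n
        return 0;
    }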

31 Room for Improvement: Path Compression
Point everything along the path of a find at the root. This reduces the height of the entire access path to 1. While we’re finding e, could we do anything else? Path compression!

32 Path Compression Example
find(e): note that everything along the path got pointed to c (the root). Note also that this is definitely not a binary tree!

33 Path Compression Code

typedef int ID;

ID find(Object x) {
    assert(hTable.contains(x));
    ID xID = hTable[x];
    ID hold = xID;
    while (up[xID] != -1) {        // first pass: walk up to the root
        xID = up[xID];
    }
    while (up[hold] != -1) {       // second pass: point every node on the path at the root
        ID temp = up[hold];
        up[hold] = xID;
        hold = temp;
    }
    return xID;
}

runtime: Find clearly still takes O(depth), right? But how deep can a tree get now?
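
Putting weighted union and path compression together gives the standard disjoint-sets structure; here is a hedged, self-contained C++ sketch (names are illustrative, not the lecture’s code):

    #include <vector>

    class DisjointSets {
    public:
        explicit DisjointSets(int n) : up_(n, -1), weight_(n, 1) {}

        // Find with path compression: every node visited is re-pointed directly at the root.
        int find(int x) {
            int root = x;
            while (up_[root] != -1) root = up_[root];   // first pass: locate the root
            while (up_[x] != -1) {                      // second pass: compress the path
                int next = up_[x];
                up_[x] = root;
                x = next;
            }
            return root;
        }

        // Weighted (size-based) union of the sets containing a and b.
        void unite(int a, int b) {
            int x = find(a), y = find(b);
            if (x == y) return;
            if (weight_[x] < weight_[y]) { int t = x; x = y; y = t; }
            up_[y] = x;                                 // hang the smaller root under the larger
            weight_[x] += weight_[y];
        }

    private:
        std::vector<int> up_;        // parent index; -1 marks a root
        std::vector<int> weight_;    // number of nodes in the up-tree rooted here
    };

With both optimizations, any sequence of m operations on n elements runs in O(m α(m, n)) worst-case time, which the next slides discuss.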

34 Complexity of Weighted Union + Path Compression
Ackermann created a function A(x, y) which grows very fast! The inverse Ackermann function α(x, y) grows very slooooowwwly. The single-variable inverse Ackermann function is called log* n.
How fast does log n grow? log n = 4 for n = 16.
Let log^(k) n = log(log(log(…(log n)))), with k logs. Then let log* n = the minimum k such that log^(k) n ≤ 1.
How fast does log* n grow? log* n = 4 for n = 65536. log* n = 5 for n = 2^65536 (a 20,000-digit number!).
How fast does α(x, y) grow? Even sloooowwwer than log* n. α(x, y) = 4 for n far larger than the number of atoms in the universe (2^300).
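
A tiny illustrative helper (not from the lecture) that computes log* n by repeatedly applying log2 until the value drops to 1 or below:

    #include <cmath>
    #include <iostream>

    // Iterated logarithm: the minimum k such that applying log2 k times brings n to <= 1.
    int logStar(double n) {
        int k = 0;
        while (n > 1.0) {
            n = std::log2(n);
            ++k;
        }
        return k;
    }

    int main() {
        std::cout << "log*(16)    = " << logStar(16) << "\n";      // 3: 16 -> 4 -> 2 -> 1
        std::cout << "log*(65536) = " << logStar(65536) << "\n";   // 4: 65536 -> 16 -> 4 -> 2 -> 1
        return 0;
    }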

35 Complex Complexity of Weighted Union + Path Compression
Tarjan proved that m weighted union and find operations on a set of n elements have worst-case complexity O(m α(m, n)). This is essentially amortized constant time. In some practical cases, weighted union or path compression or both are unnecessary because trees do not naturally get very deep.
OK, now, your book shows a proof that find now takes amortized O(log* n), but actually it’s even better than that: O(α(m, n)) time. Basically amortized O(1). Can anyone think why the alpha doesn’t matter even in any good theoretical sense? Remember B-Trees: they showed us that our model was bogus for large data sets. By the time alpha gets near 4, our data sets are already larger than we can even imagine.

36 Disjoint Set Union/Find ADT Summary
Simple ADT, simple data structure, simple code. Complex complexity analysis, but an extremely useful result: essentially constant time! Lots of potential applications: object-property collections; index partitions (e.g., parts inspection); to say nothing of maze construction. In some applications, it may make sense to have meaningful (non-arbitrary) set names.

