1
Algorithm Design and Analysis (ADA) 242-535, Semester 1, 2014-2015
4. Divide and Conquer

Objective: to look at several divide and conquer examples (merge sort, binary search), and three approaches for calculating their running time.
2
Overview
1. Divide and Conquer
2. A Faster Sort: Merge Sort
3. The Iteration Method
4. Recursion Trees
5. Merge Sort vs Insertion Sort
6. Binary Search
7. Recursion Tree Examples
8. Iteration Method Examples
9. The Master Method
3
1. Divide and Conquer
1. Divide the problem into subproblems.
2. Conquer the subproblems by solving them recursively.
3. Combine the subproblem solutions.
4
2. A Faster Sort: Merge Sort

MERGESORT(A, left, right)
    if left < right                    // if left >= right, do nothing
        mid := floor((left + right) / 2)
        MERGESORT(A, left, mid)
        MERGESORT(A, mid + 1, right)
        MERGE(A, left, mid, right)
    return

Initial call: MERGESORT(A, 1, n)
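A direct Java rendering of this pseudocode might look as follows — a minimal sketch (not the lecture's official code) that assumes the merge() helper shown on slide 22 and uses inclusive 0-based indices, so the initial call is mergeSort(A, 0, A.length - 1):

// Sketch of MERGESORT in Java, assuming the merge() method from slide 22.
static void mergeSort(int[] A, int left, int right) {
    if (left < right) {                    // if left >= right, do nothing
        int mid = (left + right) / 2;      // floor((left + right) / 2) for non-negative indices
        mergeSort(A, left, mid);           // sort the left half
        mergeSort(A, mid + 1, right);      // sort the right half
        merge(A, left, mid, right);        // merge the two sorted halves
    }
}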
5
A faster sort: MergeSort (diagram). The input A[1..n] is split into A[1..mid] and A[mid+1..n]; MERGESORT produces sorted A[1..mid] and sorted A[mid+1..n]; MERGE combines them into the output.
6
Tracing MergeSort() (diagram of the recursive calls, with merge steps combining the sorted halves).
7
The merge() function: merging two sorted arrays. Example: the input 20 13 7 2 12 11 9 1 has been sorted into the two halves [2, 7, 13, 20] and [1, 9, 11, 12], which are merged into [1, 2, 7, 9, 11, 12, 13, 20].
8
Merging two sorted arrays (animation, slides 8-19): at each step the two front elements of the sorted halves are compared, and the smaller one is copied to the output array — first 1, then 2, then 7, then 9, and so on, until the output [1, 2, 7, 9, 11, 12, 13, 20] is complete.

Time = one pass through each array = Θ(n) to merge a total of n elements (linear time).
20
Analysis of Merge Sort

Statement                              Effort
MergeSort(A, left, right)              T(n)
  if (left < right) {                  O(1)
    mid = floor((left + right) / 2);   O(1)
    MergeSort(A, left, mid);           T(n/2)
    MergeSort(A, mid+1, right);        T(n/2)
    Merge(A, left, mid, right);        O(n)    // as shown on the previous slides
  }
21
merge() Code

merge(A, left, mid, right) merges two adjacent sorted subranges of an array A:
o left == the index of the first element of the first range
o mid == the index of the last element of the first range
o right == the index of the last element of the second range
22
void merge(int[] A, int left, int mid, int right)
{
    int[] temp = new int[right - left + 1];
    int aIdx = left;       // index into the first range
    int bIdx = mid + 1;    // index into the second range
    for (int i = 0; i < temp.length; i++) {
        if (aIdx > mid)
            temp[i] = A[bIdx++];             // 1st range exhausted: copy 2nd range
        else if (bIdx > right)
            temp[i] = A[aIdx++];             // 2nd range exhausted: copy 1st range
        else if (A[aIdx] <= A[bIdx])
            temp[i] = A[aIdx++];
        else
            temp[i] = A[bIdx++];
    }
    // copy back into A
    for (int j = 0; j < temp.length; j++)
        A[left + j] = temp[j];
}
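As a quick sanity check (not part of the original slides), a small driver could sort the example array from the merge trace, assuming the mergeSort() sketch shown after slide 4:

public static void main(String[] args) {
    int[] A = {20, 13, 7, 2, 12, 11, 9, 1};         // the example input from slide 7
    mergeSort(A, 0, A.length - 1);                   // hypothetical mergeSort() wrapper
    System.out.println(java.util.Arrays.toString(A));
    // prints: [1, 2, 7, 9, 11, 12, 13, 20]
}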
23
3. The Iteration Method

Up to now, we have been solving recurrences using the iteration method:
o Write T() as a recursive equation using big-Oh.
o Convert the T() equation into algebra (replace the O()'s).
o Expand the recurrence.
o Rewrite the recursion as a summation.
o Convert the algebra back to O().
24
MergeSort Running Time

Recursive T() equation:
o T(1) = O(1)
o T(n) = 2T(n/2) + O(n), for n > 1

Convert to algebra:
o T(1) = a
o T(n) = 2T(n/2) + cn
25
Recurrence for Merge Sort

The expression
    T(1) = a
    T(n) = 2T(n/2) + cn, for n > 1
is called a recurrence. A recurrence is an equation that describes a function in terms of its value on smaller inputs (smaller function calls).
26
T(n) = 2T(n/2) + cn
     = 2(2T(n/2^2) + cn/2) + cn
     = 2^2 T(n/2^2) + cn(2/2) + cn
     = 2^2 T(n/2^2) + cn(2/2 + 1)
     = 2^2 (2T(n/2^3) + cn/2^2) + cn(2/2 + 1)
     = 2^3 T(n/2^3) + cn(2^2/2^2) + cn(2/2 + 1)
     = 2^3 T(n/2^3) + cn(2^2/2^2 + 2/2 + 1)
     ...
     = 2^k T(n/2^k) + cn(2^(k-1)/2^(k-1) + 2^(k-2)/2^(k-2) + ... + 2^2/2^2 + 2/2 + 1)
27
So we have
o T(n) = 2^k T(n/2^k) + cn(2^(k-1)/2^(k-1) + ... + 2^2/2^2 + 2/2 + 1)
  (there are k-1 ratio terms plus the final 1, i.e. k terms, each equal to 1)

For k = log2 n:
o n = 2^k, so the T() argument becomes 1
o T(n) = 2^k T(1) + cn·k
       = na + cn(log2 n)
       = O(n) + O(n log2 n)
       = O(n log2 n)
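The closed form can be checked numerically. The following sketch (not from the slides) evaluates the recurrence directly for powers of two, using the assumed constants a = c = 1, and compares it with the closed form n·log2(n) + n:

// T(1) = a, T(n) = 2T(n/2) + cn, with a = c = 1 assumed.
// For n a power of two the iteration method gives T(n) = n*log2(n) + n.
static long T(long n) {
    return (n == 1) ? 1 : 2 * T(n / 2) + n;
}

public static void main(String[] args) {
    for (long n = 2; n <= (1 << 20); n *= 2) {
        int log2n = Long.numberOfTrailingZeros(n);     // log2(n), exact for powers of two
        System.out.println(n + ": recurrence = " + T(n) + ", closed form = " + (n * log2n + n));
    }
}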
28
4. Recursion Trees

A graphical technique for finding a big-Oh solution to a recurrence:
o Draw a tree of the recursive function calls.
o Each tree node is assigned the big-Oh work done during its call to the function.
o The big-Oh equation is the sum of the work at all the nodes in the tree.
29
MergeSort Recursion Tree

Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. We usually omit stating the base case because our algorithms always run in Θ(1) time when n is a small constant.
30
MergeSort Recursion Tree (built step by step on slides 30-33). The root T(n) is expanded into cn plus two children T(n/2); each T(n/2) is expanded into cn/2 plus two children T(n/4), and so on down to Θ(1) leaves.

The tree has height h = log n, each level sums to cn, and the n leaves contribute Θ(n) in total. Total: Θ(n log n).
34
Height and no. of Leaves

The problem size is halved at each level, so after h steps the size is n/2^h. The subproblems reach size 1 when n/2^h = 1, i.e. after h = log2 n steps. A binary tree of that height has 2^(log2 n) = n leaves.
35
Logarithm Equalities

log_b n = log2 n / log2 b   (change of base), so for any constant b > 1, log_b n differs from log2 n only by a constant factor.
Also 2^(log2 n) = n; because of this, the tree on the previous slide has n leaves.
36
5. Merge Sort vs Insertion Sort

O(n lg n) grows more slowly than O(n^2). In other words, merge sort is asymptotically faster than insertion sort in the worst case. In practice, merge sort beats insertion sort for n > 30 or so.
37
Timing Comparisons

Running time estimates:
o A laptop executes 10^8 compares/second.
o A supercomputer executes 10^12 compares/second.

Lesson 1: good algorithms are better than supercomputers.
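The spirit of those estimates can be reproduced with a few lines of Java (a sketch, not the slides' table; the problem size n = 10^9 and the compare counts n^2 and n·log2 n are assumptions):

// Rough estimates using the slide's figures: 10^8 compares/sec (laptop),
// 10^12 compares/sec (supercomputer), for an assumed input of n = 10^9 items.
public static void main(String[] args) {
    double n = 1e9;
    double laptop = 1e8, supercomputer = 1e12;              // compares per second
    double insertionCompares = n * n;                        // ~n^2 for insertion sort
    double mergeCompares = n * Math.log(n) / Math.log(2);    // ~n log2 n for merge sort
    System.out.printf("insertion sort, supercomputer: %.0f s (about 12 days)%n",
                      insertionCompares / supercomputer);
    System.out.printf("merge sort, laptop:            %.0f s (about 5 minutes)%n",
                      mergeCompares / laptop);
}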
38
6. Binary Search

Binary search, from part 3, is a divide and conquer algorithm. To find an element in a sorted array:
1. Divide: check the middle element.
2. Conquer: recursively search one subarray.
3. Combine: easy; return the index.

Example: find 9 in  3 5 7 8 9 12 15
39
Binary Search example, find 9 in  3 5 7 8 9 12 15  (animation, slides 39-43): compare 9 with the middle element 8; since 9 > 8, recursively search the right subarray 9 12 15. Compare 9 with its middle element 12; since 9 < 12, recursively search the left subarray 9. The middle element is now 9: found.
44
Binary Search Code (again)

int binSrch(char A[], int i, int j, char key)
{
    int k;
    if (i > j)                 /* key not found */
        return -1;
    k = (i + j) / 2;
    if (key == A[k])           /* key found */
        return k;
    if (key < A[k])
        j = k - 1;             /* search left half */
    else
        i = k + 1;             /* search right half */
    return binSrch(A, i, j, key);
}
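For comparison (not in the original slides), the same idea can be written without recursion; a hypothetical iterative variant in Java, over an int array:

// Iterative binary search sketch: returns the index of key in the sorted
// array A, or -1 if the key is not present.
static int binSearchIter(int[] A, int key) {
    int i = 0, j = A.length - 1;
    while (i <= j) {
        int k = (i + j) / 2;           // middle of the current range
        if (key == A[k])
            return k;                  // key found
        else if (key < A[k])
            j = k - 1;                 // search left half
        else
            i = k + 1;                 // search right half
    }
    return -1;                         // key not found
}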
45
Running Time (again)

Using big-Oh (n == the size of the range of the array being looked at):
o Basis: T(1) = O(1)
o Induction: T(n) = O(1) + T(n/2), for n > 1

As algebra:
o Basis: T(1) = a
o Induction: T(n) = c + T(n/2), for n > 1

The running time for binary search is O(log2 n).
46
Recurrence for Binary Search

T(n) = 1·T(n/2) + Θ(1)
where 1 is the number of subproblems, n/2 is the subproblem size, and Θ(1) is the work dividing and combining.
47
BS Recursion Tree

Solve T(n) = T(n/2) + c, where c > 0 is constant. We usually don't bother with the base case because our algorithms always run in Θ(1) time when n is a small constant.
48
BS Recursion Tree (built step by step on slides 48-51). The root T(n) is expanded into c plus one child T(n/2); that child is expanded into c plus one child T(n/4), and so on, forming a single path of cost c per level down to a Θ(1) leaf of cost a.

The path has height h = log2 n, so Total = c·log2 n + a = O(log2 n).
52
Two Recurrences So Far

Merge sort:     T(n) = 2T(n/2) + Θ(n) = Θ(n log n)
Binary search:  T(n) = T(n/2) + Θ(1) = Θ(log n)

The big-Oh running times were calculated in two ways: the iteration method and recursion trees. Let's do some more examples of both.
53
7. Recursion Tree Examples

Example 1: Solve T(n) = T(n/4) + T(n/2) + n^2
54
Example 1: Solve T(n) = T(n/4) + T(n/2) + n^2 (tree built on slides 54-61). The root contributes n^2; its two children contribute (n/4)^2 + (n/2)^2 = (5/16)n^2; the next level contributes (5/16)^2 n^2 = (25/256)n^2, and so on, down to Θ(1) leaves.

Total = n^2 (1 + 5/16 + (5/16)^2 + (5/16)^3 + ...)
      = n^2 · 1/(1 - 5/16)        // geometric series
      = (16/11) n^2
      = O(n^2)
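A small numeric check of this bound (a sketch, not from the slides): evaluate the recurrence directly with integer division and a constant base case, and watch the ratio T(n)/n^2 settle near 16/11 ≈ 1.45:

// T(n) = T(n/4) + T(n/2) + n^2, with integer division and T(0) = T(1) = 1 assumed.
static long T(long n) {
    if (n <= 1) return 1;
    return T(n / 4) + T(n / 2) + n * n;
}

public static void main(String[] args) {
    for (long n = 1000; n <= 1000000; n *= 10) {
        System.out.println(n + ": T(n)/n^2 = " + (double) T(n) / ((double) n * n));
    }
    // the ratio should approach 16/11 ~ 1.4545, consistent with T(n) = O(n^2)
}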
62
Geometric Series Reminder

1 + x + x^2 + x^3 + ...    = 1/(1 - x)              for |x| < 1
1 + x + x^2 + ... + x^k    = (x^(k+1) - 1)/(x - 1)  for x ≠ 1
63
Recursion Tree Example 2

T(n) = 3T(n/4) + cn^2
64
T(n) = 3T(n/4) + cn^2  (recursion tree: the root costs cn^2, each node has 3 children, and node sizes shrink by a factor of 4 per level).
65
T(n) = 3T(n/4) + cn^2
height (h) = log4 n
no. of leaves = 3^(log4 n) = n^(log4 3)
66
Height and no. of Leaves

The size shrinks by a factor of 4 at each level, so after h steps it is n/4^h; the leaves are reached when n/4^h = 1, i.e. h = log4 n. Since each node has 3 children, the tree has 3^(log4 n) = n^(log4 3) leaves.
67
Cost of the Tree

Add the cost of all the levels: level i (for i = 0, 1, ..., log4 n - 1) costs (3/16)^i cn^2, and the leaves level costs Θ(n^(log4 3)). So:
T(n) = cn^2 (1 + 3/16 + (3/16)^2 + ...) + Θ(n^(log4 3))
     ≤ cn^2 · 1/(1 - 3/16) + Θ(n^(log4 3))
     = (16/13) cn^2 + Θ(n^(log4 3))
     = O(n^2)
68
Recursion Tree Example 3

T(n) = T(n/3) + T(2n/3) + cn
height = log_(3/2) n   (the length of the longest root-to-leaf path)
69
Height and no. of Leaves

The longest path follows the 2n/3 branch: n → (2/3)n → (2/3)^2 n → ... It reaches size 1 after h steps, when (2/3)^h n = 1, i.e. h = log_(3/2) n.
70
Cost of the Tree

Each level costs at most cn, and the tree is smaller than a complete binary tree of height log_(3/2) n, so the cost of all the levels is:
T(n) ≤ cn · log_(3/2) n
Since log_(3/2) n = log2 n / log2(3/2) = (a constant) · log2 n   // see slide 35
T(n) is O(n log_(3/2) n), which is O(n log n).
71
8. Iteration Method Examples

Example 1: T(n) = c + T(n-1)
Example 2: T(n) = n + T(n-1)
Example 3: T(n) = aT(n/b) + cn
72
Example 1

T(n) = c + T(n-1)
     = c + c + T(n-2) = 2c + T(n-2)
     = 2c + c + T(n-3) = 3c + T(n-3)
     ...
     = kc + T(n-k)
73
When k == n:
o T(n) = cn + T(0), and T(0) is a constant

The conversion back to big-Oh:
o T(n) is O(n)
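A hypothetical routine with exactly this running time (an illustration, not from the slides) is a recursive sum: each call does constant work plus one recursive call on a problem one element smaller:

// Sum of A[0..n-1], computed recursively.
// Each call does O(1) work and makes one call of size n-1,
// so T(n) = c + T(n-1), i.e. T(n) is O(n).
static int sum(int[] A, int n) {
    if (n == 0) return 0;              // T(0) = a constant
    return A[n - 1] + sum(A, n - 1);
}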
74
Example 2

T(n) = n + T(n-1)
     = n + (n-1) + T(n-2)
     = n + (n-1) + (n-2) + T(n-3)
     = n + (n-1) + (n-2) + (n-3) + T(n-4)
     ...
     = n + (n-1) + (n-2) + ... + (n-(k-1)) + T(n-k)
75
Writing the expansion as a summation:

T(n) = sum for i = 0 to k-1 of (n - i), plus T(n-k)
76
When k = n:
T(n) = n + (n-1) + (n-2) + ... + 2 + 1 + T(0) = n(n+1)/2 + T(0)

So T(n) is O(n^2).
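A hypothetical routine with this running time (an illustration, not from the slides) is a recursive selection sort: each call scans all n remaining elements, then recurses on n-1 of them:

// Recursive selection sort of A[0..n-1].
// Each call does ~n comparisons to find the largest element, swaps it into the
// last position, then recurses on the first n-1 elements: T(n) = n + T(n-1), i.e. O(n^2).
static void selectionSort(int[] A, int n) {
    if (n <= 1) return;
    int maxIdx = 0;
    for (int i = 1; i < n; i++)                // n - 1 comparisons
        if (A[i] > A[maxIdx]) maxIdx = i;
    int tmp = A[maxIdx]; A[maxIdx] = A[n - 1]; A[n - 1] = tmp;
    selectionSort(A, n - 1);
}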
77
Example 3

T(n) = aT(n/b) + cn
     = a(aT(n/b^2) + cn/b) + cn
     = a^2 T(n/b^2) + cn(a/b) + cn
     = a^2 T(n/b^2) + cn(a/b + 1)
     = a^2 (aT(n/b^3) + cn/b^2) + cn(a/b + 1)
     = a^3 T(n/b^3) + cn(a^2/b^2) + cn(a/b + 1)
     = a^3 T(n/b^3) + cn(a^2/b^2 + a/b + 1)
     ...
     = a^k T(n/b^k) + cn(a^(k-1)/b^(k-1) + a^(k-2)/b^(k-2) + ... + a^2/b^2 + a/b + 1)
78
So we have
o T(n) = a^k T(n/b^k) + cn(a^(k-1)/b^(k-1) + ... + a^2/b^2 + a/b + 1)

For k = log_b n:
o n = b^k, so T(n/b^k) = T(1), a constant
o T(n) = a^k T(1) + cn(a^(k-1)/b^(k-1) + ... + a^2/b^2 + a/b + 1)
       ≈ c·a^k + cn(a^(k-1)/b^(k-1) + ... + a^2/b^2 + a/b + 1)        // treating the constant T(1) as c
       = cn(a^k/b^k) + cn(a^(k-1)/b^(k-1) + ... + a^2/b^2 + a/b + 1)  // since n = b^k, c·a^k = cn·a^k/b^k
       = cn(a^k/b^k + ... + a^2/b^2 + a/b + 1)
79
Case 1

With k = log_b n:
o T(n) = cn(a^k/b^k + ... + a^2/b^2 + a/b + 1)

There are three cases at this stage, depending on whether a == b, a < b, or a > b.

If a == b, every term in the sum is 1, so:
o T(n) = cn(k + 1) = cn(log_b n + 1) = O(n log_b n)
80
Case 2

With k = log_b n:
o T(n) = cn(a^k/b^k + ... + a^2/b^2 + a/b + 1)

If a < b:
o Recall that (x^k + x^(k-1) + ... + x + 1) = (x^(k+1) - 1)/(x - 1)   // slide 62
o With x = a/b < 1, the sum is bounded by the constant 1/(1 - a/b), so:
o T(n) = cn · O(1) = O(n)
81
Case 3

With k = log_b n:
o T(n) = cn(a^k/b^k + ... + a^2/b^2 + a/b + 1)

If a > b, the largest term a^k/b^k dominates: the whole sum is at most a constant times (a/b)^k, so:
o T(n) = O(cn(a/b)^k) = O(c·a^k)        // since n = b^k
       = O(a^(log_b n)) = O(n^(log_b a))
82
Why is a^(log_b n) = n^(log_b a)? Take log_b of both sides: both equal (log_b a)·(log_b n).
83
So, for T(n) = aT(n/b) + cn:
o a < b:   T(n) is O(n)
o a == b:  T(n) is O(n log n)        e.g. merge sort (a = b = 2 and c = 1)
o a > b:   T(n) is O(n^(log_b a))
84
9. The Master Method

The master method only applies to divide and conquer recurrences of the form:
T(n) = aT(n/b) + f(n)
where a ≥ 1, b > 1, and f(n) > 0 for all n > n0.

This is a more general version of the last example. The master method gives us a cookbook solution for an algorithm's running time: plug in the numbers, get the equation.
85
Three Cases

When T(n) = aT(n/b) + f(n), compare f(n) with n^(log_b a) (note: n^(log_b a) == the no. of leaves in the recursion tree — see the next slides; in these slides f(n) is a simple polynomial in n):

Case 1: f(n) is polynomially smaller than n^(log_b a), i.e. f(n) = O(n^(log_b a - ε)) for some ε > 0.
        Then T(n) = Θ(n^(log_b a)).
Case 2: f(n) = Θ(n^(log_b a)).
        Then T(n) = Θ(n^(log_b a) · log n).
Case 3: f(n) is polynomially larger than n^(log_b a), i.e. f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and the regularity condition a·f(n/b) ≤ c·f(n) holds for some constant c < 1.
        Then T(n) = Θ(f(n)).
86
Example 1

T(n) = 4T(n/2) + n
a = 4, b = 2, so n^(log_b a) = n^2;  f(n) = n.
Case 1, since f(n) is polynomially smaller than n^(log_b a) (n < n^2), so T(n) is Θ(n^2).
87
Example 2

T(n) = 4T(n/2) + n^2
a = 4, b = 2, so n^(log_b a) = n^2;  f(n) = n^2.
Case 2, since f(n) is the same as n^(log_b a) (n^2 = n^2), so T(n) is Θ(n^2 · log n).
88
Example 3

T(n) = 4T(n/2) + n^3
a = 4, b = 2, so n^(log_b a) = n^2;  f(n) = n^3.
Case 3, since f(n) is polynomially larger than n^(log_b a) (n^3 > n^2) and 4(n/2)^3 ≤ cn^3 (the regularity condition) for c = 1/2, so T(n) is Θ(n^3).
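Examples 1-3 can be checked mechanically. The sketch below (not from the slides) handles only recurrences of the restricted form T(n) = aT(n/b) + n^d; for such f(n) the regularity condition of Case 3 holds automatically, so the case is decided by comparing d with log_b a:

// Hypothetical helper: classify T(n) = a*T(n/b) + n^d by the master method.
static String masterBound(double a, double b, double d) {
    double critical = Math.log(a) / Math.log(b);   // log_b(a): exponent of the leaf count n^(log_b a)
    double eps = 1e-9;                             // tolerance for floating-point comparison
    if (d < critical - eps)
        return "Theta(n^" + critical + ")";        // Case 1: f(n) polynomially smaller
    if (Math.abs(d - critical) <= eps)
        return "Theta(n^" + d + " * log n)";       // Case 2: f(n) matches n^(log_b a)
    return "Theta(n^" + d + ")";                   // Case 3: f(n) polynomially larger
}

public static void main(String[] args) {
    System.out.println(masterBound(4, 2, 1));      // Example 1 -> Case 1: Theta(n^2)
    System.out.println(masterBound(4, 2, 2));      // Example 2 -> Case 2: Theta(n^2 log n)
    System.out.println(masterBound(4, 2, 3));      // Example 3 -> Case 3: Theta(n^3)
    System.out.println(masterBound(2, 2, 1));      // merge sort -> Case 2: Theta(n log n)
}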
89
Example 4 (fail)

T(n) = 4T(n/2) + n^2/log n
a = 4, b = 2, so n^(log_b a) = n^2;  f(n) = n^2/log n.
The master method does not apply, because n^2/log n is not O(n^(2-ε)) for any ε > 0. f(n) must be a simple polynomial function for this version of the master method to be applicable.
90
Example 5
91
Common Examples — a table of common recurrences and their master-method solutions (one entry cannot use the master method since its f(n) is not a polynomial).
92
Recursion Tree for the Master T()

T(n) = aT(n/b) + f(n). The root costs f(n) and has a children; each child costs f(n/b) and has a children costing f(n/b^2) each, and so on. The height is h = log_b n, and there are n^(log_b a) leaves, each costing Θ(1).
Level sums: f(n), a·f(n/b), a^2·f(n/b^2), ..., and Θ(n^(log_b a)) at the leaves level.
93
Height and no. of Leaves

The size shrinks by a factor of b at each level, so the leaves are reached after h steps when n/b^h = 1, i.e. h = log_b n. Each node has a children, so there are a^(log_b n) = n^(log_b a) leaves.
94
Case 1 Explained

The level sums increase geometrically from the root to the leaves, so the leaves hold the biggest part of the total sum: T(n) = Θ(n^(log_b a)).
The "-ε" means that f(n) is polynomially smaller than the leaf sum.
95
Case 2 Explained

The level sums are approximately the same on each of the log_b n levels, so the total is Θ(n^(log_b a) · log n).
No ε here means that f(n) is roughly equal to the leaf sum.
96
Case 3 Explained

The level sums decrease geometrically from the root to the leaves, so the root holds the biggest part of the total sum: T(n) = Θ(f(n)).
a·f(n/b) keeps getting smaller at lower levels (see the regularity condition in the definition).