
1 Chapter 4: Divide and Conquer
The Design and Analysis of Algorithms. Chapter 4: Divide and Conquer. Master Theorem, Mergesort, Quicksort, Binary Search, Binary Trees.

2 Chapter 4. Divide and Conquer Algorithms
Basic Idea
Merge Sort
Quick Sort
Binary Search
Binary Tree Algorithms
Closest Pair and Quickhull
Conclusion

3 Basic Idea
Divide an instance of the problem into two or more smaller instances.
Solve the smaller instances recursively.
Obtain a solution to the original (larger) instance by combining these solutions.

4 Basic Idea
[Diagram: a problem of size n is divided into subproblem 1 and subproblem 2, each of size n/2; solving each subproblem yields solutions that are combined into a solution to the original problem.]

5 General Divide-and-Conquer Recurrence
T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0

Master Theorem:
If a < b^d, then T(n) ∈ Θ(n^d)
If a = b^d, then T(n) ∈ Θ(n^d log n)
If a > b^d, then T(n) ∈ Θ(n^(log_b a))

Examples:
T(n) = T(n/2) + n: here a = 1, b = 2, d = 1, a < b^d, so T(n) ∈ Θ(n)
T(n) = 2T(n/2) + 1: here a = 2, b = 2, d = 0, a > b^d, so T(n) ∈ Θ(n^(log_2 2)) = Θ(n)

6 Examples
T(n) = T(n/2) + 1: here a = 1, b = 2, d = 0, a = b^d, so T(n) ∈ Θ(log n)
T(n) = 4T(n/2) + n: here a = 4, b = 2, d = 1, a > b^d, so T(n) ∈ Θ(n^(log_2 4)) = Θ(n^2)
T(n) = 4T(n/2) + n^2: here a = 4, b = 2, d = 2, a = b^d, so T(n) ∈ Θ(n^2 log n)
T(n) = 4T(n/2) + n^3: here a = 4, b = 2, d = 3, a < b^d, so T(n) ∈ Θ(n^3)
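To see the three cases mechanically, here is a small illustrative C++ sketch (not from the slides; the function name and output format are my own) that classifies T(n) = aT(n/b) + Θ(n^d) by comparing a with b^d:

#include <cmath>
#include <cstdio>

// Classify T(n) = a*T(n/b) + Theta(n^d) using the three Master Theorem cases.
void classify(double a, double b, double d) {
    double bd = std::pow(b, d);
    if (a < bd)
        std::printf("a < b^d: T(n) in Theta(n^%g)\n", d);
    else if (a == bd)
        std::printf("a = b^d: T(n) in Theta(n^%g log n)\n", d);
    else
        std::printf("a > b^d: T(n) in Theta(n^%g)\n", std::log(a) / std::log(b));
}

int main() {
    classify(1, 2, 1);   // T(n) = T(n/2) + n     -> Theta(n)
    classify(2, 2, 0);   // T(n) = 2T(n/2) + 1    -> Theta(n), since log_2 2 = 1
    classify(4, 2, 2);   // T(n) = 4T(n/2) + n^2  -> Theta(n^2 log n)
    return 0;
}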

7 Recurrence Examples

8 Recursion Tree for T(n) = 3T(n/4) + Θ(n^2)
[Figure: the recursion tree. The root costs cn^2; each of its 3 children costs c(n/4)^2, so level 1 totals (3/16)cn^2; level 2 totals (3/16)^2 cn^2; and so on for log_4 n levels. The 3^(log_4 n) = n^(log_4 3) leaves are T(1) nodes contributing Θ(n^(log_4 3)); the total is O(n^2).]

9 Solution to T(n) = 3T(n/4) + Θ(n^2)
The height is log_4 n; the number of leaf nodes is 3^(log_4 n) = n^(log_4 3); each leaf costs T(1). Total cost:
T(n) = cn^2 + (3/16)cn^2 + (3/16)^2 cn^2 + ... + (3/16)^(log_4 n − 1) cn^2 + Θ(n^(log_4 3))
= (1 + 3/16 + (3/16)^2 + ... + (3/16)^(log_4 n − 1)) cn^2 + Θ(n^(log_4 3))
< (1 + 3/16 + (3/16)^2 + ... + (3/16)^m + ...) cn^2 + Θ(n^(log_4 3))
= (1 / (1 − 3/16)) cn^2 + Θ(n^(log_4 3))
= (16/13) cn^2 + Θ(n^(log_4 3))
= O(n^2).

10 Recursion Tree of T(n) = T(n/3) + T(2n/3) + O(n)

11 Mergesort
Split array A[0..n−1] into two roughly equal halves and make copies of each half in arrays B and C.
Sort arrays B and C recursively.
Merge the sorted arrays B and C back into array A.

12 Mergesort

13 Mergesort

14 Mergesort

15 Mergesort Code
void mergesort(int a[], int left, int right) {
    if (right > left) {
        int middle = left + (right - left) / 2;   // split point
        mergesort(a, left, middle);               // sort the left half
        mergesort(a, middle + 1, right);          // sort the right half
        merge(a, left, middle, right);            // merge the two sorted halves
    }
}
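The merge step itself is not shown on the slides; a minimal sketch of a compatible merge helper (my illustration, using a temporary buffer, which is where the Θ(n) extra space comes from) could be:

#include <vector>

// Merge the sorted halves a[left..middle] and a[middle+1..right] back into a[left..right].
void merge(int a[], int left, int middle, int right) {
    std::vector<int> tmp;
    tmp.reserve(right - left + 1);
    int i = left, j = middle + 1;
    while (i <= middle && j <= right)
        tmp.push_back(a[i] <= a[j] ? a[i++] : a[j++]);   // take the smaller front element
    while (i <= middle) tmp.push_back(a[i++]);           // leftovers from the left half
    while (j <= right)  tmp.push_back(a[j++]);           // leftovers from the right half
    for (int k = 0; k < (int)tmp.size(); ++k)
        a[left + k] = tmp[k];                            // copy back in sorted order
}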

16 Analysis of Mergesort
Recurrence: T(n) = 2T(n/2) + n.
All cases have the same efficiency: Θ(n log n).
Space requirement: Θ(n) (not in-place).
Can be implemented without recursion (bottom-up), as sketched below.
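A rough sketch of that non-recursive variant (my illustration, reusing the merge() helper shown earlier):

// Bottom-up mergesort: merge runs of width 1, 2, 4, ... until one sorted run remains.
void mergesort_bottom_up(int a[], int n) {
    for (int width = 1; width < n; width *= 2) {
        for (int left = 0; left + width < n; left += 2 * width) {
            int middle = left + width - 1;          // end of the left run
            int right = left + 2 * width - 1;       // end of the right run
            if (right > n - 1) right = n - 1;       // clamp the final run
            merge(a, left, middle, right);          // same helper as above
        }
    }
}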

17 Features of Mergesort
All cases: Θ(n log n).
Requires additional memory: Θ(n).
Recursive.
Stable (the usual implementation preserves the relative order of equal elements, provided the merge takes from the left half on ties).
Rarely the first choice for sorting in main memory; widely used for external sorting.

18 Quicksort
Let S be the set of N elements. Quicksort(S):
If N = 0 or N = 1, return.
Pick an element v in S (the pivot).
Partition S − {v} into two disjoint sets: S1 = {x ∈ S − {v} | x ≤ v} and S2 = {x ∈ S − {v} | x ≥ v}.
Return {quicksort(S1), followed by v, followed by quicksort(S2)}.

19 Quicksort
Select a pivot (partitioning element) using the median-of-three rule:
Take the first, the last and the middle elements and sort them within their positions.
Choose the median of these three elements as the pivot.
Hide the pivot: swap the pivot with the next-to-last element.
[Example figure: the three chosen elements after sorting, then the array with the pivot hidden next to the last position.]
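A C++ sketch of this selection (my illustration; the helper name and its use of the right−1 slot are assumptions consistent with the code on the slide after next):

#include <utility>   // std::swap

// Median-of-three: order a[left], a[center], a[right] in place, then hide the
// median (the pivot) next to the last element, as described above.
int median3(int a[], int left, int right) {
    int center = left + (right - left) / 2;
    if (a[center] < a[left])   std::swap(a[left], a[center]);
    if (a[right]  < a[left])   std::swap(a[left], a[right]);
    if (a[right]  < a[center]) std::swap(a[center], a[right]);
    std::swap(a[center], a[right - 1]);   // hide the pivot at position right-1
    return a[right - 1];                  // return the pivot value
}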

20 Quicksort
Rearrange the list so that all the elements in the first s positions are smaller than or equal to the pivot and all the elements in the remaining n−s−1 positions are larger than or equal to the pivot. Note: the pivot does not participate in rearranging the elements.
Exchange the pivot with the first element of the second subarray; the pivot is now in its final position.
Sort the two subarrays recursively.

21 Quicksort code
// left – the smallest valid index, right – the largest valid index
void quicksort(int a[], int left, int right) {
    if (left + 10 <= right) {                  // large range: partition and recurse
        int pivot = a[right - 1];              // pivot was hidden at right-1 by median-of-three (slide 19)
        int i = left, j = right - 1;
        for ( ; ; ) {
            while (a[++i] < pivot) {}          // scan right for an element >= pivot
            while (pivot < a[--j]) {}          // scan left for an element <= pivot
            if (i < j) swap(a[i], a[j]);
            else break;
        }
        swap(a[i], a[right - 1]);              // restore the pivot to its final position
        quicksort(a, left, i - 1);
        quicksort(a, i + 1, right);
    }
    else insertionsort(a, left, right);        // small range: insertion sort
}
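The small-range fallback insertionsort() is not shown in the deck; here is a standard sketch with the signature assumed by the call above:

// Insertion sort on a[left..right]; efficient for the short ranges quicksort leaves behind.
void insertionsort(int a[], int left, int right) {
    for (int i = left + 1; i <= right; ++i) {
        int key = a[i];
        int j = i - 1;
        while (j >= left && a[j] > key) {   // shift larger elements one slot right
            a[j + 1] = a[j];
            --j;
        }
        a[j + 1] = key;                     // insert the element at its sorted position
    }
}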

22 Analysis of Quicksort
Best case (split in the middle): Θ(n log n), from T(n) = 2T(n/2) + n.
Worst case (already-sorted array): Θ(n^2), from T(n) = T(n−1) + n.
Average case (random arrays): Θ(n log n).
Quicksort is considered the method of choice for internal sorting of large files (n ≥ 10000).

23 Features of Quicksort
Best and average case: Θ(n log n).
Worst case: Θ(n^2); it happens when the pivot is the smallest or the largest element.
In-place sorting; additional memory needed only for swapping.
Recursive.
Not stable.
Never used for life-critical and mission-critical applications, unless you assume the worst-case response time.

24 Binary Search
A very efficient algorithm for searching in a sorted array: compare the key K with the middle element A[m] of A[0..n−1].
If K = A[m], stop (successful search);
otherwise, continue searching by the same method in A[0..m−1] if K < A[m], and in A[m+1..n−1] if K > A[m].
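A minimal iterative C++ sketch of this search (my illustration; the slides give only the description above):

// Return the index of K in the sorted array a[0..n-1], or -1 if K is absent.
int binary_search(const int a[], int n, int K) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int m = lo + (hi - lo) / 2;   // middle index
        if (K == a[m]) return m;      // successful search
        if (K < a[m]) hi = m - 1;     // continue in A[0..m-1]
        else          lo = m + 1;     // continue in A[m+1..n-1]
    }
    return -1;                        // unsuccessful search
}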

25 Analysis of Binary Search
Time efficiency: T(n) = T(n/2) + 1.
Worst case: O(log n).
Best case: O(1).
Optimal for searching a sorted array.
Limitation: requires a sorted array with random access (not a linked list).

26 Binary Tree Algorithms
Binary tree is a divide-and-conquer-ready structure.
Ex. 1: Classic traversals (preorder, inorder, postorder).
Algorithm Inorder(T)
  if T ≠ ∅
    Inorder(T_left)
    print(root of T)
    Inorder(T_right)
Efficiency: Θ(n)
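A small C++ sketch of the inorder traversal (the node type and field names are my assumptions):

#include <iostream>

struct Node { int key; Node* left; Node* right; };

// Inorder traversal: left subtree, root, right subtree; Theta(n) overall.
void inorder(const Node* t) {
    if (t != nullptr) {             // if T is not empty
        inorder(t->left);           // Inorder(T_left)
        std::cout << t->key << ' '; // print(root of T)
        inorder(t->right);          // Inorder(T_right)
    }
}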

27 Binary Tree Algorithms
Ex. 2: Computing the height of a binary tree.
h(T) = max{h(T_L), h(T_R)} + 1 if T ≠ ∅, and h(∅) = −1.
Efficiency: Θ(n)
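A matching C++ sketch (my illustration, using the same assumed node type as in the inorder example):

struct Node { int key; Node* left; Node* right; };   // same assumed node type as above

// Height by divide and conquer, with h(empty) = -1 as defined on the slide.
int height(const Node* t) {
    if (t == nullptr) return -1;          // h(empty) = -1
    int hl = height(t->left);             // h(T_L)
    int hr = height(t->right);            // h(T_R)
    return (hl > hr ? hl : hr) + 1;       // max{h(T_L), h(T_R)} + 1
}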

28 Recursion tree for T(n) = aT(n/b) + f(n)

