Divide and Conquer
The best-known algorithm design strategy:
1. Divide an instance of the problem into two or more smaller instances
2. Solve the smaller instances recursively
3. Obtain a solution to the original (larger) instance by combining these solutions
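To make the three steps concrete, here is a minimal sketch (not from the slides; the function name is a placeholder) that sums a list by divide and conquer:

```python
def dc_sum(values):
    """Sum a list of numbers using the three divide-and-conquer steps."""
    if len(values) <= 1:                 # instance small enough to solve directly
        return values[0] if values else 0
    mid = len(values) // 2
    left = dc_sum(values[:mid])          # 1.-2. divide and solve each half recursively
    right = dc_sum(values[mid:])
    return left + right                  # 3. combine the two partial solutions

print(dc_sum([3, 1, 4, 1, 5, 9]))        # 23
```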
Divide-and-conquer technique (diagram): a problem of size n is split into subproblem 1 and subproblem 2, each of size n/2; a solution to each subproblem is then combined into a solution to the original problem.
Divide and Conquer Examples
- Sorting: mergesort and quicksort
- Tree traversals
- Binary search
- Matrix multiplication: Strassen's algorithm
- Convex hull: QuickHull algorithm
General divide-and-conquer recurrence: T(n) = aT(n/b) + f(n), where f(n) = Θ(n^k)
1. a < b^k: T(n) = Θ(n^k)
2. a = b^k: T(n) = Θ(n^k log n)
3. a > b^k: T(n) = Θ(n^(log_b a))
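As a quick sanity check of the three cases, here is a small sketch (my own helper, not part of the slides) that classifies a recurrence of the form T(n) = aT(n/b) + Θ(n^k):

```python
import math

def master_case(a, b, k):
    """Classify T(n) = a*T(n/b) + Theta(n^k) by comparing a with b^k."""
    if a < b ** k:
        return f"Theta(n^{k})"                    # case 1: a < b^k
    if a == b ** k:
        return f"Theta(n^{k} log n)"              # case 2: a = b^k
    return f"Theta(n^{math.log(a, b):g})"         # case 3: a > b^k, exponent log_b a

print(master_case(2, 2, 1))   # mergesort: 2 T(n/2) + Theta(n)   -> Theta(n^1 log n)
print(master_case(7, 2, 2))   # Strassen:  7 T(n/2) + Theta(n^2) -> Theta(n^2.807...)
```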
Mergesort Algorithm:
- Split array A[1..n] in two and make copies of each half in arrays B[1..n/2] and C[1..n/2]
- Sort arrays B and C
- Merge sorted arrays B and C into array A
Mergesort Algorithm: Merge sorted arrays B and C into array A as follows:
- Repeat the following until no elements remain in one of the arrays:
  - compare the first elements in the remaining unprocessed portions of the arrays
  - copy the smaller of the two into A, while incrementing the index indicating the unprocessed portion of that array
- Once all elements in one of the arrays are processed, copy the remaining unprocessed elements from the other array into A.
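A minimal Python sketch of the mergesort just described; the variable names mirror the slides, but the code itself is mine rather than the textbook's pseudocode:

```python
def mergesort(A):
    """Sort list A in place: split into copies B and C, sort them, merge back."""
    if len(A) <= 1:
        return
    mid = len(A) // 2
    B, C = A[:mid], A[mid:]              # split and copy each half
    mergesort(B)                         # sort arrays B and C recursively
    mergesort(C)
    i = j = k = 0                        # merge sorted B and C back into A
    while i < len(B) and j < len(C):
        if B[i] <= C[j]:                 # copy the smaller of the two front elements
            A[k] = B[i]; i += 1
        else:
            A[k] = C[j]; j += 1
        k += 1
    A[k:] = B[i:] + C[j:]                # copy the remaining unprocessed elements

data = [7, 2, 1, 6, 4, 9, 5]
mergesort(data)
print(data)                              # [1, 2, 4, 5, 6, 7, 9]
```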
How Merging Works
A: 7 2 1 6 4 9 5
Split the list: B: 7 2 1 6 and C: 4 9 5
Sort each list: B: 1 2 6 7 and C: 4 5 9
Merge the lists: A: 1 2 4 5 6 7 9
Putting it Together
A: 7 2 1 6 4 9 5
Split the list: B: 7 2 1 6 and C: 4 9 5
Split again: D: 7 2, E: 1 6, F: 4 9, G: 5
Split again: H: 7, I: 2, J: 1, K: 6, L: 4, M: 9, N: 5
Each list is sorted by recursively applying mergesort to the sub-lists, giving B: 1 2 6 7 and C: 4 5 9.
Efficiency of mergesort
- All cases have the same time efficiency: Θ(n log n)
- The number of comparisons is close to the theoretical minimum for comparison-based sorting: log(n!) ≈ n log n - 1.44n
- Space requirement: Θ(n) (NOT in-place)
- Can be implemented without recursion (bottom-up), as sketched below
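A sketch of the bottom-up (non-recursive) variant mentioned in the last bullet, assuming the same merge step as before; the names are mine:

```python
def merge(B, C):
    """Merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(B) and j < len(C):
        if B[i] <= C[j]:
            out.append(B[i]); i += 1
        else:
            out.append(C[j]); j += 1
    return out + B[i:] + C[j:]

def mergesort_bottom_up(A):
    """Merge runs of width 1, 2, 4, ... until the whole list is one sorted run."""
    n, width = len(A), 1
    while width < n:
        for lo in range(0, n, 2 * width):
            mid, hi = min(lo + width, n), min(lo + 2 * width, n)
            A[lo:hi] = merge(A[lo:mid], A[mid:hi])
        width *= 2
```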
Quicksort
- Select a pivot (partitioning element)
- Rearrange the list so that all the elements in the positions before the pivot are smaller than or equal to the pivot and those after the pivot are larger than the pivot (see algorithm Partition in section 4.2)
- Exchange the pivot with the last element in the first (i.e., ≤) sublist; the pivot is now in its final position
- Sort the two sublists
Resulting layout: elements with A[i] ≤ p, then p, then elements with A[i] > p.
The partition algorithm Example: 8 1 12 2 6 10 14 15 4 13 9 11 3 7 5
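A sketch of quicksort with a Hoare-style partition consistent with the description above (choosing the first element of each sublist as the pivot is an assumption; the textbook's Partition routine in section 4.2 may differ in minor details):

```python
def quicksort(A, lo=0, hi=None):
    """Sort A[lo..hi] in place by partitioning around a pivot and recursing."""
    if hi is None:
        hi = len(A) - 1
    if lo < hi:
        s = partition(A, lo, hi)      # pivot ends up in its final position s
        quicksort(A, lo, s - 1)       # sort the <= sublist
        quicksort(A, s + 1, hi)       # sort the > sublist

def partition(A, lo, hi):
    """Hoare-style partition with pivot A[lo]; returns the pivot's final index."""
    p = A[lo]
    i, j = lo, hi + 1
    while True:
        i += 1
        while i <= hi and A[i] < p:   # scan right for an element >= pivot
            i += 1
        j -= 1
        while A[j] > p:               # scan left for an element <= pivot
            j -= 1
        if i >= j:
            break
        A[i], A[j] = A[j], A[i]
    A[lo], A[j] = A[j], A[lo]         # exchange pivot with last element of <= sublist
    return j

data = [8, 1, 12, 2, 6, 10, 14, 15, 4, 13, 9, 11, 3, 7, 5]
quicksort(data)
print(data)                           # [1, 2, 3, ..., 15]
```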
Efficiency of quicksort
- Best case (split in the middle): Θ(n log n)
- Worst case (sorted array!): Θ(n^2)
- Average case (random arrays): Θ(n log n)
Efficiency of quicksort
Improvements:
- better pivot selection: median-of-three partitioning avoids the worst case on sorted files (sketched below)
- switch to insertion sort on small subfiles
- elimination of recursion
These combine to a 20-25% improvement.
Considered the method of choice for internal sorting of large files (n ≥ 10000).
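A sketch of the median-of-three pivot selection mentioned above; quicksort would call this before partitioning and, as a further refinement, hand sublists below some small cutoff to insertion sort. The helper name and the placement of the pivot at A[lo] are assumptions.

```python
def median_of_three(A, lo, hi):
    """Place the median of A[lo], A[mid], A[hi] at A[lo] to serve as the pivot."""
    mid = (lo + hi) // 2
    if A[mid] < A[lo]:
        A[lo], A[mid] = A[mid], A[lo]
    if A[hi] < A[lo]:
        A[lo], A[hi] = A[hi], A[lo]
    if A[hi] < A[mid]:
        A[mid], A[hi] = A[hi], A[mid]
    # now A[lo] <= A[mid] <= A[hi]; move the median to the front for partitioning
    A[lo], A[mid] = A[mid], A[lo]
```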
QuickHull Algorithm
Inspired by quicksort; computes the convex hull:
- Assume the points are sorted by x-coordinate values
- Identify the extreme points P1 and P2 (part of the hull)
QuickHull Algorithm
Compute the upper hull:
- find the point Pmax that is farthest away from line P1P2
- compute the hull to the left of line P1Pmax
- compute the hull to the right of line P2Pmax
Compute the lower hull in a similar manner.
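A sketch of the QuickHull idea described above, using a cross product both to test which side of a directed line a point is on and to find the farthest point; the function names are mine and the points are assumed distinct:

```python
def cross(a, b, c):
    """Twice the signed area of triangle abc; > 0 when c lies to the left of line a->b."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def hull_side(p1, p2, points):
    """Hull vertices strictly to the left of the directed line p1->p2, in order."""
    left = [p for p in points if cross(p1, p2, p) > 0]
    if not left:
        return []
    pmax = max(left, key=lambda p: cross(p1, p2, p))   # farthest from line p1-p2
    return hull_side(p1, pmax, left) + [pmax] + hull_side(pmax, p2, left)

def quickhull(points):
    """Convex hull of a set of 2-D points given as (x, y) tuples."""
    pts = sorted(points)                  # sort by x-coordinate (ties broken by y)
    p1, p2 = pts[0], pts[-1]              # extreme points, guaranteed on the hull
    upper = hull_side(p1, p2, pts)        # upper hull between p1 and p2
    lower = hull_side(p2, p1, pts)        # lower hull between p2 and p1
    return [p1] + upper + [p2] + lower

print(quickhull([(0, 0), (2, 3), (4, 1), (1, 1), (3, -2), (5, 0)]))
# [(0, 0), (2, 3), (5, 0), (3, -2)]
```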
Efficiency of QuickHull algorithm
- Finding the point farthest away from line P1P2 can be done in linear time
- This gives the same efficiency as quicksort:
  - Worst case: Θ(n^2)
  - Average case: Θ(n log n)
Efficiency of QuickHull algorithm
- If the points are not initially sorted by x-coordinate value, this can be accomplished in Θ(n log n), with no increase in the asymptotic efficiency class
- Other algorithms for convex hull:
  - Graham's scan
  - DCHull
  also in Θ(n log n)
Closest-Pair Problem: Divide and Conquer
- The brute-force approach requires comparing every point with every other point
- Given n points, we must perform 1 + 2 + 3 + ... + (n-2) + (n-1) comparisons
- Brute force: O(n^2)
- The divide-and-conquer algorithm yields O(n log n)
- Reminder: if n = 1,000,000 then n^2 = 1,000,000,000,000 whereas n log n = 20,000,000
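For reference, a sketch of the brute-force approach (names are mine); it performs exactly (n-1)n/2 distance comparisons:

```python
from itertools import combinations
from math import dist, inf

def closest_pair_brute_force(points):
    """Compare every pair of points; O(n^2) distance computations."""
    best, best_pair = inf, None
    for p, q in combinations(points, 2):     # all (n-1)n/2 unordered pairs
        d = dist(p, q)                       # Euclidean distance
        if d < best:
            best, best_pair = d, (p, q)
    return best, best_pair
```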
Closest-Pair Algorithm Given: A set of points in 2-D
Closest-Pair Algorithm. Step 1: Sort the points in one dimension.
Closest-Pair Algorithm. Let's sort based on the x-axis: O(n log n) using quicksort or mergesort. (Figure: 14 points, labeled 1 through 14 in order of x-coordinate.)
Closest-Pair Algorithm. Step 2: Split the points, i.e., draw a line at the midpoint between points 7 and 8, creating Sub-Problem 1 and Sub-Problem 2.
Closest-Pair Algorithm. Advantage: normally, we'd have to compare each of the 14 points with every other point: (n-1)n/2 = 13*14/2 = 91 comparisons.
Closest-Pair Algorithm. Advantage: now we have two sub-problems of half the size, so we do 6*7/2 = 21 comparisons twice, which is 42 comparisons. Each sub-problem yields a closest distance, d1 and d2, and the candidate solution is d = min(d1, d2).
Closest-Pair Algorithm. Advantage: with just one split we cut the number of comparisons roughly in half. Obviously, we gain an even greater advantage if we split the sub-problems further. (As before, d = min(d1, d2).)
Closest-Pair Algorithm. Problem: however, what if the two closest points come from different sub-problems?
Closest-Pair Algorithm. Here is an example where we have to compare points from sub-problem 1 to the points in sub-problem 2.
Closest-Pair Algorithm. However, we only have to compare points inside a vertical strip of width d on each side of the dividing line, where d = min(d1, d2).
Closest-Pair Algorithm. Step 3: We can continue the advantage by splitting the sub-problems.
Closest-Pair Algorithm. Step 3: In fact, we can continue to split until each sub-problem is trivial, i.e., takes one comparison.
Closest-Pair Algorithm. Finally: the solution to each sub-problem is combined until the final solution is obtained.
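A sketch of the divide-and-conquer procedure walked through above. For brevity this version re-sorts the strip by y-coordinate at every level, which gives O(n log^2 n); the O(n log n) version instead carries a y-sorted list through the recursion. The names are mine.

```python
from math import dist, inf

def closest_pair(points):
    """Smallest distance between any two of the given 2-D points."""
    return _closest(sorted(points))          # step 1: sort the points by x-coordinate

def _closest(pts):
    n = len(pts)
    if n <= 3:                               # trivial sub-problem: brute force
        return min((dist(p, q) for i, p in enumerate(pts) for q in pts[i + 1:]),
                   default=inf)
    mid = n // 2
    mid_x = pts[mid][0]                      # step 2: split at the middle x-coordinate
    d = min(_closest(pts[:mid]), _closest(pts[mid:]))   # d = min(d1, d2)
    # combine: only points within distance d of the dividing line can do better
    strip = sorted((p for p in pts if abs(p[0] - mid_x) < d), key=lambda p: p[1])
    for i, p in enumerate(strip):
        for q in strip[i + 1:]:
            if q[1] - p[1] >= d:             # too far apart vertically to beat d
                break
            d = min(d, dist(p, q))
    return d
```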
Closest-Pair Algorithm. Finally: on the last step the "strip" will likely be very small, so combining the two largest sub-problems won't require much work.
Closest-Pair Algorithm. In this example, it takes 22 comparisons to find the closest pair; the brute-force algorithm would have taken 91 comparisons. But the real advantage occurs when there are millions of points.
Closest-Pair Problem: Divide and Conquer
Here is another animation: http://www.cs.mcgill.ca/~cs251/ClosestPair/ClosestPairApplet/ClosestPairApplet.html
Remember
- Homework is due on Friday (in class and on paper)
- There is a talk today: 4 PM, RB 340 or 328, Darren Lim (faculty candidate), Bioinformatics: Secondary Structure Prediction in Proteins
Long Term
- HW7 will be given on Friday, due on Wednesday, and returned on Friday.
- Exam 2 will be on Monday the 22nd.