Divide and Conquer (presentation transcript)

1 Divide and Conquer: The best-known algorithm design strategy: 1. Divide an instance of the problem into two or more smaller instances. 2. Solve the smaller instances recursively. 3. Obtain the solution to the original (larger) instance by combining these solutions.

2 Divide-and-conquer technique (diagram): a problem of size n is divided into subproblem 1 of size n/2 and subproblem 2 of size n/2; the solutions to the two subproblems are combined into a solution to the original problem.
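As a concrete illustration of this diagram (my own sketch, not part of the original slides), here is a minimal divide-and-conquer routine in Python that finds the maximum of a list: the instance is divided in two, each half is solved recursively, and the two partial answers are combined. The function name dc_max and the example data are made up for this illustration.

```python
def dc_max(a, lo, hi):
    """Largest element of a[lo:hi], computed by divide and conquer."""
    if hi - lo == 1:              # instance of size 1: solve directly
        return a[lo]
    mid = (lo + hi) // 2          # 1. divide into two instances of size ~n/2
    left = dc_max(a, lo, mid)     # 2. solve the smaller instances recursively
    right = dc_max(a, mid, hi)
    return max(left, right)       # 3. combine the sub-solutions

print(dc_max([3, 1, 4, 1, 5, 9, 2, 6], 0, 8))  # prints 9
```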

3 Divide and Conquer Examples: Sorting with mergesort and quicksort, O(n²) → O(n log n). Closest-pair algorithm, O(n²) → O(n log n). Matrix multiplication with Strassen's algorithm, O(n³) → O(n^log₂ 7) = O(n^2.8). Convex hull with the QuickHull algorithm, O(n³) → O(n² log n).

4 General Divide and Conquer recurrence: T(n) = a·T(n/b) + f(n), where f(n) = Θ(n^k): 1. if a < b^k, then T(n) = Θ(n^k); 2. if a = b^k, then T(n) = Θ(n^k log n); 3. if a > b^k, then T(n) = Θ(n^(log_b a)).
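As a quick worked example of this recurrence (standard material, not shown on the slide): mergesort satisfies T(n) = 2T(n/2) + Θ(n), so a = 2, b = 2, k = 1. Since a = b^k, case 2 applies and T(n) = Θ(n log n), which matches the next slide.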

5 Efficiency of mergesort: All cases have the same time efficiency, Θ(n log n). The number of comparisons is close to the theoretical minimum for comparison-based sorting: log₂(n!) ≈ n log₂ n - 1.44n. Space requirement: Θ(n) (NOT in-place). Can be implemented without recursion (bottom-up); a sketch follows below.
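Since the slide mentions a bottom-up (non-recursive) implementation, here is a rough Python sketch of that idea (my own code, assuming plain Python lists; not taken from the slides). It repeatedly merges sorted runs of width 1, 2, 4, ... until one sorted run remains.

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

def mergesort_bottom_up(a):
    """Non-recursive mergesort: merge runs of width 1, 2, 4, ..."""
    a = list(a)                   # Θ(n) extra space, as the slide notes
    width = 1
    while width < len(a):
        for lo in range(0, len(a), 2 * width):
            a[lo:lo + 2 * width] = merge(a[lo:lo + width],
                                         a[lo + width:lo + 2 * width])
        width *= 2
    return a

print(mergesort_bottom_up([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
```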

6 Quicksort: Select a pivot (partitioning element). Partition the list into two halves: the first half holds items less than the pivot, the second half items greater than the pivot. Exchange the pivot with the last element in the first half. Sort the two sublists. (Diagram: pivot p with A[i] ≤ p to its left and A[i] > p to its right.) A sketch follows below.
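A minimal Python sketch of these steps (my own code following the description above, using the first element as the pivot and a Hoare-style scan; the variable names are mine, not from the slides):

```python
def partition(a, lo, hi):
    """Partition a[lo..hi] around pivot a[lo]; return the pivot's final index."""
    p = a[lo]                        # select the pivot (here: the first element)
    i, j = lo, hi + 1
    while True:
        i += 1
        while i < hi and a[i] < p:   # scan right past items less than the pivot
            i += 1
        j -= 1
        while a[j] > p:              # scan left past items greater than the pivot
            j -= 1
        if i >= j:
            break
        a[i], a[j] = a[j], a[i]      # move stray items to the correct side
    a[lo], a[j] = a[j], a[lo]        # exchange pivot with last element of first half
    return j

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        s = partition(a, lo, hi)
        quicksort(a, lo, s - 1)      # sort the two sublists
        quicksort(a, s + 1, hi)

data = [5, 3, 1, 9, 8, 2, 4, 7]
quicksort(data)
print(data)  # [1, 2, 3, 4, 5, 7, 8, 9]
```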

7 Efficiency of quicksort: Best case (split in the middle): Θ(n log n). Worst case (already sorted array!): Θ(n²). Average case (random arrays): Θ(n log n).

8 Efficiency of quicksort: Improvements: –better pivot selection: median-of-three partitioning avoids the worst case on sorted files –switch to insertion sort on small subfiles –elimination of recursion Together these give a 20-25% improvement (a sketch of the first two follows below). Considered the method of choice for internal sorting of large files (n ≥ 10000).
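A hedged sketch of the first two improvements (my own code, reusing the partition function from the previous sketch; the cutoff of 10 elements is an arbitrary choice, not a value given in the slides):

```python
CUTOFF = 10  # arbitrary threshold for "small subfile"

def insertion_sort(a, lo, hi):
    """Sort a[lo..hi] in place; efficient on very short ranges."""
    for i in range(lo + 1, hi + 1):
        key, j = a[i], i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def median_of_three(a, lo, hi):
    """Put the median of a[lo], a[mid], a[hi] into a[lo] to serve as the pivot."""
    mid = (lo + hi) // 2
    if a[mid] < a[lo]:
        a[mid], a[lo] = a[lo], a[mid]
    if a[hi] < a[lo]:
        a[hi], a[lo] = a[lo], a[hi]
    if a[hi] < a[mid]:
        a[hi], a[mid] = a[mid], a[hi]
    a[lo], a[mid] = a[mid], a[lo]    # median is now in the pivot position

def quicksort_improved(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if hi - lo + 1 <= CUTOFF:
        insertion_sort(a, lo, hi)    # switch to insertion sort on small subfiles
        return
    median_of_three(a, lo, hi)       # better pivot: avoids worst case on sorted input
    s = partition(a, lo, hi)         # partition from the previous sketch
    quicksort_improved(a, lo, s - 1)
    quicksort_improved(a, s + 1, hi)
```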

9 Efficiency of QuickHull algorithm: If the points are not initially sorted by x-coordinate, this can be accomplished in Θ(n log n), with no increase in the asymptotic efficiency class. Other algorithms for convex hull: –Graham's scan –DCHull, also in Θ(n log n).

10 Closest-Pair Problem: Divide and Conquer. The brute-force approach requires comparing every point with every other point: given n points, we must perform 1 + 2 + 3 + … + (n-2) + (n-1) comparisons. Brute force → O(n²), whereas the divide-and-conquer algorithm yields O(n log n). Reminder: if n = 1,000,000 then n² = 1,000,000,000,000 whereas n log n ≈ 20,000,000.
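For reference, a brute-force closest-pair sketch in Python (my own illustration; points are (x, y) tuples and the example data is made up). It performs exactly the 1 + 2 + ... + (n-1) = n(n-1)/2 distance comparisons counted above.

```python
from math import dist, inf

def closest_pair_brute_force(points):
    """Smallest distance between any two points: O(n^2) comparisons."""
    best = inf
    for i in range(len(points)):
        for j in range(i + 1, len(points)):   # point i vs. every later point
            best = min(best, dist(points[i], points[j]))
    return best

pts = [(2, 3), (12, 30), (40, 50), (5, 1), (12, 10), (3, 4)]
print(closest_pair_brute_force(pts))  # 1.414... for the pair (2, 3), (3, 4)
```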

11 Closest-Pair Algorithm Given: A set of points in 2-D

12 Closest-Pair Algorithm, Step 1: Sort the points in one dimension.

13 Closest-Pair Algorithm: Let's sort based on the x-axis; this takes O(n log n) using quicksort or mergesort. (Figure: 14 points labeled 1-14.)

14 Closest-Pair Algorithm, Step 2: Split the points, i.e., draw a vertical line at the midpoint between points 7 and 8, creating Sub-Problem 1 and Sub-Problem 2.

15 Advantage: Normally, we'd have to compare each of the 14 points with every other point: (n-1)·n/2 = 13·14/2 = 91 comparisons.

16 Advantage: Now we have two sub-problems of half the size, so we do 6·7/2 = 21 comparisons twice, which is 42 comparisons. The candidate solution is d = min(d1, d2), where d1 and d2 are the closest-pair distances found in Sub-Problem 1 and Sub-Problem 2.

17 Advantage: With just one split we cut the number of comparisons roughly in half. Obviously, we gain an even greater advantage if we split the sub-problems further.

18 Problem: However, what if the two closest points lie in different sub-problems?

19 Here is an example where we have to compare points from Sub-Problem 1 to points in Sub-Problem 2.

20 However, we only have to compare points inside a strip of width d on each side of the dividing line, where d = min(d1, d2).
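A rough Python sketch of this combine step (my own code, not from the slides; mid_x is the x-coordinate of the dividing line and d = min(d1, d2) from the two sub-problems). Only points whose x-coordinate is within d of the line are kept; each of them is compared with the few strip points just above it in y.

```python
from math import dist

def combine(points, mid_x, d):
    """Check pairs straddling the dividing line x = mid_x; return the best distance."""
    strip = sorted((p for p in points if abs(p[0] - mid_x) < d),
                   key=lambda p: p[1])            # strip points, ordered by y
    best = d
    for i, p in enumerate(strip):
        j = i + 1
        while j < len(strip) and strip[j][1] - p[1] < best:
            best = min(best, dist(p, strip[j]))   # only a handful of these per point
            j += 1
    return best
```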

21 Step 3: But we can continue the advantage by splitting the sub-problems further.

22 Step 3: In fact, we can continue to split until each sub-problem is trivial, i.e., takes only one comparison.

23 Finally: The solutions to the sub-problems are combined until the final solution is obtained.

24 Finally: On the last step the 'strip' will likely be very small, so combining the two largest sub-problems won't require much work.

25 In this example it takes 22 comparisons to find the closest pair; the brute-force algorithm would have taken 91 comparisons. But the real advantage occurs when there are millions of points.
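Putting the steps together, here is a compact recursive sketch (my own code; it reuses closest_pair_brute_force, combine, and the example pts from the earlier sketches). For simplicity the strip is re-sorted by y on every call, which gives roughly O(n log² n) rather than the fully optimized O(n log n), so treat this as a sketch of the idea rather than a definitive implementation.

```python
def closest_pair(points):
    """Divide-and-conquer closest pair; points is a list of (x, y) tuples."""
    return _closest(sorted(points))               # Step 1: sort by x-coordinate

def _closest(pts):
    if len(pts) <= 3:                             # trivial sub-problem: brute force
        return closest_pair_brute_force(pts)
    mid = len(pts) // 2                           # Step 2: split at the midpoint
    mid_x = (pts[mid - 1][0] + pts[mid][0]) / 2
    d1 = _closest(pts[:mid])                      # solve the two sub-problems
    d2 = _closest(pts[mid:])
    d = min(d1, d2)
    return combine(pts, mid_x, d)                 # Step 3: check the strip

print(closest_pair(pts))  # same answer as the brute-force version above
```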

26 Closest-Pair Problem: Divide and Conquer. Here is another animation: http://www.cs.mcgill.ca/~cs251/ClosestPair/ClosestPairApplet/ClosestPairApplet.html

