Divide and Conquer
- Merge sort and quick sort
- Binary search
- Multiplication of large integers
- Strassen’s Matrix Multiplication
- Closest pair problem
Expected Outcomes
Students should be able to
- Explain the idea and steps of divide and conquer
- Explain the ideas of mergesort and quicksort
- Analyze the time complexity of mergesort and quicksort algorithms in the best, worst and average cases
Divide and Conquer
A successful military strategy long before it became an algorithm design strategy.
“Coalition uses divide-conquer plan in Fallujah, Iraq.” By Rowan Scarborough and Bill Gertz, THE WASHINGTON TIMES
Example: Your instructor gives you a 500-question assignment today and asks you to turn it in tomorrow. What should you do?
Three Steps of the Divide and Conquer Approach
The most well-known algorithm design strategy:
- Divide the problem into two or more smaller subproblems.
- Conquer the subproblems by solving them recursively.
- Combine the solutions to the subproblems into the solution to the original problem.
A typical case of divide and conquer technique
An Example: Calculating a_0 + a_1 + … + a_(n-1)
Let us consider the problem of computing the sum of n numbers a_0 + a_1 + … + a_(n-1). If n > 1, we can divide the problem into two instances of the same problem: compute the sum of the first ⌈n/2⌉ numbers and compute the sum of the remaining ⌊n/2⌋ numbers. We can then add these two values to get the sum in question:
a_0 + … + a_(n-1) = (a_0 + … + a_(⌈n/2⌉-1)) + (a_(⌈n/2⌉) + … + a_(n-1))
It is worth keeping in mind that the divide-and-conquer technique is ideally suited for parallel computation, in which each subproblem can be solved simultaneously by its own processor.
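As a concrete illustration, here is a minimal Python sketch of the divide-and-conquer summation just described (the function name div_conquer_sum is illustrative, not from the slides):

def div_conquer_sum(a, lo, hi):
    """Sum a[lo..hi] by splitting the range into two halves."""
    if lo == hi:                                # instance of size 1: nothing to divide
        return a[lo]
    mid = (lo + hi) // 2                        # divide into two halves
    left = div_conquer_sum(a, lo, mid)          # conquer the first half
    right = div_conquer_sum(a, mid + 1, hi)     # conquer the second half
    return left + right                         # combine: one addition

For example, div_conquer_sum([3, 1, 4, 1, 5, 9, 2, 6], 0, 7) returns 31.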
An Example: Calculating a_0 + a_1 + … + a_(n-1)
More generally, an instance of size n can be divided into several instances of size n/b, with a of them needing to be solved, where a and b are constants (a ≥ 1 and b > 1). Assuming that size n is a power of b to simplify our analysis, we get the following recurrence for the running time T(n):
T(n) = aT(n/b) + f(n)
This equation is called the general divide-and-conquer recurrence, where f(n) is a function that accounts for the time spent on dividing the problem into smaller ones and on combining their solutions. The efficiency analysis of many divide-and-conquer algorithms is greatly simplified by the following theorem.
Master Theorem
If f(n) ∈ Θ(n^d) with d ≥ 0 in the general divide-and-conquer recurrence T(n) = aT(n/b) + f(n), then
- T(n) ∈ Θ(n^d) if a < b^d
- T(n) ∈ Θ(n^d log n) if a = b^d
- T(n) ∈ Θ(n^(log_b a)) if a > b^d
The same results hold with O and Ω notations, too.
For example, the recurrence equation for the number of additions A(n) made by the divide-and-conquer summation algorithm on inputs of size n = 2^k is
A(n) = 2A(n/2) + 1
Thus, for this example, a = 2, b = 2, and d = 0; hence, since a > b^d,
A(n) ∈ Θ(n^(log_b a)) = Θ(n^(log_2 2)) = Θ(n).
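As a quick sanity check (not from the slides), the recurrence A(n) = 2A(n/2) + 1 with A(1) = 0 can be evaluated directly in Python; for powers of 2 it gives A(n) = n − 1, which is indeed Θ(n):

def additions(n):
    """A(n) = 2*A(n/2) + 1 with A(1) = 0, for n a power of 2."""
    if n == 1:
        return 0
    return 2 * additions(n // 2) + 1

# Prints matching pairs A(n) and n - 1 for n = 1, 2, 4, ..., 64.
for k in range(7):
    n = 2 ** k
    print(n, additions(n), n - 1)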
Idea of Mergesort
Mergesort is a perfect example of a successful application of the divide-and-conquer technique.
- Divide: divide array A[0..n-1] into two about equal halves and make copies of each half in arrays B and C
- Conquer: if the number of elements in B and C is 1, solve them directly; otherwise sort arrays B and C recursively
- Combine: merge sorted arrays B and C into array A as follows:
  - Repeat the following until no elements remain in one of the arrays:
    - compare the first elements in the remaining unprocessed portions of arrays B and C
    - copy the smaller of the two into A, while incrementing the index indicating the unprocessed portion of that array
  - Once all elements in one of the arrays are processed, copy the remaining unprocessed elements from the other array into A.
The Mergesort Algorithm
The Merge Algorithm
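The slides’ pseudocode for Mergesort and Merge is not reproduced in this transcript. The following is a minimal Python sketch of both routines, following the divide/conquer/combine steps described above; the function and variable names are illustrative rather than the slides’ own.

def mergesort(A):
    """Sort list A in nondecreasing order."""
    if len(A) <= 1:
        return                              # a list of 0 or 1 elements is already sorted
    mid = len(A) // 2
    B = A[:mid]                             # copy of the first half
    C = A[mid:]                             # copy of the second half
    mergesort(B)                            # conquer: sort each half recursively
    mergesort(C)
    merge(B, C, A)                          # combine: merge the sorted halves back into A

def merge(B, C, A):
    """Merge sorted lists B and C into A, where len(A) == len(B) + len(C)."""
    i = j = k = 0
    while i < len(B) and j < len(C):
        if B[i] <= C[j]:                    # copy the smaller front element into A
            A[k] = B[i]
            i += 1
        else:
            A[k] = C[j]
            j += 1
        k += 1
    A[k:] = B[i:] if i < len(B) else C[j:]  # copy the rest of the non-exhausted list

For example, with A = [8, 3, 2, 9, 7, 1], calling mergesort(A) leaves A equal to [1, 2, 3, 7, 8, 9].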
Mergesort Examples
The operation of the algorithm on a sample list is illustrated in the slide's figure (not reproduced here). Try to answer the following questions:
- How many key comparisons are needed?
- How much extra memory is needed?
- Is the algorithm an in-place algorithm?
- Is the algorithm stable?
Analysis of Mergesort
Assuming for simplicity that n is a power of 2, the recurrence relation for the number of basic operations (key comparisons) is
T(n) = 2T(n/2) + Θ(n)
- Worst case: T(n) = 2T(n/2) + (n − 1)
- Best case: T(n) = 2T(n/2) + n/2
- All cases: Θ(n log n) (by the Master Theorem with a = 2, b = 2, d = 1, so a = b^d)
- Space complexity: Θ(n) extra space, so not in-place
- Stable
The Divide, Conquer and Combine Steps in Quicksort
Quicksort is another important sorting algorithm that is based on the divide-and-conquer approach. Unlike mergesort, which divides its input’s elements according to their position in the array, quicksort divides them according to their value.
- Divide: partition array A[l..r] into two subarrays A[l..s-1] and A[s+1..r] such that each element of the first subarray is ≤ A[s] and each element of the second subarray is ≥ A[s] (computing the index s is part of the partition).
  Implication: A[s] will be in its final position in the sorted array.
- Conquer: sort the two subarrays A[l..s-1] and A[s+1..r] by recursive calls to quicksort.
- Combine: no work is needed, because A[s] is already in its correct place after the partition is done, and the two subarrays have been sorted.
Quicksort
- Select a pivot p with respect to whose value we are going to divide the list (typically, p = A[l]).
- Rearrange the list so that all the elements in the first s positions are smaller than or equal to the pivot and all the elements in the remaining n−s positions are larger than or equal to the pivot:
  [A[i] ≤ p ... | p | ... A[i] ≥ p]
- Exchange the pivot with the last element in the first sublist (i.e., the ≤ sublist); the pivot is now in its final position.
- Sort the two sublists recursively using quicksort.
The Quicksort Algorithm
ALGORITHM Quicksort(A[l..r])
//Sorts a subarray by quicksort
//Input: A subarray A[l..r] of A[0..n-1], defined by its left and right indices l and r
//Output: The subarray A[l..r] sorted in nondecreasing order
if l < r
    s ← Partition(A[l..r])   // s is a split position
    Quicksort(A[l..s-1])
    Quicksort(A[s+1..r])
Partitioning Algorithm
Can ‘=’ be removed from ‘≤’ and ‘≥’?
Number of comparisons: n or n + 1
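The partition pseudocode itself is not shown in this transcript. Below is a minimal Python sketch of a two-pointer (Hoare-style) partition of the kind the n or n + 1 comparison count suggests, plus a quicksort driver; the names and the explicit bound check (used here instead of a sentinel) are illustrative choices, not the slides’ own.

def hoare_partition(A, l, r):
    """Partition A[l..r] around the pivot p = A[l]; return the split position s
    so that A[l..s-1] <= A[s] <= A[s+1..r]."""
    p = A[l]
    i, j = l, r + 1
    while True:
        i += 1
        while i <= r and A[i] < p:      # scan right for an element >= p
            i += 1
        j -= 1
        while A[j] > p:                 # scan left for an element <= p (stops at A[l] == p)
            j -= 1
        if i >= j:                      # the scans have crossed: partitioning is done
            break
        A[i], A[j] = A[j], A[i]         # exchange the two out-of-place elements
    A[l], A[j] = A[j], A[l]             # put the pivot into its final position
    return j

def quicksort(A, l, r):
    """Sort A[l..r] in place."""
    if l < r:
        s = hoare_partition(A, l, r)    # split position; A[s] is already in its final place
        quicksort(A, l, s - 1)
        quicksort(A, s + 1, r)

For example, with A = [5, 3, 1, 9, 8, 2, 4, 7], calling quicksort(A, 0, len(A) - 1) sorts A in place.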
Quicksort Example
Try to answer the following questions:
- How much extra memory is needed?
- Is the algorithm an in-place algorithm?
- Is the algorithm stable?
- How many key comparisons are needed?
Note: a sentinel element larger than every value in A[0..n-1] should be introduced, that is, A[n] = ∞. Consider why.
Another Partitioning Algorithm
Algorithm Partition(A[l..r])
p = A[l]
PivotLoc = l
for i = l+1 to r do
    if A[i] < p then
        PivotLoc = PivotLoc + 1
        if PivotLoc <> i then Swap(A[PivotLoc], A[i])
    end if
end for
Swap(A[PivotLoc], A[l])   // move pivot into its correct place
return PivotLoc
Number of comparisons: n − 1 in all cases
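This is a single-scan (Lomuto-style) partition. A minimal Python rendering of the same pseudocode, assuming 0-based indices, is shown below; it can replace hoare_partition in the quicksort driver sketched earlier.

def lomuto_partition(A, l, r):
    """Partition A[l..r] around p = A[l] with one left-to-right scan;
    return the pivot's final position (mirrors the pseudocode above)."""
    p = A[l]
    pivot_loc = l
    for i in range(l + 1, r + 1):
        if A[i] < p:                              # exactly one comparison per element
            pivot_loc += 1
            if pivot_loc != i:
                A[pivot_loc], A[i] = A[i], A[pivot_loc]
    A[pivot_loc], A[l] = A[l], A[pivot_loc]       # move the pivot into its correct place
    return pivot_loc

For example, with A = [5, 3, 8, 2, 9, 1], lomuto_partition(A, 0, 5) returns 3 and rearranges A into [1, 3, 2, 5, 9, 8]: the pivot 5 sits at index 3, with smaller elements to its left and larger ones to its right.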
Efficiency of Quicksort
Based on whether the partitioning is balanced:
- Best case: split in the middle, Θ(n log n)
  C(n) = 2C(n/2) + Θ(n)   // 2 subproblems of size n/2 each
- Worst case: an already sorted array, Θ(n²)
  C(n) = C(n−1) + Θ(n)   // 2 subproblems of size 0 and n−1, respectively
  Is there any other possible array with the worst-case number of comparisons?
- Average case: random arrays, Θ(n log n), by the second partition algorithm
Improvements:
- Better pivot selection: median-of-three partitioning
- Switch to insertion sort on small subfiles
- Elimination of recursion
These combine to a 20-25% improvement. A sketch of the first two improvements follows.
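A minimal Python sketch of median-of-three pivot selection and an insertion-sort cutoff for small subfiles (illustrative only; the cutoff value 10 is an arbitrary choice, and lomuto_partition refers to the partition sketched earlier):

def median_of_three(A, l, r):
    """Move the median of A[l], A[mid], A[r] into A[l] so it becomes the pivot."""
    m = (l + r) // 2
    if A[m] < A[l]:
        A[l], A[m] = A[m], A[l]
    if A[r] < A[l]:
        A[l], A[r] = A[r], A[l]
    if A[r] < A[m]:
        A[m], A[r] = A[r], A[m]     # now A[l] <= A[m] <= A[r]
    A[l], A[m] = A[m], A[l]         # the median A[m] becomes the pivot at A[l]

def insertion_sort(A, l, r):
    """Sort the small subfile A[l..r] by straight insertion."""
    for i in range(l + 1, r + 1):
        v, j = A[i], i - 1
        while j >= l and A[j] > v:
            A[j + 1] = A[j]
            j -= 1
        A[j + 1] = v

def improved_quicksort(A, l, r, cutoff=10):
    """Quicksort with median-of-three pivots and an insertion-sort cutoff."""
    if r - l + 1 <= cutoff:
        insertion_sort(A, l, r)     # small subfiles are cheaper to finish by insertion sort
        return
    median_of_three(A, l, r)
    s = lomuto_partition(A, l, r)   # either partition sketched above works here
    improved_quicksort(A, l, s - 1, cutoff)
    improved_quicksort(A, s + 1, r, cutoff)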