
1 1 Divide-and-Conquer Approach Lecture 05 Asst. Prof. Dr. Bunyarit Uyyanonvara IT Program, Image and Vision Computing Lab. School of Information and Computer Technology Sirindhorn International Institute of Technology Thammasat University http://www.siit.tu.ac.th/bunyarit bunyarit@siit.tu.ac.th 02 5013505 X 2005 ITS033 – Programming & Algorithms

2 2 ITS033 Topic 01 - Problems & Algorithmic Problem Solving Topic 02 - Algorithm Representation & Efficiency Analysis Topic 03 - State Space of a Problem Topic 04 - Brute Force Algorithm Topic 05 - Divide and Conquer Topic 06 - Decrease and Conquer Topic 07 - Dynamic Programming Topic 08 - Transform and Conquer Topic 09 - Graph Algorithms Topic 10 - Minimum Spanning Tree Topic 11 - Shortest Path Problem Topic 12 - Coping with the Limitations of Algorithm Power http://www.siit.tu.ac.th/bunyarit/its033.php and http://www.vcharkarn.com/vlesson/showlesson.php?lessonid=7

3 3 This Week Overview Divide & Conquer Mergesort Quicksort Binary search Closest Pair by Divide and Conquer

4 4 Divide & Conquer: Introduction Lecture 05.0 Asst. Prof. Dr. Bunyarit Uyyanonvara ITS033 – Programming & Algorithms

5 5 Introduction Divide-and-conquer algorithms work according to the following general plan: 1. A problem’s instance is divided into several smaller instances of the same problem, ideally of about the same size. 2. The smaller instances are solved (typically recursively, though sometimes a different algorithm is employed when instances become small enough). 3. If necessary, the solutions obtained for the smaller instances are combined to get a solution to the original problem.

6 6 Divide-and-Conquer Divide-and-conquer is a general algorithm design paradigm:  Divide: divide the input data S into two disjoint subsets S1 and S2  Recur: solve the subproblems associated with S1 and S2  Conquer: combine the solutions for S1 and S2 into a solution for S. The base cases for the recursion are subproblems of size 0 or 1.

7 7 Introduction

8 8 Not every divide-and-conquer algorithm is necessarily more efficient than even a brute-force solution. An instance of size n can be divided into several instances of size n/b, with a of them needing to be solved. (Here, a and b are constants; a ≥ 1 and b > 1.) Assuming that size n is a power of b, to simplify our analysis, we get the following recurrence for the running time T(n): T(n) = aT(n/b) + f(n), (4.1) where f(n) is a function that accounts for the time spent on dividing the problem into smaller ones and on combining their solutions.

9 9 Introduction Recurrence (4.1) is called the general divide-and-conquer recurrence. The order of growth of its solution T(n) depends on the values of the constants a and b and the order of growth of the function f(n).

10 10 Introduction Master Theorem: if f(n) ∈ Θ(n^d) with d ≥ 0 in recurrence (4.1), then T(n) ∈ Θ(n^d) if a < b^d, T(n) ∈ Θ(n^d log n) if a = b^d, and T(n) ∈ Θ(n^(log_b a)) if a > b^d.

11 11 Introduction For example, the recurrence equation for the number of additions A(n) made by the divide-and-conquer summation algorithm on inputs of size n = 2^k is A(n) = 2A(n/2) + 1. Thus, for this example, a = 2, b = 2, and d = 0; hence, since a > b^d, A(n) ∈ Θ(n^(log_b a)) = Θ(n^(log_2 2)) = Θ(n).
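To make the recurrence concrete, here is a small Python sketch of the divide-and-conquer summation; the function name dc_sum and the returned addition counter are illustrative additions, not the lecture's own code:

```python
def dc_sum(a):
    """Divide-and-conquer summation: split the list in half, sum each half
    recursively, and combine the two partial sums with one addition."""
    if len(a) == 1:
        return a[0], 0                    # base case: no additions needed
    mid = len(a) // 2
    left_sum, left_adds = dc_sum(a[:mid])
    right_sum, right_adds = dc_sum(a[mid:])
    # one extra addition to combine, matching A(n) = 2A(n/2) + 1
    return left_sum + right_sum, left_adds + right_adds + 1

total, adds = dc_sum(list(range(8)))     # n = 8 = 2^3
# For n = 2^k the count solves to A(n) = n - 1, i.e. 7 additions here
```

Counting the combine-step additions confirms the linear solution A(n) = n - 1 ∈ Θ(n) claimed above.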

12 12 Advantages Solving difficult problems: Divide and conquer is a powerful tool for solving conceptually difficult problems; all it requires is a way of breaking the problem into sub-problems, solving the trivial cases, and combining the sub-problem solutions into a solution to the original problem. Algorithm efficiency: Moreover, divide and conquer often provides a natural way to design efficient algorithms. Parallelism: Divide-and-conquer algorithms are naturally adapted for execution on multiprocessor machines. Memory access: Divide-and-conquer algorithms naturally tend to make efficient use of memory caches. The reason is that once a sub-problem is small enough, it and all its sub-problems can, in principle, be solved within the cache, without accessing the slower main memory.

13 13 Divide & Conquer: Mergesort Lecture 05.1 Asst. Prof. Dr. Bunyarit Uyyanonvara ITS033 – Programming & Algorithms

14 14 Introduction Sorting is the process of arranging a list of items into a particular order. There must be some values on which the order is based. There are many algorithms for sorting a list of items, and these algorithms vary in efficiency.

15 15 Introduction Selection Sort => O(n^2) Bubble Sort => O(n^2)

16 16 Introduction If n = 100, both of the above algorithms run approximately 100 x 100 = 10,000 comparisons. However, if the input is divided into two halves of size n/2 = 50 and each half is sorted separately, then the total running time would be approximately (n/2)^2 + (n/2)^2 = n^2/4 + n^2/4 = 2(n^2/4) = n^2/2.

17 17 Merge Sort: the algorithm The strategy behind Merge Sort is to change the problem of sorting into the problem of merging two sorted sub-lists into one. If the two halves of the array were sorted, then merging them carefully could complete the sort of the entire list.

18 18 Merge-Sort Merge-sort on an input sequence S with n elements consists of three steps:  Divide: partition S into two sequences S1 and S2 of about n/2 elements each  Recur: recursively sort S1 and S2  Conquer: merge S1 and S2 into a unique sorted sequence

19 19 Merge-Sort Algorithm mergeSort(S) Input: sequence S with n elements Output: sequence S sorted if size of S > 1 (S1, S2) ← partition(S, n/2) mergeSort(S1) mergeSort(S2) S ← merge(S1, S2)

20 20 Merge Sort: the algorithm Merge Sort is a "recursive" algorithm because it accomplishes its task by calling itself on a smaller version of the problem (only half of the list). For example, if the array had 2 entries, Merge Sort would begin by calling itself for item 1. Since there is only one element, that sub-list is sorted, and it can go on to call itself on item 2. Since that also has only one item, it is sorted, and now Merge Sort can merge those two sub-lists into one sorted list of size two.

21 21 Merging Two Sorted Sequences Algorithm merge(A, B) //Merges two sorted arrays A and B into one sorted array S Input: sequences A and B Output: sorted sequence of A ∪ B S ← empty sequence while A is not empty and B is not empty if A[current] < B[current] copy current element of A to S and move A to the next element else copy current element of B to S and move B to the next element if A is still not empty then copy all of A to S if B is still not empty then copy all of B to S return S
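The two pseudocode routines above can be rendered together as a short Python sketch; this is an illustrative implementation following the slides' structure, not the lecture's own code:

```python
def merge(a, b):
    """Merge two sorted lists into one sorted list (the conquer step)."""
    s, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] < b[j]:
            s.append(a[i]); i += 1
        else:
            s.append(b[j]); j += 1
    s.extend(a[i:])          # copy whichever half is not yet exhausted
    s.extend(b[j:])
    return s

def merge_sort(s):
    """Divide: split in half. Recur: sort each half. Conquer: merge."""
    if len(s) <= 1:          # base case: size 0 or 1 is already sorted
        return s
    mid = len(s) // 2
    return merge(merge_sort(s[:mid]), merge_sort(s[mid:]))

merge_sort([7, 2, 9, 4, 3, 8, 6, 1])   # -> [1, 2, 3, 4, 6, 7, 8, 9]
```

Note that this version allocates new lists at every level, which is exactly the linear extra memory discussed in the analysis slides later in this lecture.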

22 22 Merge Sort: the algorithm The real problem is how to merge the two sub-lists. While it can be done in the original array, the algorithm is much simpler if it uses a separate array to hold the portion that has been merged and then copies the merged data back into the original array. The basic philosophy of the merge is to determine which sub-list starts with the smallest data, copy that item into the merged list, and move on to the next item in the sub-list.

23 23 Merging Two Sorted Sequences The conquer step of merge-sort consists of merging two sorted sequences A and B into a sorted sequence S containing the union of the elements of A and B. Merging two sorted sequences, each with n/2 elements and implemented by means of a doubly linked list (a special data structure), takes O(n) time.

24 24 Merge-Sort Tree An execution of merge-sort is depicted by a binary tree:  each node represents a recursive call of merge-sort and stores the unsorted sequence before the execution and its partition, and the sorted sequence at the end of the execution  the root is the initial call  the leaves are calls on subsequences of size 0 or 1. (Example tree: 7 2 9 4 → 2 4 7 9 at the root, with children 7 2 → 2 7 and 9 4 → 4 9, and leaves 7, 2, 9, 4.)

25 25 Execution Example (Slides 25-34 animate merge-sort on the sequence 7 2 9 4 3 8 6 1: the sequence is partitioned into 7 2 9 4 and 3 8 6 1; recursive calls partition these down to single elements, which are the base cases; successive merges then produce 2 7 and 4 9, merged into 2 4 7 9, and 1 3 6 8 on the right side; a final merge yields the sorted sequence 1 2 3 4 6 7 8 9.)

35 35 Merging Merge.  Keep track of the smallest element in each sorted half.  Insert the smaller of the two elements into the auxiliary array.  Repeat until done. (Slides 35-45 animate this on the sorted halves A G L O R and H I M S T: the auxiliary array grows A, A G, A G H, A G H I, A G H I L, A G H I L M, A G H I L M O, A G H I L M O R, at which point the first half is exhausted, then A G H I L M O R S and finally A G H I L M O R S T, at which point the second half is exhausted.)

46 46 Analysis of Merge-Sort The height h of the merge-sort tree is O(log n):  at each recursive call we divide the sequence in half. The overall amount of work done at the nodes of depth i is O(n):  we partition and merge 2^i sequences of size n/2^i  we make 2^(i+1) recursive calls. Thus, the total running time of merge-sort is O(n log n). (Depth 0: 1 sequence of size n; depth 1: 2 sequences of size n/2; depth i: 2^i sequences of size n/2^i; and so on.)

47 47 Merge Sort: the analysis MergeSort is a classic example of the techniques used to analyze recursive routines: Merge Sort is a divide-and-conquer recursive algorithm, and its running time is O(N log N).

48 48 Merge Sort: the analysis Although its running time is O(N log N), it is hardly ever used for main-memory sorts, because it consumes a lot of memory. The main problem is that merging two sorted lists requires linear extra memory, and the additional work spent copying to the temporary array and back throughout the algorithm slows down the sort. The principal shortcoming of mergesort is the linear amount of extra storage the algorithm requires. Though merging can be done in place, the resulting algorithm is quite complicated and, since it has a significantly larger multiplicative constant, the in-place mergesort is of theoretical interest only.

49 49 Divide & Conquer: Quicksort Lecture 05.2 Asst. Prof. Dr. Bunyarit Uyyanonvara ITS033 – Programming & Algorithms

50 50 Quick Sort: the algorithm Quick Sort's approach is to take Merge Sort's philosophy but eliminate the need for the merging steps. Can you see how the problem could be solved?

51 51 Quick Sort: the algorithm It makes sure that every data item in the first sub-list is less than every data item in the second sub-list. The procedure that accomplishes this is called "partitioning" the data. After the partitioning, each of the sub-lists is sorted, which will cause the entire array to be sorted.

52 52 Quick-Sort Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:  Divide: pick a right-most element x (called the pivot) and partition S into L (elements less than x), E (elements equal to x), and G (elements greater than x)  Recur: sort L and G  Conquer: join L, E, and G
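This L/E/G formulation can be sketched in a few lines of Python; the function name is illustrative, and unlike the in-place variant shown later, this version builds new lists at each level:

```python
def three_way_quicksort(s):
    """Three-way quick-sort: partition into L (< pivot), E (== pivot),
    and G (> pivot), recur on L and G, then join the three parts."""
    if len(s) <= 1:                      # base case: size 0 or 1
        return s
    pivot = s[-1]                        # right-most element as the pivot
    L = [x for x in s if x < pivot]
    E = [x for x in s if x == pivot]
    G = [x for x in s if x > pivot]
    return three_way_quicksort(L) + E + three_way_quicksort(G)

three_way_quicksort([7, 2, 9, 4, 3, 8, 6, 1])   # -> [1, 2, 3, 4, 6, 7, 8, 9]
```

Grouping all elements equal to the pivot into E means duplicates are handled in one pass and never re-sorted.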

53 53 Quick Sort Quick sort divides the inputs according to their value to achieve its partition: it partitions the inputs into a part whose values are greater than the pivot and a part whose values are smaller than the pivot.

54 54 Quick Sort ALGORITHM Quicksort(A[l..r]) //Sorts a subarray by quicksort //Input: A subarray A[l..r] of A[0..n - 1], defined by its left and right indices l and r //Output: The subarray A[l..r] sorted in nondecreasing order if l < r s ← Partition(A[l..r]) //s is a split position Quicksort(A[l..s - 1]) Quicksort(A[s + 1..r])

55 55 QuickSort: the algorithm The hard part of Quick Sort is the partitioning. The algorithm looks at the first element of the array (called the "pivot"). It puts all of the elements which are less than the pivot in the lower portion of the array and the elements higher than the pivot in the upper portion of the array. When that is complete, it can put the pivot between those sections, and Quick Sort will be able to sort the two sections separately.

56 56 Partition We partition an input sequence as follows:  we remove, in turn, each element y from S, and  we insert y into L, E, or G, depending on the result of the comparison with the pivot x. Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time. Thus, the partition step of quick-sort takes O(n) time.

57 57 Partition Procedure ALGORITHM Partition(A[l..r]) //Partitions a subarray by using its first element as a pivot //Input: A subarray A[l..r] of A[0..n - 1], defined by its left and right indices l and r (l < r) //Output: A partition of A[l..r], with the split position returned as this function's value p ← A[l] i ← l; j ← r + 1 repeat repeat i ← i + 1 until A[i] ≥ p repeat j ← j - 1 until A[j] ≤ p swap(A[i], A[j]) until i ≥ j swap(A[i], A[j]) //undo last swap when i ≥ j swap(A[l], A[j]) return j
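A Python rendering of this partition procedure together with the Quicksort routine from the previous slide; the i <= r bounds check is an added guard for the case where the pivot is the largest element, and the in-place style means no extra arrays are allocated:

```python
def partition(a, l, r):
    """Partition a[l..r] using a[l] as the pivot; returns the split index."""
    p = a[l]
    i, j = l, r + 1
    while True:
        i += 1
        while i <= r and a[i] < p:   # scan right for an element >= pivot
            i += 1
        j -= 1
        while a[j] > p:              # scan left for an element <= pivot
            j -= 1
        if i >= j:                   # pointers crossed: partition done
            break
        a[i], a[j] = a[j], a[i]
    a[l], a[j] = a[j], a[l]          # put the pivot in its final place
    return j

def quicksort(a, l=0, r=None):
    """In-place quicksort over a[l..r]; returns a for convenience."""
    if r is None:
        r = len(a) - 1
    if l < r:
        s = partition(a, l, r)       # s is the split position
        quicksort(a, l, s - 1)
        quicksort(a, s + 1, r)
    return a

quicksort([7, 2, 9, 4, 3, 8, 6, 1])   # -> [1, 2, 3, 4, 6, 7, 8, 9]
```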

58 58 Partitioning in Quicksort  How do we partition the array efficiently? choose the partition element to be the rightmost element scan from the left for a larger element scan from the right for a smaller element exchange them repeat until the pointers cross. (Slides 58-76 animate this on Q U I C K S O R T I S C O O L with L as the partition element: the left scan stops at Q and the right scan at the second C, and exchanging them gives C U I C K S O R T I S Q O O L; the next scans stop at U and the second I, and exchanging them gives C I I C K S O R T U S Q O O L; when the pointers cross, the partition element is swapped into place, giving C I I C K L O R T U S Q O O S, and the partition is complete.)

77 77 Quick Sort: the analysis Like Merge Sort, QuickSort is a divide-and-conquer recursive algorithm. QuickSort is the fastest known sorting algorithm in practice. Its average running time is O(N log N). However, it has O(N^2) worst-case performance. On average, quicksort makes only 38% more comparisons than in the best case. Moreover, its innermost loop is so efficient that it runs faster than mergesort.

78 78 Summary of Sorting Algorithms Bubble-sort: O(n^2); in-place; slow (good for small inputs). Selection-sort: O(n^2); in-place; slow (good for small inputs). Merge-sort: O(n log n); sequential data access; fast (good for huge inputs). Quick-sort: O(n log n) expected; in-place, randomized; fastest (good for large inputs).

79 79 Divide & Conquer: Binary Search Lecture 05.3 Asst. Prof. Dr. Bunyarit Uyyanonvara ITS033 – Programming & Algorithms

80 80 Binary search Binary Search is an incredibly powerful technique for searching an ordered list. The basic algorithm is to find the middle element of the list, compare it against the key, decide which half of the list must contain the key, and repeat with that half.

81 81 Binary Search It works by comparing a search key K with the array's middle element A[m]. If they match, the algorithm stops; otherwise, the same operation is repeated recursively for the first half of the array if K < A[m], and for the second half if K > A[m].

82 82 Binary search Algorithm BinarySearchOnSorted Input: an array, a key Output: location of the key 1. Sort the array (smallest to biggest) 2. Start with the middle element of the array: if it matches, then done; if the middle element > key, then search the array's 1st half; if the middle element < key, then search the array's 2nd half.

83 83 Binary Search Example Maintain an array of Items, stored in sorted order, and use binary search to FIND the Item with Key = 33. (Slides 83-93 animate the search: if the key is in the array, its index lies between left and right; compute the midpoint and check whether the matching key is in that position; since 33 < 53, the search interval is reduced to the left half; after the next midpoint check, since 33 > 25, the interval is reduced again; repeating this narrows the interval until the matching key is found and the search returns database index 4.)

94 94 Binary search Left = 0; Right = N-1; Found = false; while (not Found) and (Left <= Right) do { Mid = (Left+Right)/2; if Key == Array[Mid] then Found = true; else if Key < Array[Mid] then Right = Mid-1; else Left = Mid+1; }

95 95 Binary search Binary search is O(log2 n). It can find a key among 256 items in about 8 comparisons. It can find a key among 1,000,000 items in under 20 comparisons. It can find a key among 1,000,000,000 items in under 30 comparisons: that's 'efficiency'.
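The iterative pseudocode above, written as a self-contained Python function (the helper name is illustrative), together with a quick check of the counts claimed: each iteration halves the interval, so roughly log2(n) halvings suffice:

```python
import math

def binary_search(arr, key):
    """Iterative binary search on a sorted list; returns the index of key,
    or -1 if the key is absent."""
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == key:
            return mid
        elif key < arr[mid]:
            right = mid - 1       # key must be in the first half
        else:
            left = mid + 1        # key must be in the second half
    return -1

# Roughly log2(n) interval halvings: 8 for 256, 19 for 10^6, 29 for 10^9
for n in (256, 10**6, 10**9):
    print(n, "items ->", math.floor(math.log2(n)), "halvings")
```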

96 96 Divide & Conquer: Closest Pair Problem Lecture 05.4 Asst. Prof. Dr. Bunyarit Uyyanonvara ITS033 – Programming & Algorithms

97 97 Closest-Pair Problem by Divide-and-Conquer Let P1 = (x1, y1), ..., Pn = (xn, yn) be a set S of n points in the plane, where n, for simplicity, is a power of two. We can divide the given points into two subsets S1 and S2 of n/2 points each by drawing a vertical line x = c, so that n/2 points lie to the left of or on the line itself and n/2 points lie to the right of or on the line.

98 98 Closest-Pair Problem Following the divide-and-conquer approach, we can recursively find the closest pairs for the left subset S1 and the right subset S2. Let d1 and d2 be the smallest distances between pairs of points in S1 and S2, respectively, and let d = min{d1, d2}. Unfortunately, d is not necessarily the smallest distance between all pairs of points in S, because a closer pair of points can lie on opposite sides of the separating line. So, as a step in combining the solutions to the smaller subproblems, we need to examine such points.

99 99 Closest-Pair Problem Idea of the divide-and-conquer algorithm for the closest-pair problem.

100 100 Closest-Pair Problem Worst case example: the six points that may need to be examined for point P. The running time of this algorithm on n presorted points satisfies T(n) = 2T(n/2) + M(n). Applying the O version of the Master Theorem: T(n) ∈ O(n log n). The possible necessity to presort the input points does not change the overall efficiency class if sorting is done by an O(n log n) algorithm.

101 101 ITS033 Topic 01 - Problems & Algorithmic Problem Solving Topic 02 - Algorithm Representation & Efficiency Analysis Topic 03 - State Space of a Problem Topic 04 - Brute Force Algorithm Topic 05 - Divide and Conquer Topic 06 - Decrease and Conquer Topic 07 - Dynamic Programming Topic 08 - Transform and Conquer Topic 09 - Graph Algorithms Topic 10 - Minimum Spanning Tree Topic 11 - Shortest Path Problem Topic 12 - Coping with the Limitations of Algorithm Power http://www.siit.tu.ac.th/bunyarit/its033.php and http://www.vcharkarn.com/vlesson/showlesson.php?lessonid=7

102 102 End of Chapter 4 Thank you!

