Presentation on theme: "Instructor Neelima Gupta Introduction to some tools to designing algorithms through Sorting Iterative Divide and Conquer."— Presentation transcript:

1 Instructor Neelima Gupta ngupta@cs.du.ac.in

2 Introduction to some tools for designing algorithms through sorting: iterative and divide and conquer

3 Iterative Algorithms: Insertion Sort – an example. x_1, x_2, …, x_{i-1}, x_i, …, x_n. For i = 2 to n: insert the i-th element x_i into the partially sorted list x_1, x_2, …, x_{i-1} (at, say, the r-th position).

4 An Example: Insertion Sort. Array: 15 8 7 10 12 5 (positions 1 to 6). At iteration 1: key = 8. (The InsertionSort pseudocode of slide 10 is shown alongside on each of these slides.) Thanks Brijesh Kumar (08), MCA 2012

5 An Example: Insertion Sort. Array: 8 15 7 10 12 5. At iteration 2: key = 7. Thanks Brijesh Kumar (08), MCA 2012

6 An Example: Insertion Sort. Array: 7 8 15 10 12 5. At iteration 3: key = 10. Thanks Brijesh Kumar (08), MCA 2012

7 An Example: Insertion Sort. Array: 7 8 10 15 12 5. At iteration 4: key = 12. Thanks Brijesh Kumar (08), MCA 2012

8 An Example: Insertion Sort. Array: 7 8 10 12 15 5. At iteration 5: key = 5. Thanks Brijesh Kumar (08), MCA 2012

9 An Example: Insertion Sort. Array: 5 7 8 10 12 15. Final output. Thanks Brijesh Kumar (08), MCA 2012

10 Analysis: Insertion Sort (Thanks: MCA 2012 Dharam Deo Prasad)

InsertionSort(A, n) {
  for i = 2 to n {
    key = A[i]
    j = i - 1
    while (j > 0) and (A[j] > key) {
      A[j+1] = A[j]
      j = j - 1
    }
    A[j+1] = key
  }
}
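For readers who want to run it, here is a direct Python rendering of this pseudocode (a sketch; the 0-based indexing and the function name are my choices, not from the slides):

```python
def insertion_sort(a):
    """Sort the list a in place, mirroring the slide's pseudocode (0-based indexing)."""
    for i in range(1, len(a)):              # corresponds to the slide's i = 2 .. n
        key = a[i]
        j = i - 1
        # Shift every element greater than key one position to the right.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                      # drop key into the gap
    return a

# The array used in the example slides:
print(insertion_sort([15, 8, 7, 10, 12, 5]))   # [5, 7, 8, 10, 12, 15]
```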

11 Running Time Analysis (Thanks: MCA 2012 Dharam Deo Prasad)

Statement                              Cost (C)   Times executed (N)
InsertionSort(A, n) {
  for i = 2 to n {                     c1         n
    key = A[i]                         c2         n-1
    j = i - 1                          c3         n-1
    while (j > 0) and (A[j] > key)     c4         Σ_{i=2..n} (T_i + 1)
    {
      A[j+1] = A[j]                    c5         Σ_{i=2..n} T_i
      j = j - 1                        c6         Σ_{i=2..n} T_i
    }
    A[j+1] = key                       c7         n-1
  }
}

where T_i is the number of times the body of the while loop executes (the number of shifts) in the i-th iteration of the for loop, so the while condition itself is evaluated T_i + 1 times; c_k is the constant time required for one execution of the corresponding statement; N is the number of times the statement is executed.

12 Total time (Thanks: MCA 2012 Dharam Deo Prasad)

T(n) = (c1 + c2 + c3 + c7)·n − (c2 + c3 + c7) + Σ_{i=2..n} [ (c4 + c5 + c6)·T_i + c4 ]

13 Worst Case (Thanks: MCA 2012 Dharam Deo Prasad)

Worst case: T_i = i − 1, i.e. Σ_{i=2..n} T_i = Σ_{i=2..n} (i − 1) = n(n−1)/2.

Hence
T(n) = (c1 + c2 + c3 + c4 + c7)·n − (c2 + c3 + c4 + c7) + (c4 + c5 + c6)·n(n−1)/2
     = a·n² + b·n + c
where
a = (1/2)(c4 + c5 + c6)
b = (c1 + c2 + c3 + c4 + c7) − (1/2)(c4 + c5 + c6)
c = −(c2 + c3 + c4 + c7)

14 Best Case (Thanks: MCA 2012 Dharam Deo Prasad)

Best case (already sorted input): the while-loop body never executes, i.e. T_i = 0 for every i, and the while condition is evaluated exactly once per iteration of the for loop.

Hence
T(n) = (c1 + c2 + c3 + c4 + c7)·n − (c2 + c3 + c4 + c7)
     = a·n + b
where
a = c1 + c2 + c3 + c4 + c7
b = −(c2 + c3 + c4 + c7)

i.e. the running time is linear in n.

15 Analysis of Algorithms Before we move ahead, let us define the notion of analysis of algorithms more formally

16 Input Size. Time and space complexity are generally expressed as functions of the input size. How we characterize input size depends on the problem: for sorting, the number of input items; for multiplication, the total number of bits; for graph algorithms, the number of nodes and edges; and so on.

17 Lower Bounds. Please read the following statements carefully. Any algorithm that sorts by removing at most one inversion per comparison performs at least n(n−1)/2 comparisons in the worst case. Hence Insertion Sort is optimal within this category of algorithms.

18 Optimal? What do we mean by the term "optimal"? Answer: if an algorithm's worst-case running time matches the lower bound for the problem, i.e. it runs as fast in the worst case as any algorithm possibly can, we say that the algorithm is "optimal".

19 Inversion: a pair of elements that appear in the wrong relative order, i.e. a pair (x_i, x_j) with i < j but x_i > x_j. Example: 4, 2, 3. Number of pairs = 3C2 = 3, out of which (4,2) is out of order, i.e. an inversion; (2,3) is in order; (4,3) is an inversion. Thanks to: Dileep Jaiswal (11), MCA 2012

20 Among n elements there can be at most n(n−1)/2 inversions; this maximum is attained in the worst case (a reverse-sorted list). Thus, if an algorithm sorts by removing at most one inversion per comparison, it must do at least n(n−1)/2 comparisons in the worst case. Thanks to: Dileep Jaiswal (11), MCA 2012
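To make the counting concrete, here is a small brute-force sketch (the function name is mine, not from the slides) that counts inversions by checking every pair; on a reverse-sorted list of n elements it reports exactly n(n−1)/2:

```python
def count_inversions(a):
    """Count pairs (i, j) with i < j and a[i] > a[j], by brute force in O(n^2) time."""
    n = len(a)
    return sum(1 for i in range(n) for j in range(i + 1, n) if a[i] > a[j])

print(count_inversions([4, 2, 3]))        # 2 inversions: (4,2) and (4,3), as on slide 19
print(count_inversions([5, 4, 3, 2, 1]))  # reverse-sorted, n = 5: 5*4/2 = 10 inversions
```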

21 Insertion Sort. x_1, x_2, …, x_{k-1}, x_k, …, x_{i-1}, x_i. Suppose x_i is inserted just after x_{k-1}, i.e. at the k-th position. No. of comparisons = (i−1) − (k−1) + 1 = i − k + 1. No. of inversions removed = (i−1) − (k−1) = i − k. No. of inversions removed per comparison = (i − k)/(i − k + 1) ≤ 1. Thanks to: Dileep Jaiswal (11), MCA 2012

22 Thus Insertion sort falls in the category of comparison algorithms that remove at most one inversion per comparison. Insertion sort is optimal in this category of algorithms. Thanks to:Dileep Jaiswal (11) :MCA 2012

23 SELECTION SORT. The algorithm works as follows: find the maximum value in the array; swap it with the value in the last position; repeat the steps above for the remainder of the array. Thanks to: MCA 2012 Chhaya Nathani (9)

24 Selection Sort: An Example. For i = n to 2: a) select the maximum in a[1] … a[i]; b) swap it with a[i]. Thanks to: MCA 2012 Chhaya Nathani (9)

25 Selection Sort. x_1, x_2, …, x_k, …, x_n. Suppose the maximum is located at x_k. Swap it with x_n. Continue on the remaining elements. Thanks to: Dileep Jaiswal (11), MCA 2012

26 Selection Sort. x_1, x_2, …, x_k, …, x_{n-i+1}, …, x_n. Suppose we are at the i-th iteration and the maximum of the first n−i+1 elements is located at x_k. Swap it with x_{n-i+1}. Thanks to: Dileep Jaiswal (11), MCA 2012

27 An Example: Selection Sort. Array: 5 20 6 4 15 3 10 2. Thanks to: MCA 2012 Chhaya Nathani (9)

28 An Example: Selection Sort. Array: 5 2 6 4 15 3 10 20. Thanks to: MCA 2012 Chhaya Nathani (9)

29 An Example: Selection Sort. Array: 5 2 6 4 10 3 15 20. Thanks to: MCA 2012 Chhaya Nathani (9)

30 An Example: Selection Sort. Array: 5 2 6 4 3 10 15 20.

31 An Example: Selection Sort. Array: 5 2 3 4 6 10 15 20. Thanks to: MCA 2012 Chhaya Nathani (9)

32 An Example: Selection Sort. Array: 4 2 3 5 6 10 15 20. Thanks to: MCA 2012 Chhaya Nathani (9)

33 An Example: Selection Sort. Array: 3 2 4 5 6 10 15 20. Thanks to: MCA 2012 Chhaya Nathani (9)

34 An Example: Selection Sort. Array: 2 3 4 5 6 10 15 20. DONE!!!! Thanks to: MCA 2012 Chhaya Nathani (9)

35 SelectionSort(A, n) {
  for i = n to 2 {
    max = i
    for j = i-1 to 1 {
      if (A[j] > A[max])
        max = j
    }
    if (max != i)
      swap A[i] and A[max]
  }
}
Thanks to: MCA 2012 Chhaya Nathani (9)
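A runnable Python version of this pseudocode (a sketch; the 0-based indexing and the name selection_sort are assumptions on my part):

```python
def selection_sort(a):
    """Sort the list a in place: repeatedly move the maximum of a[0..i] to position i."""
    for i in range(len(a) - 1, 0, -1):   # corresponds to the slide's i = n .. 2
        max_idx = i
        for j in range(i):               # scan a[0..i-1] for an element larger than a[max_idx]
            if a[j] > a[max_idx]:
                max_idx = j
        if max_idx != i:
            a[i], a[max_idx] = a[max_idx], a[i]
    return a

# The array used in the example slides:
print(selection_sort([5, 20, 6, 4, 15, 3, 10, 2]))   # [2, 3, 4, 5, 6, 10, 15, 20]
```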

36 ANALYSIS: SELECTION SORT. Selecting the largest element requires scanning all n elements (this takes n − 1 comparisons) and then swapping it into the last position. Finding the next largest element requires scanning the remaining n − 1 elements, and so on. (n − 1) + (n − 2) + … + 2 + 1 = n(n − 1)/2, i.e. Θ(n²) comparisons. T(n) = Θ(n²). Thanks to: MCA 2012 Chhaya Nathani (9)

37 Selection Sort. x_1, x_2, …, x_k, …, x_{n-i+1}, …, x_n. Suppose we are at the i-th iteration and the maximum is located at x_k. No. of comparisons = (n − i + 1) − 1 = n − i. No. of inversions removed = (n − i + 1) − 1 − (k − 1) = n − i − k + 1. No. of inversions removed per comparison = (n − i − k + 1)/(n − i) ≤ 1. Thanks to: Dileep Jaiswal (11), MCA 2012

38 Thus Selection sort also falls in the category of comparison algorithms that remove at most one inversion per comparison. Selection sort is also optimal in this category of algorithms. Thanks to: Dileep Jaiswal (11), MCA 2012

39 Merge Sort (Divide and Conquer Technique): divide the list into two nearly equal halves; sort each half recursively; merge the sorted halves. Thanks to: Gaurav Gulzar (MCA-11)

40 Divide into two subsequences:
18 9 15 13 20 10 7 5
→ 18 9 15 13 | 20 10 7 5
→ 18 9 | 15 13 | 20 10 | 7 5
Thanks to Himanshu (MCA 2012, Roll No 14)

41 Next, sort the two subsequences and merge them. Thanks to Himanshu (MCA 2012, Roll No 14)

42 Sort and merge the two subsequences:
Initial subsequences: 18 9 15 13 | 20 10 7 5
After sorting each half: 9 13 15 18 | 5 7 10 20
After merging (sorted sequence): 5 7 9 10 13 15 18 20
Thanks to Himanshu (MCA 2012, Roll No 14)

43 Merging. Suppose we have two sorted lists, A: a_1, a_2, a_3, …, a_n and B: b_1, b_2, b_3, …, b_m. Compare a_1 with b_1, put the smaller one into a new array C, and advance the index of the array that contributed the smaller element. Thanks to: Gaurav Gulzar (MCA-11)

44 Let a_1 < b_1. A: a_1, a_2, a_3, …, a_n; B: b_1, b_2, b_3, …, b_m. Sorted array C: a_1. Thanks to: Gaurav Gulzar (MCA-11)
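The merging step, as a small Python sketch (the slides describe only the idea; the function name and the use of Python lists are my own choices):

```python
def merge(a, b):
    """Merge two already-sorted lists a and b into a new sorted list c."""
    c = []
    i = j = 0
    # Compare the current front elements and move the smaller one into c.
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            c.append(a[i])
            i += 1
        else:
            c.append(b[j])
            j += 1
    # One list is exhausted; copy whatever remains of the other.
    c.extend(a[i:])
    c.extend(b[j:])
    return c

# Example 1 from the slides:
print(merge([20, 40, 60, 80], [10, 30, 35, 38, 45, 50, 52]))
# [10, 20, 30, 35, 38, 40, 45, 50, 52, 60, 80]
```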

45 Example 1: A: 20 40 60 80; B: 10 30 35 38 45 50 52; sorted array C: 10 20. Thanks to: Gaurav Gulzar (MCA-11)

46 Example 1: A: 20 40 60 80; B: 10 30 35 38 45 50 52; sorted array C: 10 20 30 35. Thanks to: Gaurav Gulzar (MCA-11)

47 Example 1: A: 20 40 60 80; B: 10 30 35 38 45 50 52; sorted array C: 10 20 30 35 38. Thanks to: Gaurav Gulzar (MCA-11)

48 Example 1: A: 20 40 60 80; B: 10 30 35 38 45 50 52; sorted array C: 10 20 30 35 38 40 45. Thanks to: Gaurav Gulzar (MCA-11)

49 Example 1: A: 20 40 60 80; B: 10 30 35 38 45 50 52; sorted array C: 10 20 30 35 38 40 45 50. Thanks to: Gaurav Gulzar (MCA-11)

50 Example 1: A: 20 40 60 80; B: 10 30 35 38 45 50 52; sorted array C: 10 20 30 35 38 40 45 50 52. Thanks to: Gaurav Gulzar (MCA-11)

51 Example 1: A: 20 40 60 80; B: 10 30 35 38 45 50 52; sorted array C: 10 20 30 35 38 40 45 50 52 60 80 (the remaining elements 60 and 80 of A are copied over). Thanks to: Gaurav Gulzar (MCA-11)

52 Worst Case Analysis. If the number of elements in the first list is n and in the second list is m, then the total number of comparisons is n + m − 1, i.e. O(m + n) in the worst case. Thanks to: Gaurav Gulzar (MCA-11)

53 What is the best case? Number of Comparisons in the best case: min(n, m), i.e. Ω(min(n, m)) Thanks to:Gaurav Gulzar (MCA -11)

54 Example 2: A: 5 8; B: 10 30 35 38 45 50 52; sorted array C: 5 8. Thanks to: Gaurav Gulzar (MCA-11)

55 Example 2: A: 5 8; B: 10 30 35 38 45 50 52; sorted array C: 5 8 10 30. Thanks to: Gaurav Gulzar (MCA-11)

56 Example 2: A: 5 8; B: 10 30 35 38 45 50 52; sorted array C: 5 8 10 30 35. Thanks to: Gaurav Gulzar (MCA-11)

57 Example 2: A: 5 8; B: 10 30 35 38 45 50 52; sorted array C: 5 8 10 30 35 38. Thanks to: Gaurav Gulzar (MCA-11)

58 Example 2: A: 5 8; B: 10 30 35 38 45 50 52; sorted array C: 5 8 10 30 35 38 45. Thanks to: Gaurav Gulzar (MCA-11)

59 Example 2: A: 5 8; B: 10 30 35 38 45 50 52; sorted array C: 5 8 10 30 35 38 45 50. Thanks to: Gaurav Gulzar (MCA-11)

60 Example 2: A: 5 8; B: 10 30 35 38 45 50 52; sorted array C: 5 8 10 30 35 38 45 50 52. Thanks to: Gaurav Gulzar (MCA-11)

61 What is the best case? Arrays: total number of steps is Ω(m + n), because the remaining elements of the bigger array still have to be copied. Linked list: total number of steps is min(n, m), i.e. Ω(min(n, m)) in the best case, since the rest of the longer list can simply be linked on.

62 Analysis of Merge Sort. Since the size of both lists to be merged is n/2, in either case (arrays or linked lists) the time to merge the two lists is Θ(n). If T(n) = number of comparisons performed on an input of size n, then:
T(n) = T(n/2) + T(n/2) + Θ(n) = 2T(n/2) + cn for all n ≥ n_0
∴ T(n) = O(n log n)
Thanks to: Gaurav Gulzar (MCA-11)

63 If T(n) = number of comparisons performed on an input of size n, then:
T(n) = T(n/2) + T(n/2) + Θ(n) = 2T(n/2) + cn for all n ≥ n_0
∴ T(n) = O(n log n)
Thanks to: Gaurav Gulzar (MCA-11)
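Combining the divide step with the merge step gives the usual recursive sketch below (it reuses the merge function sketched after slide 44; copying the halves with slices is a simplification for clarity):

```python
def merge_sort(a):
    """Return a sorted copy of a: split into halves, sort each recursively, then merge."""
    if len(a) <= 1:            # a list of 0 or 1 elements is already sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    return merge(left, right)  # merge as sketched earlier

# The array used on slides 40-42:
print(merge_sort([18, 9, 15, 13, 20, 10, 7, 5]))   # [5, 7, 9, 10, 13, 15, 18, 20]
```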

64 Final Points. Merging is Θ(m + n) in the case of arrays. If we use linked lists it is Ω(min(m, n)). When the two halves each have n/2 elements, the time for merging is O(n) and Ω(n/2), i.e. Θ(n). Thanks to: Gaurav Gulzar (MCA-11)

65 Conclusion. Merge Sort = Θ(n log n). Have we beaten the lower bound? No, it just means that merge sort does not fall in the previous category of algorithms: it can remove more than one inversion per comparison. Thanks to: Gaurav Gulzar (MCA-11)

66 What are the worst-case and best-case inputs for Merge Sort?

67 Merge Sort Vs Insertion Sort What is the advantage of merge sort? What is the advantage of insertion sort?

68 Merge Sort Vs Insertion Sort contd.. Merge Sort is faster, but it does not take advantage of a list that is already partially sorted. Insertion Sort does take advantage of a partially sorted list.

69 Lower Bound. Any algorithm that sorts by comparisons only does at least Ω(n lg n) comparisons in the worst case.

70 Decision Trees. Provides an abstraction of comparison sorts. In a decision tree, each internal node represents a comparison. Example: insertion sort applied to x1, x2, x3. The root compares x1:x2; depending on the outcome, further nodes compare x2:x3 or x1:x3; each of the 3! = 6 leaves corresponds to one possible ordering (x1 < x2 < x3, x1 < x3 < x2, x2 < x1 < x3, x2 < x3 < x1, x3 < x1 < x2, x3 < x2 < x1).

71 Decision Trees. What is the minimum number of leaves in such a decision tree? The longest path in the tree gives its height and represents the worst-case scenario for the algorithm. Height of a decision tree = Ω(n log n), i.e. any comparison sort will perform Ω(n log n) comparisons in the worst case.

72 Proof: Let h denote the height of the tree. What is the maximum number of leaves of a binary tree of height h? 2^h. So 2^h ≥ number of leaves ≥ n! (where n is the number of elements; n! is a lower bound on the number of leaves, since every permutation must appear at some leaf). Therefore h ≥ log(n!) = Ω(n log n) by Stirling's approximation: n! = √(2πn)·(n/e)^n > (n/e)^n, so log(n!) > n log n − n log e = Ω(n log n).
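For reference, the Stirling step written out as standard bounds (not specific to these slides):

```latex
\[
2^{h} \;\ge\; \#\text{leaves} \;\ge\; n! \qquad\Longrightarrow\qquad h \;\ge\; \log_2 (n!) .
\]
\[
n! \;\ge\; \left(\tfrac{n}{e}\right)^{n}
\qquad\Longrightarrow\qquad
\log_2 (n!) \;\ge\; n \log_2 n - n \log_2 e \;=\; \Omega(n \log n).
\]
```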

73 Merge Sort is Optimal. Thus the time to comparison sort n elements is Ω(n lg n) in the worst case. Corollary: Merge sort is an asymptotically optimal comparison sort. Later we'll see another sorting algorithm in this category, namely heap sort, that is also optimal. We'll also see some algorithms that beat this bound; needless to say, those algorithms are not purely based on comparisons. They do something extra.

74 Quick Sort (Divide and Conquer Technique): pick a pivot element x; partition the array into two subarrays around the pivot x, such that the elements in the left subarray are less than or equal to x and the elements in the right subarray are greater than x; recursively sort the left and right subarrays. Partitioned layout: [ ≤ x | x | > x ]. Thanks to: Krishn Kant Kundan (MCA-19)

75 Quick Sort (Algorithm)
QUICKSORT(A, p, q)
  if p < q
    k = PARTITION(A, p, q)
    QUICKSORT(A, p, k-1)
    QUICKSORT(A, k+1, q)
Thanks to: Krishn Kant Kundan (MCA-19)
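A runnable sketch of this algorithm in Python. The slides do not spell out PARTITION, so the version below uses the common Lomuto scheme with the last element as pivot; the worked example on the next slides instead uses the first element as pivot, so treat this as an illustration of the same idea rather than the exact procedure shown there:

```python
def partition(a, p, q):
    """Lomuto partition of a[p..q]: uses a[q] as the pivot and returns its final index."""
    pivot = a[q]
    i = p - 1
    for j in range(p, q):
        if a[j] <= pivot:            # grow the "<= pivot" region
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[q] = a[q], a[i + 1]  # place the pivot between the two regions
    return i + 1

def quicksort(a, p=0, q=None):
    """Recursive quicksort on a[p..q], mirroring the slide's QUICKSORT structure."""
    if q is None:
        q = len(a) - 1
    if p < q:
        k = partition(a, p, q)
        quicksort(a, p, k - 1)
        quicksort(a, k + 1, q)
    return a

# The array used in the example slides:
print(quicksort([27, 14, 9, 22, 8, 41, 56, 31, 15, 53, 99, 11, 30, 24]))
```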

76 Quick Sort (Example). Suppose our array contains the following elements; the pivot is 27 (the first element), and pointers i and j scan the array during partitioning (pointer positions are omitted here):
27 14 9 22 8 41 56 31 15 53 99 11 30 24
14 27 9 22 8 41 56 31 15 53 99 11 30 24
14 9 27 22 8 41 56 31 15 53 99 11 30 24
Thanks to: Krishn Kant Kundan (MCA-19)

77 Quick Sort (Example).
14 9 22 27 8 41 56 31 15 53 99 11 30 24
14 9 22 8 27 41 56 31 15 53 99 11 30 24
14 9 22 8 27 24 56 31 15 53 99 11 30 41
Thanks to: Krishn Kant Kundan (MCA-19)

78 Quick Sort (Example).
14 9 22 8 24 27 56 31 15 53 99 11 30 41
14 9 22 8 24 27 30 31 15 53 99 11 56 41
14 9 22 8 24 27 11 31 15 53 99 30 56 41
Thanks to: Krishn Kant Kundan (MCA-19)

79 Quick Sort (Example).
14 9 22 8 24 11 27 31 15 53 99 30 56 41
14 9 22 8 24 11 27 99 15 53 31 30 56 41
14 9 22 8 24 11 27 53 15 99 31 30 56 41
Thanks to: Krishn Kant Kundan (MCA-19)

80 Quick Sort (Example).
14 9 22 8 24 11 27 15 53 99 31 30 56 41
14 9 22 8 24 11 15 27 53 99 31 30 56 41 (stop: the pivot 27 is now in its final position)
14 9 22 8 24 11 15 27 53 99 31 30 56 41 (recursive call on the left subarray, recursive call on the right subarray)
Thanks to: Krishn Kant Kundan (MCA-19)

81 The recursion partitions the left subarray (pivot 14) and the right subarray (pivot 53); pointer positions are again omitted:
14 9 22 8 24 11 15 | 27 | 53 99 31 30 56 41
9 14 22 8 24 11 15 | 27 | 53 41 31 30 56 99
9 14 15 8 24 11 22 | 27 | 41 53 31 30 56 99
9 14 11 8 24 15 22 | 27 | 41 31 53 30 56 99
9 11 14 8 24 15 22 | 27 | 41 31 30 53 56 99
9 11 8 14 24 15 22 | 27 | 41 31 30 53 56 99
9 11 8 14 24 15 22 | 27 | 41 31 30 53 56 99 (stop: pivots 14 and 53 are now in place)

82 The recursion continues on the remaining subarrays:
9 11 8 14 24 15 22 | 27 | 41 31 30 53 56 99
9 8 11 14 15 24 22 | 27 | 31 41 30 53 56 99
8 9 11 14 15 22 24 | 27 | 31 30 41 53 56 99
(stop)
Sorted array: 8 9 11 14 15 22 24 27 30 31 41 53 56 99
Thanks to: Krishn Kant Kundan (MCA-19)

83 Analyzing Quicksort. Worst case? The partition is always unbalanced. Worst-case input? An already-sorted input, if the first element is always picked as the pivot. Best case? The partition is perfectly balanced. Best-case input? ? What are the worst-case and best-case inputs when the middle element is always picked as the pivot?

84 Worst Case of Quicksort. In the worst case: T(1) = Θ(1); T(n) = T(n − 1) + Θ(n). Does the recurrence look familiar? T(n) = Θ(n²).

85 Best Case of Quicksort. In the best case: T(n) = 2T(n/2) + Θ(n). Does the recurrence look familiar? T(n) = Θ(n lg n).

86 Why does Qsort work well in practice? Suppose that partition() always produces a 9-to-1 split. This looks quite unbalanced! The recurrence is: T(n) = T(9n/10) + T(n/10) + n, which gives T(n) = Θ(n log n). Such an imbalanced partition and still Θ(n log n) time?
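A sketch of why the 9-to-1 recurrence still gives Θ(n log n): in the recursion tree, every complete level costs at most cn, the shallowest leaf is at depth about log_{10} n and the deepest at depth about log_{10/9} n, so the total is sandwiched between two multiples of n log n (constants absorbed into the Θ):

```latex
\[
T(n) \;=\; T\!\left(\tfrac{9n}{10}\right) + T\!\left(\tfrac{n}{10}\right) + cn
\]
\[
c\, n \log_{10} n \;\lesssim\; T(n) \;\lesssim\; c\, n \log_{10/9} n + O(n)
\qquad\Longrightarrow\qquad
T(n) \;=\; \Theta(n \log n).
\]
```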

87 Why does Qsort work well in practice? Intuitively, a real-life run of quicksort will produce a mix of "bad" and "good" splits. Pretend for intuition that they alternate between best-case (n/2 : n/2) and worst-case (n−1 : 1). What happens if we bad-split the root node, then good-split the resulting size (n−1) node?

88 Why does Qsort work well in practice? Intuitively, a real-life run of quicksort will produce a mix of "bad" and "good" splits. Pretend for intuition that they alternate between best-case (n/2 : n/2) and worst-case (n−1 : 1). What happens if we bad-split the root node, then good-split the resulting size (n−1) node? We end up with three subarrays of size 1, (n−1)/2, (n−1)/2. Combined cost of the two splits = n + n − 1 = 2n − 1 = O(n). No worse than if we had good-split the root node!

89 Why does Qsort work well in practice? Intuitively, the O(n) cost of a bad split (or 2 or 3 bad splits) can be absorbed into the O(n) cost of each good split. Thus the running time of alternating bad and good splits is still O(n lg n), with slightly larger constants. How can we be more rigorous? We'll do an average-case analysis of Qsort later, while studying randomized algorithms.

90 Quicksort Vs Merge Sort. Merge Sort takes O(n lg n) in the worst case. Quick Sort takes O(n²) in the worst case. So why would anybody use Qsort instead of merge sort? Because in practice Qsort is quick: the worst case doesn't happen often.

91 Up Next Linear-Time Sorting Algorithms

92 The End

