
1 Sorting Algorithms Written by J.J. Shepherd

2 Sorting Review For each of these sorting problems we are assuming ascending order, so smallest to largest. Three you've seen before: Selection, Bubble, and Insertion.

3 Selection Sort Scans through the data structure and finds the smallest element, then swaps that element with the first element. Then it looks for the next smallest and does the same. This is repeated until the end of the data structure is reached.
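
As a rough sketch of that process in code, here is a minimal Java version written for these notes (the class and method names are my own, not from the slides):

    public class SelectionSortSketch {
        // Sorts the array in ascending order (smallest to largest).
        public static void selectionSort(int[] a) {
            for (int i = 0; i < a.length - 1; i++) {
                int smallest = i;                       // assume the first untested value is the smallest
                for (int j = i + 1; j < a.length; j++) {
                    if (a[j] < a[smallest]) {
                        smallest = j;                   // found a smaller value: remember its index
                    }
                }
                int temp = a[i];                        // end of the scan: swap the stored smallest
                a[i] = a[smallest];                     // value with the value at position i
                a[smallest] = temp;
            }
        }
    }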

4 Selection Sort Look for the smallest element in the array, since the smallest value goes first.

5 Selection Sort The first value is assumed to be the smallest.

6 Selection Sort The next value is examined, and it is smaller than the value at the first index, so it is assumed to be the smallest value. Store that index.

7 Selection Sort This value is larger, so keep going.

8 Selection Sort This value is larger, so keep going.

9 Selection Sort This value is smaller, so store this index.

10 Selection Sort This value is larger, so move on.

11 Selection Sort This value is larger, so move on.

12 Selection Sort This value is smaller, so save this index.

13 Selection Sort This value is larger, so move on.

14 Selection Sort This value is larger, so move on.

15 Selection Sort Now we've reached the end, so we swap the stored smallest value with the value at the first index.

16 Selection Sort That index is complete, so we never test it again. We move on to finding the next smallest value.

17 Selection Sort It starts on the next index.

18 Selection Sort After a while we discover that this is the next smallest value.

19 Selection Sort Swap these values.

20 Selection Sort Start the process again for the next smallest value.

21 Selection Sort Eventually this is the result.

22 Example!

23 Selection Sort Theoretically, how long does this take in the worst-case scenario? Again, let's remember Big O: one function, f(x), is bounded by another, g(x), given some large constant, M.

24 Selection Sort Let's assume the data structure has n elements. Then how many times will this iteration run?

25 Selection Sort Search for the smallest element = n. Search for the next smallest = n-1. Search for the next smallest = n-2. ... The final element = 1.

26 Selection Sort If we add all of these searches together, we can say it takes roughly n² steps to sort every element. Thus O(n²).
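
Written out as a sum (my own restatement of the slide above, not something shown in the deck):

    n + (n-1) + (n-2) + \cdots + 1 = \frac{n(n+1)}{2} \le n^2 ,\qquad \text{hence } O(n^2).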

27 Bubble Sort The idea is that you keep swapping side-by-side values which are out of order until no more swaps are made; the largest values "bubble up" to the top of the data structure.
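
A minimal Java sketch of this idea (the swapped flag and the names are my own choices, not taken from the slides):

    public class BubbleSortSketch {
        // Keeps sweeping the array, swapping out-of-order neighbours,
        // until a full pass makes no swaps at all.
        public static void bubbleSort(int[] a) {
            boolean swapped = true;
            while (swapped) {
                swapped = false;
                for (int i = 0; i < a.length - 1; i++) {
                    if (a[i] > a[i + 1]) {              // left is larger than right: swap!
                        int temp = a[i];
                        a[i] = a[i + 1];
                        a[i + 1] = temp;
                        swapped = true;                 // at least one swap, so another pass is needed
                    }
                }
            }
        }
    }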

28 Bubble Sort Examine the two side-by-side elements; if the left one is larger than the right one, swap them.

29 Bubble Sort Left is larger than right. SWAP!

30 Bubble Sort Move forward.

31 Bubble Sort Left is larger than right. SWAP!

32 Bubble Sort Move forward.

33 Bubble Sort Left is larger than right. SWAP!

34 Bubble Sort Move forward.

35 Bubble Sort Left is larger than right. SWAP!

36 Bubble Sort Move forward.

37 Bubble Sort Left is larger than right. SWAP!

38 Bubble Sort Move forward.

39 Bubble Sort Left is less than right, so this pair is already in order. Move forward.

40 Bubble Sort Left is larger than right. SWAP!

41 Bubble Sort Move forward.

42 Bubble Sort Left is larger than right. SWAP!

43 Bubble Sort Move forward.

44 Bubble Sort Left is larger than right. SWAP!

45 Bubble Sort We've reached the end, but since there was at least one swap, the process has to start all over again from the beginning.

46 Example!

47 Bubble Sort Theoretically how long does Bubble Sort run in the worst case scenario? What is the worst case scenario for bubble sort?

48 Bubble Sort The worst-case scenario is that we are given a data structure of n values that are sorted… backwards. Let's examine the swaps involved in this case.

49 Bubble Sort The first iteration takes n swaps. The next takes n-1 swaps. ... Finally, 0 swaps.

50 Bubble Sort If we add all of these swaps together, we can again say it takes roughly n² steps to sort every element. Thus O(n²).

51 Can we do Better?

52 Indeed!

53 Merge Sort A divide-and-conquer algorithm that splits a data structure in half over and over again, and then finally merges the elements back together piece by piece. It is a similar concept to binary search, but applied to sorting.
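
A minimal Java sketch of this split-then-merge idea (the half-open index ranges and the temporary array are my own choices, not necessarily how the slides' version is written):

    public class MergeSortSketch {
        // Sorts the range a[lo..hi) in ascending order.
        public static void mergeSort(int[] a, int lo, int hi) {
            if (hi - lo <= 1) return;                   // a single element is already sorted
            int mid = (lo + hi) / 2;
            mergeSort(a, lo, mid);                      // split the left half until single elements remain
            mergeSort(a, mid, hi);                      // split the right half
            merge(a, lo, mid, hi);                      // merge the two sorted halves piece by piece
        }

        // Merges the already sorted ranges a[lo..mid) and a[mid..hi).
        private static void merge(int[] a, int lo, int mid, int hi) {
            int[] merged = new int[hi - lo];
            int i = lo, j = mid, k = 0;
            while (i < mid && j < hi) {                 // take the smaller front value from either half
                merged[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
            }
            while (i < mid) merged[k++] = a[i++];       // one half ran out: copy the rest of the other
            while (j < hi)  merged[k++] = a[j++];
            System.arraycopy(merged, 0, a, lo, merged.length);
        }
    }

It would be called as mergeSort(values, 0, values.length).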

54 Merge Sort Split the structure in half until single elements remain.

55 Merge Sort Split the structure in half until single elements remain.

56 Merge Sort Split the structure in half until single elements remain.

57 Merge Sort Split the structure in half until single elements remain.

58 Merge Sort Finally we have single elements, so we can start merging.

59 Merge Sort It's sort of hard to see how merging works in the first step, as it's just one comparison.

60 Merge Sort The idea of merging is that each smaller data structure is assumed to have been sorted in the previous step. In this way we do not need to re-sort those data structures, only merge them against the others.

61 Merge Sort Now we continue to merge.

62 Merge Sort Check the first two values. The smaller one is added to the new data structure and its index is moved forward. The other remains the same.

63 Merge Sort Check the indexed values. The smaller one is added to the new data structure and its index is moved forward. The other remains the same.

64 Merge Sort Check the indexed values. The smaller one is added to the new data structure and its index is moved forward. The other remains the same.

65 Merge Sort The second data structure reached its end, so the rest of the first data structure is simply added to the end.

66 Merge Sort The second data structure reached its end, so the rest of the first data structure is simply added to the end.

67 Merge Sort Similarly, let's look at the next merge.

68 Merge Sort Similarly, let's look at the next merge.

69 Merge Sort Similarly, let's look at the next merge.

70 Merge Sort Similarly, let's look at the next merge.

71 Merge Sort Similarly, let's look at the next merge.

72 Merge Sort And the next one.

73 Merge Sort And the next one.

74 Merge Sort And the next one.

75 Merge Sort And the next one.

76 Merge Sort And the next one.

77 Merge Sort And the next one.

78 Merge Sort And the next one.

79 Merge Sort And the next one.

80 Merge Sort And the next one.

81 Merge Sort Finally, the whole structure is merged back together in sorted order.

82 Example!

83 Merge Sort Theoretically, how long does merge sort take? There are essentially two steps that work in conjunction with each other: dividing the structure, and merging it back together.

84 Merge Sort We can actually visualize how long it takes: the structure of size n splits into two halves of size n/2, then quarters of size n/4, and so on down to single elements.

85 Merge Sort Dividing the structure takes lg(n) time, since the structure is halved at each level.

86 Merge Sort Merging takes n time at each level.

87 Merge Sort If we combine the dividing with the merging parts, we finally get that it takes n·lg(n) time. Thus O(n lg n).
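
The same count can be written as the usual recurrence (my own restatement, not shown in the deck): each call splits the structure in half and then does about n work merging the results back together:

    T(n) = 2\,T\!\left(\frac{n}{2}\right) + n \;\Longrightarrow\; T(n) \approx n \lg n ,\qquad \text{hence } O(n \lg n).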

88 Was All of This Really Worth it?

89 Common Big O Complexities

90 Quick Sort Quick Sort picks a pivot element and partitions the rest of the structure around it. Look for the first element to the left that is larger than the pivot.

91 Quick Sort Look for the first element to the left that is larger than the pivot.

92 Quick Sort Look for the first element to the right of the pivot that's less than the pivot.

93 Quick Sort Look for the first element to the right of the pivot that's less than the pivot.

94 Quick Sort Swap those elements!

95 Quick Sort Repeat that process. Look for one that's greater than the pivot on the left side.

96 Quick Sort Look for one that is less than the pivot on the right side.

97 Quick Sort Swap!

98 Quick Sort Continue on.

99 Quick Sort Continue on.

100 Quick Sort Continue on.

101 Quick Sort Swap!

102 Quick Sort Now, since i = j, we need to split the data structure and put the pivot in the center.

103 Quick Sort Now we repeat the same process for the smaller structures.
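
A minimal Java sketch of the whole partition-and-recurse process (this version fixes the pivot as the first element of the range and scans with indices i and j; treat it as one common variant written for these notes, not necessarily the exact code behind the slides):

    public class QuickSortSketch {
        // Sorts a[lo..hi] (inclusive) in ascending order.
        public static void quickSort(int[] a, int lo, int hi) {
            if (lo >= hi) return;
            int p = partition(a, lo, hi);               // the pivot ends up at its final position p
            quickSort(a, lo, p - 1);                    // repeat the process for the smaller left part
            quickSort(a, p + 1, hi);                    // ... and the smaller right part
        }

        private static int partition(int[] a, int lo, int hi) {
            int pivot = a[lo];                          // fixed pivot: the first element of the range
            int i = lo + 1, j = hi;
            while (true) {
                while (i <= j && a[i] <= pivot) i++;    // from the left: first value larger than the pivot
                while (j >= i && a[j] >= pivot) j--;    // from the right: first value smaller than the pivot
                if (i > j) break;                       // the scans have met or crossed
                int temp = a[i]; a[i] = a[j]; a[j] = temp;   // swap the out-of-place pair
            }
            int temp = a[lo]; a[lo] = a[j]; a[j] = temp;     // put the pivot between the two parts
            return j;
        }
    }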

104 Example!

105 Quick Sort How long does this take theoretically?
What is its worst case scenario?

106 Quick Sort Strangely enough, its worst-case scenario is an already sorted array. In this one unique case the pivot is selected every time and is swapped in and out of place n times for a data structure of size n, so technically it is O(n²).

107 Quick Sort However, since this is a rare case, and assuming the pivot is randomly chosen and not fixed, the average case becomes Θ(n lg n).
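
One way to get a randomly chosen pivot is to swap a random element into the pivot position before partitioning. The helper below is a hypothetical addition to the earlier sketch, not something from the slides:

    import java.util.Random;

    public class RandomPivot {
        private static final Random RAND = new Random();

        // Swap a randomly chosen element of a[lo..hi] into position lo, so that an
        // already sorted input no longer triggers the fixed-pivot worst case.
        static void randomizePivot(int[] a, int lo, int hi) {
            int r = lo + RAND.nextInt(hi - lo + 1);     // random index in [lo, hi]
            int temp = a[lo];
            a[lo] = a[r];
            a[r] = temp;
        }
    }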

108 Wrapping up Asymptotics Big O (O) is the worst case. Big Omega (Ω) is the best-case scenario. Big Theta (Θ) is the average-case scenario.

109 Formal Definitions Big O for f(n) = O(g(n)) means there are positive constants c and k, such that 0 ≤ f(n) ≤ cg(n) for all n ≥ k. The values of c and k must be fixed for the function f and must not depend on n. 

111 Formal Definitions Big Omega for f(n) = Ω (g(n)) means there are positive constants c and k, such that 0 ≤ cg(n) ≤ f(n) for all n ≥ k. The values of c and k must be fixed for the function f and must not depend on n.

112 Formal Definitions Big Theta for f(n) = Θ(g(n)) means there are positive constants c1, c2, and k, such that 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ k. The values of c1, c2, and k must be fixed for the function f and must not depend on n. That is, it sits in between Big O and Big Omega.
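
For comparison, the three definitions written side by side (just a compact restatement of the slides above):

    \begin{aligned}
    f(n) = O(g(n))      &\iff \exists\, c, k > 0 :\; 0 \le f(n) \le c\,g(n) \text{ for all } n \ge k \\
    f(n) = \Omega(g(n)) &\iff \exists\, c, k > 0 :\; 0 \le c\,g(n) \le f(n) \text{ for all } n \ge k \\
    f(n) = \Theta(g(n)) &\iff \exists\, c_1, c_2, k > 0 :\; 0 \le c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge k
    \end{aligned}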

