
Slide 1: Sorting (CS 105). See Chapter 14 of the Horstmann text.

Slide 2: The Sorting problem
Input: a collection S of n elements that can be ordered
Output: the same collection of elements arranged in increasing (or non-decreasing) order
Typically, S would be stored in an array, and the problem is to rearrange the elements in that array. For now, let's assume we are sorting a collection of integers.

Slide 3: Example
Input:  53 10 2 62 128 93 28 18
Output: 2 10 18 28 53 62 93 128

Slide 4: Some sorting algorithms
O(n²): Insertion sort, Selection sort, Bubble sort
O(n log n): Quick sort, Merge sort, Heap sort
O(n): Bucket sort, Radix sort
The last three algorithms will be discussed later this semester. Selection sort and Merge sort are discussed in the Horstmann text.

Slide 5: Insertion sort
Strategy: treat each s[i] as an incoming element that you will insert into the already sorted sequence s[0], s[1], …, s[i-1]
Requires locating the proper position of the incoming element and shifting elements to the right
Best case: the array is already sorted, so no "insertions" are carried out -> O(n)
Worst case: the array is in decreasing order; incoming elements are always inserted at the beginning -> O(n²)

Slide 6: Insertion sort
for i ← 1 to n-1 do
    temp ← s[i]                 // incoming element
    j ← i
    // shift larger elements to the right
    while ( j > 0 && s[j-1] > temp )
        s[j] = s[j-1]
        j--
    s[j] = temp                 // insert incoming element
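The pseudocode above can be written out as a short runnable sketch. The slides use pseudocode only, so this Python version (and the name `insertion_sort`) is illustrative, not from the original deck:

```python
def insertion_sort(s):
    """In-place insertion sort of a list s, following the slide's pseudocode."""
    for i in range(1, len(s)):
        temp = s[i]                        # incoming element
        j = i
        # shift elements larger than temp one slot to the right
        while j > 0 and s[j - 1] > temp:
            s[j] = s[j - 1]
            j -= 1
        s[j] = temp                        # insert incoming element
    return s

print(insertion_sort([53, 10, 2, 62, 128, 93, 28, 18]))
# prints [2, 10, 18, 28, 53, 62, 93, 128]
```

Note how an already-sorted input never enters the inner while loop, which is exactly the O(n) best case from Slide 5.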

Slide 7: Insertion sort example (image taken from Shaffer, 2001)

Slide 8: Selection sort
Strategy: locate the minimum element and place it at the first position, then locate the next minimum and place it at the second position, and so on
Requires a scan ( O(n) ) for each of the n elements -> O(n²) in both the best and worst case
Variation: can repeatedly select the maximum instead and place it at the last position

Slide 9: Selection sort
for i ← 0 to n-2 do
    lowIndex ← i                  // determine
    for j ← i+1 to n-1 do         // minimum
        if ( s[j] < s[lowIndex] )
            lowIndex ← j
    swap( s[i], s[lowIndex] )     // place minimum in proper place
Why not n-1?
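As a runnable sketch of the slide's pseudocode (Python and the name `selection_sort` are mine, for illustration):

```python
def selection_sort(s):
    """In-place selection sort: repeatedly place the minimum at the front."""
    n = len(s)
    for i in range(n - 1):                 # stopping at n-2 answers the slide's
        low_index = i                      # question: the last element falls
        for j in range(i + 1, n):          # into place automatically
            if s[j] < s[low_index]:
                low_index = j              # remember position of the minimum
        s[i], s[low_index] = s[low_index], s[i]   # place minimum in proper place
    return s

print(selection_sort([53, 10, 2, 62, 128, 93, 28, 18]))
# prints [2, 10, 18, 28, 53, 62, 93, 128]
```

The inner scan always runs to the end of the list, which is why best and worst case are both O(n²).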

Slide 10: Selection sort example (image taken from Shaffer, 2001)

Slide 11: Selection sort variation
Repeatedly select the maximum instead:
for i ← n-1 downto 1 do
    highIndex ← i                 // determine
    for j ← 0 to i-1 do           // maximum
        if ( s[j] > s[highIndex] )
            highIndex ← j
    swap( s[i], s[highIndex] )    // place maximum in proper place

Slide 12: Bubble sort
Essentially selection sort, but the sort is carried out by swapping adjacent elements only
Minimum elements are repeatedly "bubbled up", and maximum elements are repeatedly "bubbled down", the array
O(n²) because of the comparisons (actual swaps are carried out only when elements are out of place)

Slide 13: Bubble sort
for i ← n-1 downto 1 do
    for j ← 0 to i-1 do
        if ( s[j] > s[j+1] )
            swap( s[j], s[j+1] )
The inner loop repeatedly positions the maximum element; each pass of the outer loop puts the i-th element in its proper place.
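A runnable sketch of this pseudocode (Python is my choice; the slides themselves give only pseudocode):

```python
def bubble_sort(s):
    """In-place bubble sort: each outer pass bubbles the maximum of
    s[0..i] down to position i by swapping adjacent elements only."""
    n = len(s)
    for i in range(n - 1, 0, -1):          # i is where the next maximum lands
        for j in range(i):
            if s[j] > s[j + 1]:            # out of place: swap neighbors
                s[j], s[j + 1] = s[j + 1], s[j]
    return s

print(bubble_sort([53, 10, 2, 62, 128, 93, 28, 18]))
# prints [2, 10, 18, 28, 53, 62, 93, 128]
```

All n(n-1)/2 comparisons happen regardless of the input, matching the O(n²) cost on the slide.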

Slide 14: Exercise: Bubble sort
Perform a trace for this array: 53 10 2 62 128 93 28 18

Slide 15: Bubble sort variation
for i ← 0 to n-2 do
    for j ← n-1 downto i+1 do
        if ( s[j] < s[j-1] )
            swap( s[j], s[j-1] )
The inner loop repeatedly positions the MINIMUM element; each pass of the outer loop puts the i-th element in its proper place.

Slide 16: Time complexity summary

Algorithm       | Best case | Worst case
Insertion sort  | O(n)      | O(n²)
Selection sort  | O(n²)     | O(n²)
Bubble sort     | O(n²)     | O(n²)

Slide 17: Improved sorting strategy: Divide-and-Conquer
Given the collection of n elements to sort, perform the sort in three steps:
Divide step: split the collection S into two subsets, S1 and S2
Recursion step: sort S1 and S2 separately
Conquer step: combine the two lists into one sorted list

Slide 18: Quick sort and Merge sort
Two algorithms adopt this divide-and-conquer strategy:
Quick sort: work is carried out in the divide step using a pivot element; the conquer step is trivial
Merge sort: the divide step is trivial (just split the list into two equal parts); work is carried out in the conquer step by merging two sorted lists

Slide 19: Quick sort: divide step
In the divide step, select a pivot from the array (say, the last element)
Split the list/array S using the pivot: S1 consists of all elements less than the pivot, and S2 consists of all elements greater than the pivot
Example (pivot 50): 85 24 63 45 17 31 96 50 becomes 24 45 17 31 85 63 96 50, where S1 = 24 45 17 31 and S2 = 85 63 96

Slide 20: Quick sort: conquer step
After sorting S1 and S2, combine the sorted lists so that S1 is on the left, the pivot is in the middle, and S2 is on the right
Example: S1-sorted = 17 24 31 45 and S2-sorted = 63 85 96, with pivot 50, combine into 17 24 31 45 50 63 85 96

Slide 21: Quick sort with recur step
Divide: 85 24 63 45 17 31 96 50 -> S1 = 24 45 17 31, pivot = 50, S2 = 85 63 96
Recur: sort S1 -> 17 24 31 45; sort S2 -> 63 85 96
Conquer: 17 24 31 45 | 50 | 63 85 96 -> 17 24 31 45 50 63 85 96

Slide 22: Implementing quick sort
It is preferable if we can perform quick sort in place; i.e., we sort by swapping elements in the array, perhaps using some temporary variables
Plan: algorithm QSort( S, a, b ) sorts the sublist in the array S from index a to index b
QSort( L, 0, n-1 ) will sort an array L of length n
Within the QSort( S, a, b ) algorithm, there will be recursive calls to QSort on smaller ranges within the range a…b

Slide 23: Algorithm QSort
Algorithm QSort( S, a, b )
    if ( a < b )
        p ← S[b]
        rearrange S so that:
            S[a]…S[x-1] are elements < p
            S[x] = p
            S[x+1]…S[b] are elements > p
        QSort( S, a, x-1 )
        QSort( S, x+1, b )
Base case: the a…b range contains 0 or 1 element

Slide 24: Rearranging a sublist in S
p ← S[b], l ← a, r ← b - 1
while l <= r do
    // find an element larger than the pivot
    while l <= r and S[l] <= p do
        l ← l + 1
    // find an element smaller than the pivot
    while r >= l and S[r] >= p do
        r ← r - 1
    if l < r then
        swap( S[l], S[r] )     // swap the two elements
swap( S[l], S[b] )             // place pivot in proper place
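Putting Slides 23 and 24 together gives a runnable in-place quick sort. This Python sketch follows the slides' pseudocode; the names `rearrange` and `qsort` mirror the slides, but the code itself is illustrative:

```python
def rearrange(s, a, b):
    """Partition s[a..b] around the pivot p = s[b]; return the pivot's final index."""
    p = s[b]
    l, r = a, b - 1
    while l <= r:
        while l <= r and s[l] <= p:        # find an element larger than the pivot
            l += 1
        while r >= l and s[r] >= p:        # find an element smaller than the pivot
            r -= 1
        if l < r:
            s[l], s[r] = s[r], s[l]        # swap the two elements
    s[l], s[b] = s[b], s[l]                # place pivot in its proper place
    return l

def qsort(s, a, b):
    """Sort the sublist s[a..b] in place."""
    if a < b:                              # base case: range has 0 or 1 element
        x = rearrange(s, a, b)
        qsort(s, a, x - 1)
        qsort(s, x + 1, b)

data = [85, 24, 63, 45, 17, 31, 96, 50]
qsort(data, 0, len(data) - 1)
print(data)
# prints [17, 24, 31, 45, 50, 63, 85, 96]
```

On the slide's example, the first call to `rearrange` with pivot 50 leaves {31, 24, 17, 45} to its left and {85, 96, 63} to its right before recursing on each side.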

Slide 25: Time complexity of quick sort
First, note that rearranging a sublist takes O(n) time, where n is the length of the sublist
Requires scanning the list from both ends until the l and r pointers meet
O(n) even though loops are nested within a loop
Rearranging sublists is all that the quick sort algorithm does
Need to find out how often the sort performs the rearrange operation

Slide 26: Time complexity of quick sort
Suppose the pivots always split the lists into two lists of roughly equal size:
1 list of length n
2 lists of length n/2
4 lists of length n/4
…
n lists of length 1

Slide 27: Time complexity of quick sort
Each level takes O(n) time for sublist rearranging
Assuming an even split caused by each pivot, there will be around log n levels
Therefore, quick sort takes O(n log n) time
But…

Slide 28: Time complexity of quick sort
In the worst case, the pivot might split the lists such that there is only 1 element in one partition (not an even split)
There will be n levels
Each level requires O(n) time for sublist rearranging
Quick sort takes O(n²) time in the worst case

Slide 29: Merge sort
Another sorting algorithm using the divide-and-conquer paradigm
This time, the hard work is carried out in the conquer phase instead of the divide phase
Divide: split the list S[0..n-1] by taking the middle index m ( = (0 + n-1) / 2 )
Recursion: recursively sort S[0..m] and S[m+1..n-1]
Conquer: merge the two sorted lists (how?)

Slide 30: Merge sort
Divide: 85 24 63 45 17 31 96 50 -> S1 = 85 24 63 45, S2 = 17 31 96 50
Recur: sort S1 -> 24 45 63 85; sort S2 -> 17 31 50 96
Conquer: merge into 17 24 31 45 50 63 85 96

Slide 31: Merging two sorted lists
Requires inspecting the "head elements" of both lists
Whichever element is smaller goes first, and the head element for the "winning" list is updated so that it refers to the next element in that list
Repeat the process until all the elements in both lists have been processed

Slide 32: Merge sort time complexity
The divide step ensures that the sublist split is done evenly -> O(log n) levels
The conquer/merge step takes O(n) time per level
Time complexity is O(n log n), guaranteed
Disadvantage: it is hard to carry out the merge step in place; a temporary array/list is necessary if we want a simple implementation
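Slides 29 and 31 can be combined into a short runnable sketch. This Python version (names `merge` and `merge_sort` are mine) deliberately builds temporary lists, illustrating the disadvantage noted above:

```python
def merge(left, right):
    """Merge two sorted lists by repeatedly taking the smaller head element."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:            # smaller head goes first
            out.append(left[i]); i += 1    # advance the "winning" list's head
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])                   # one list may still have elements left
    out.extend(right[j:])
    return out

def merge_sort(s):
    """Divide at the middle, sort each half recursively, then merge."""
    if len(s) <= 1:                        # base case: already sorted
        return s
    m = len(s) // 2
    return merge(merge_sort(s[:m]), merge_sort(s[m:]))

print(merge_sort([85, 24, 63, 45, 17, 31, 96, 50]))
# prints [17, 24, 31, 45, 50, 63, 85, 96]
```

Note that the slices and the `out` list are the temporary storage the slide warns about; an in-place merge is possible but considerably trickier.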

Slide 33: Time complexity summary

Algorithm   | Best case   | Worst case
Quick sort  | O(n log n)  | O(n²)
Merge sort  | O(n log n)  | O(n log n)

Slide 34: Summary and final points
O(n²) algorithms (insertion, selection, and bubble sort) are easy to code but are inefficient
Quick sort has an O(n²) worst case but works very well in practice; O(n log n) on average
Merge sort is difficult to implement in place, but its O(n log n) complexity is guaranteed; note, though, that it tends not to perform as well as quick sort in practice
Later this semester:
A guaranteed O(n log n) algorithm called Heap sort that is a reasonable alternative to quick sort
O(n) sorting algorithms (with some restrictions on input)

