
1 CSE 326: Data Structures Sorting
Ben Lerner Summer 2007

2 Sorting: The Big Picture
Given n comparable elements in an array, sort them in increasing (or decreasing) order.
Simple algorithms, O(n²): Insertion sort, Selection sort, Bubble sort, Shell sort
Fancier algorithms, O(n log n): Heap sort, Merge sort, Quick sort
Comparison lower bound: Ω(n log n)
Specialized algorithms, O(n): Bucket sort, Radix sort
Handling huge data sets: External sorting

3 Insertion Sort: Idea
At the kth step, put the kth input element in the correct place among the first k elements. Result: after the kth step, the first k elements are sorted.
Runtime:
worst case: O(n²) (reverse-sorted input)
best case: O(n) (already-sorted input)
average case: O(n²) (about k/2 moves in the kth step)
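A minimal C++ sketch of this idea (my own code for illustration, not the course's reference implementation):

    #include <cstddef>
    #include <vector>

    // Insertion sort: at the kth step, insert a[k] into its correct
    // place among the already-sorted first k elements.
    void insertionSort(std::vector<int>& a) {
        for (std::size_t k = 1; k < a.size(); ++k) {
            int key = a[k];
            std::size_t i = k;
            // Shift larger elements one slot to the right.
            while (i > 0 && a[i - 1] > key) {
                a[i] = a[i - 1];
                --i;
            }
            a[i] = key;  // drop the element into its correct place
        }
    }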

4 Example

5 Example

6 Try it out: Insertion sort

7 Selection Sort: Idea
Find the smallest element, put it 1st. Find the next smallest element, put it 2nd. Find the next smallest, put it 3rd. And so on…

8 Selection Sort: Code
void SelectionSort (Array a[0..n-1]) {
  for (i = 0; i < n; ++i) {
    j = index of smallest entry in a[i..n-1]
    Swap(a[i], a[j])
  }
}
Runtime: worst, best, and average case are all O(n²), since selecting the smallest remaining entry takes linear time at every step.
What about Heap sort?
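A runnable C++ rendering of that pseudocode (a sketch; the names are mine):

    #include <cstddef>
    #include <utility>   // std::swap
    #include <vector>

    // Selection sort: repeatedly select the smallest remaining entry
    // and swap it into the next position of the sorted prefix.
    void selectionSort(std::vector<int>& a) {
        for (std::size_t i = 0; i + 1 < a.size(); ++i) {
            std::size_t j = i;  // index of the smallest entry in a[i..n-1]
            for (std::size_t k = i + 1; k < a.size(); ++k) {
                if (a[k] < a[j]) j = k;
            }
            std::swap(a[i], a[j]);
        }
    }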

9 Try it out: Selection sort
Insert 31, 16, 54, 4, 2, 17, 6

10 HeapSort: Using Priority Queue ADT (heap)
Shove all elements into a priority queue, then take them out smallest to largest.
BuildHeap = O(n); Insert = O(log n) (average O(1)); DeleteMin = O(log n).
Runtime: worst and average case O(n log n).
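A short C++ sketch of this approach using std::priority_queue as the heap (the STL container is my choice here; the slide only assumes a priority queue ADT):

    #include <cstddef>
    #include <functional>  // std::greater
    #include <queue>
    #include <vector>

    // Heap sort via a priority queue: build the heap from all elements,
    // then repeatedly DeleteMin -- O(n log n) overall.
    void heapSort(std::vector<int>& a) {
        // Min-heap, so elements come out smallest to largest.
        std::priority_queue<int, std::vector<int>, std::greater<int>>
            pq(a.begin(), a.end());
        for (std::size_t i = 0; i < a.size(); ++i) {
            a[i] = pq.top();
            pq.pop();
        }
    }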

11 Merge Sort
MergeSort (Array[1..n]):
1. Split the array in half
2. Recursively sort each half
3. Merge the two halves together

Merge (a1[1..n], a2[1..n]), "the 2-pointer method":
i1 = 1, i2 = 1
while (i1 <= n && i2 <= n) {
  if (a1[i1] < a2[i2]) { next output is a1[i1]; i1++ }
  else { next output is a2[i2]; i2++ }
}
Now throw in the dregs: copy over whatever remains in the half that is not yet exhausted.
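A C++ sketch of the three steps and the 2-pointer merge (my own helper names, using half-open index ranges rather than the slide's 1-based arrays):

    #include <cstddef>
    #include <vector>

    // Merge the sorted halves a[lo..mid) and a[mid..hi) with two pointers.
    void merge(std::vector<int>& a, std::size_t lo, std::size_t mid, std::size_t hi) {
        std::vector<int> out;
        out.reserve(hi - lo);
        std::size_t i1 = lo, i2 = mid;
        while (i1 < mid && i2 < hi)
            out.push_back(a[i1] < a[i2] ? a[i1++] : a[i2++]);
        // "Throw in the dregs": whatever is left in either half.
        while (i1 < mid) out.push_back(a[i1++]);
        while (i2 < hi)  out.push_back(a[i2++]);
        for (std::size_t k = 0; k < out.size(); ++k) a[lo + k] = out[k];
    }

    // 1. Split in half  2. Recursively sort each half  3. Merge them.
    void mergeSort(std::vector<int>& a, std::size_t lo, std::size_t hi) {
        if (hi - lo < 2) return;            // 0 or 1 elements: already sorted
        std::size_t mid = lo + (hi - lo) / 2;
        mergeSort(a, lo, mid);
        mergeSort(a, mid, hi);
        merge(a, lo, mid, hi);
    }

    // Usage: mergeSort(a, 0, a.size());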

12 Mergesort example
Divide 8 2 9 4 5 3 1 6 down into 1-element arrays, then merge pairs back together level by level. Final: 1 2 3 4 5 6 8 9.

13 Try it out: Merge sort Insert 31, 16, 54, 4, 2, 17, 6

14 Merge Sort: Complexity
Base case: T(1) = c
T(n) = 2 T(n/2) + n
     = 2 (2 T(n/4) + n/2) + n = 4 T(n/4) + 2n
     = 4 (2 T(n/8) + n/4) + 2n = 8 T(n/8) + 3n
     = ...
     = 2^k T(n/2^k) + kn
We want n/2^k = 1, i.e. n = 2^k, so k = log n:
     = n T(1) + n log n
So T(n) = O(n log n) (best and worst case).

15 The steps of QuickSort
Given S = {81, 31, 57, 43, 13, 75, 92, 65, 26}:
1. Select a pivot value.
2. Partition S into S1 (elements less than the pivot) and S2 (elements greater than the pivot).
3. QuickSort(S1) and QuickSort(S2).
Presto! S is sorted: 13 26 31 43 57 65 75 81 92. [Weiss]
Details: how do we pick a good pivot?

16 QuickSort Example
Array: 8 1 4 9 3 5 2 7 6 (pointers i and j).
Choose the pivot as the median of three. Place the pivot and the largest at the right and the smallest at the left.
(Show how to select a pivot and how to do the partition.)

17 QuickSort Example
1 4 9 7 3 5 2 6 8 (pointers i and j).
Move i to the right until it reaches an element larger than the pivot. Move j to the left until it reaches an element smaller than the pivot. Swap them.

18 QuickSort Example
1 4 2 5 3 7 9 6 8 (pointers i and j): S1 (elements < pivot), the pivot, S2 (elements > pivot).

19 Recursive Quicksort
Quicksort(A[]: integer array, left, right : integer): {
  pivotindex : integer;
  if left + CUTOFF <= right then
    pivot := median3(A, left, right);
    pivotindex := Partition(A, left, right-1, pivot);
    Quicksort(A, left, pivotindex - 1);
    Quicksort(A, pivotindex + 1, right);
  else
    Insertionsort(A, left, right);
}
Don't use quicksort for small arrays; CUTOFF = 10 is reasonable.
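A self-contained C++ version of that routine (a sketch following Weiss-style median-of-three partitioning as I understand it; the helper names and the range-based insertion sort are mine):

    #include <utility>   // std::swap
    #include <vector>

    const int CUTOFF = 10;

    // Insertion sort on the closed range a[left..right] (used for small pieces).
    void insertionSortRange(std::vector<int>& a, int left, int right) {
        for (int k = left + 1; k <= right; ++k) {
            int key = a[k], i = k;
            while (i > left && a[i - 1] > key) { a[i] = a[i - 1]; --i; }
            a[i] = key;
        }
    }

    // Median-of-three: order a[left], a[center], a[right], then stash the
    // pivot (the median) at a[right-1] and return its value.
    int median3(std::vector<int>& a, int left, int right) {
        int center = (left + right) / 2;
        if (a[center] < a[left])   std::swap(a[center], a[left]);
        if (a[right]  < a[left])   std::swap(a[right],  a[left]);
        if (a[right]  < a[center]) std::swap(a[right],  a[center]);
        std::swap(a[center], a[right - 1]);   // hide pivot next to the right end
        return a[right - 1];
    }

    void quicksort(std::vector<int>& a, int left, int right) {
        if (left + CUTOFF <= right) {
            int pivot = median3(a, left, right);
            int i = left, j = right - 1;
            for (;;) {
                while (a[++i] < pivot) {}     // move i right: stop at element >= pivot
                while (pivot < a[--j]) {}     // move j left:  stop at element <= pivot
                if (i < j) std::swap(a[i], a[j]);
                else break;
            }
            std::swap(a[i], a[right - 1]);    // restore pivot into its final position
            quicksort(a, left, i - 1);        // QuickSort(S1)
            quicksort(a, i + 1, right);       // QuickSort(S2)
        } else {
            insertionSortRange(a, left, right);
        }
    }

    // Usage: quicksort(a, 0, (int)a.size() - 1);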

20 Try it out: Recursive quicksort
Insert 31, 16, 54, 4, 2, 17, 6

21 QuickSort: Best case complexity
Best case: the chosen pivot splits the array in half at every step.
T(n) = 2 T(n/2) + n, so T(n) = O(n log n), the same as Mergesort.

22 QuickSort: Worst case complexity
Worst case: the WORST pivot is always chosen, so one partition is empty at every step.
T(1) = c
T(n) = n + T(n-1)
     = n + (n-1) + T(n-2)
     = n + (n-1) + (n-2) + T(n-3)
     = ... = n + (n-1) + ... + 2 + 1
So T(n) = O(n²).

23 QuickSort: Average case complexity
Turns out to be O(n log n); see the relevant section of the textbook for an idea of the proof (proof details are not required for this course).
Quicksort sorts in place, but it uses extra storage for the recursion (there is no iterative version without a stack).
O(n log n) average but O(n²) worst case.
It ends up being very fast in practice for large arrays, so it is a good choice.

24 Features of Sorting Algorithms
In-place: sorted items occupy the same space as the original items (no copying required; only O(1) extra space, if any).
Stable: items in the input with the same value end up in the same relative order as when they began.
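A quick way to see stability concretely (an illustration of the definition, not something from the slides): sort (key, tag) pairs by key and check whether equal keys kept their original order. std::stable_sort preserves that order; std::sort makes no such promise.

    #include <algorithm>
    #include <cstdio>
    #include <utility>
    #include <vector>

    int main() {
        // Two records share the key 5; 'a' comes before 'b' in the input.
        std::vector<std::pair<int, char>> v = {{5, 'a'}, {3, 'c'}, {5, 'b'}, {1, 'd'}};
        auto byKey = [](const std::pair<int, char>& x, const std::pair<int, char>& y) {
            return x.first < y.first;
        };
        std::stable_sort(v.begin(), v.end(), byKey);   // stable: 5a stays before 5b
        for (const auto& p : v) std::printf("%d%c ", p.first, p.second);
        std::printf("\n");   // prints: 1d 3c 5a 5b
        return 0;
    }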

25 Sort Properties (Your Turn)
For each of the following, is it stable? is it in-place? (Answer No / Yes / Can Be for each.)
Insertion Sort? Selection Sort? MergeSort? QuickSort?

