Sorting – Chapter 6: Insertion Sort (6.1), Quicksort (6.2); Chapter 5: Mergesort (5.2), Stable Sorts; Divide & Conquer


Three Algorithms to Know
Insertion Sort (not divide-and-conquer)
– O(n^2) worst-case and average-case, O(n) best-case
– very efficient on partially sorted lists
Mergesort (divide-and-conquer)
– O(n log n) worst-case, best-case, and average-case
– stable, but not the most efficient
Quicksort (divide-and-conquer)
– O(n^2) worst-case, but O(n log n) average-case
– improvements make it the fastest practical sorting algorithm

1. Insertion Sort

insertionSort(a[], n) {
    // Assume the first item a[0] is sorted
    for i = 1 to n-1 {                // Insert a[1] .. a[n-1]
        v = a[i];                     // v is the item we want to insert
        j = i - 1;                    // j iterates from i-1 down to 0
        while (j >= 0 && v < a[j]) {  // Keep looping until v's slot is found
            a[j+1] = a[j];            // Shift to make room
            j--;
        }
        a[j+1] = v;                   // Insert v into its correct position
    }
}
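The pseudocode above translates almost line-for-line into runnable Python (a sketch; the function name is my own):

```python
def insertion_sort(a):
    """Sort list a in place; returns a for convenience."""
    for i in range(1, len(a)):      # a[0..i-1] is already sorted
        v = a[i]                    # item to insert
        j = i - 1
        while j >= 0 and v < a[j]:  # shift larger items right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = v                # drop v into its slot
    return a
```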

1. Insertion Sort
10 items, Worst Case: 10*9/2 = 45 comparisons
n(n-1)/2 = O(n^2)

1. Insertion Sort
10 items, Best Case: 9 comparisons
O(n)

1. Insertion Sort
10 items, Average Case: on average each item shifts about halfway back, roughly n(n-1)/4 ≈ 22 comparisons
Still O(n^2)
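A quick experiment (a sketch; the counter function is my own) confirms the counts above: a reverse-sorted list of 10 items hits the worst case, an already-sorted list the best case.

```python
def count_comparisons(a):
    """Insertion-sort a copy of a, returning the number of key comparisons."""
    a = list(a)
    comps = 0
    for i in range(1, len(a)):
        v, j = a[i], i - 1
        while j >= 0:
            comps += 1              # one comparison of v against a[j]
            if v < a[j]:
                a[j + 1] = a[j]     # shift and keep scanning left
                j -= 1
            else:
                break
        a[j + 1] = v
    return comps

print(count_comparisons(range(10, 0, -1)))  # worst case: 45
print(count_comparisons(range(1, 11)))      # best case: 9
```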

2. Mergesort

mergesort(a[], i, j) {
    if (i == j) return;       // A sub-list of size 1 is already sorted
    m = (i + j) / 2;          // Compute the mid-point
    mergesort(a, i, m);       // Mergesort the first half
    mergesort(a, m+1, j);     // Mergesort the second half
    merge(a, i, m, j);        // Merge the two sorted halves
}
// Merging is O(n), where n = a + b and a, b are the sizes of the two sub-lists.

2. Mergesort

mergesort(a[], i, j) {
    if (i == j) return;       // Base case
    m = (i + j) / 2;          // O(1)
    mergesort(a, i, m);       // Recursive call
    mergesort(a, m+1, j);     // Recursive call
    merge(a, i, m, j);        // O(n)
}
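Filling in the merge step (which the next slides walk through), a runnable Python sketch; the merge uses a temporary list, so every item is moved into temp and back, as discussed below:

```python
def merge_sort(a, i=0, j=None):
    """Sort a[i..j] in place, using a temporary list for each merge."""
    if j is None:
        j = len(a) - 1
    if i >= j:                      # size 0 or 1: already sorted
        return a
    m = (i + j) // 2
    merge_sort(a, i, m)             # sort first half
    merge_sort(a, m + 1, j)         # sort second half
    # merge: O(n) comparisons, but every item moves twice (into temp and back)
    temp, x, y = [], i, m + 1
    while x <= m and y <= j:
        if a[x] <= a[y]:            # <= keeps the sort stable
            temp.append(a[x]); x += 1
        else:
            temp.append(a[y]); y += 1
    temp.extend(a[x:m + 1])         # leftovers from the first half
    temp.extend(a[y:j + 1])         # leftovers from the second half
    a[i:j + 1] = temp               # copy back: the second move
    return a
```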

Recall the Big Hammer (Master Theorem), for T(n) = a·T(n/b) + Θ(n^k):
a = number of recursive calls
b = factor the input shrinks by (b = 2 if you cut the input size in half)
n^k = running time of the non-recursive work

1. a < b^k: T(n) = Θ(n^k)
2. a = b^k: T(n) = Θ(n^k log n)
3. a > b^k: T(n) = Θ(n^(log_b a))
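A small helper (a sketch, names my own) that applies the three cases. Mergesort has a = 2 recursive calls, b = 2, and O(n) merge work so k = 1; since a = b^k, case 2 gives Θ(n log n).

```python
import math

def master_theorem(a, b, k):
    """Return the Theta-class of T(n) = a*T(n/b) + Theta(n^k) as a string."""
    if a < b ** k:
        return f"Theta(n^{k})"
    if a == b ** k:
        return f"Theta(n^{k} log n)"
    return f"Theta(n^{math.log(a, b):g})"   # case 3: exponent log_b(a)

print(master_theorem(2, 2, 1))  # mergesort: Theta(n^1 log n)
print(master_theorem(1, 2, 0))  # binary search: Theta(n^0 log n), i.e. Theta(log n)
```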

2. The Merge Function ?????????

??????10??

??????1012?

?????1012?

????1012?

???1012?

??1012?

?1012?

?

2. Merge Function
While the merge function does a near-minimal number of comparisons, it must do a lot of moves, i.e., swaps, copies, etc. In fact, it is nearly impossible to do the merge efficiently without using extra memory. For each merge, all the items must be moved at least twice: once into the temp array and then back again.

2. Merge Function
Notice the left and right sub-lists are sorted. They must be sorted, otherwise the merge function won't work, period.

2. Merge Function
The first comparison does NOT require any swap.

2. Merge Function
The second comparison does: 10 is less than 30, so we should swap.

2. Merge Function
But how do we continue the merge? Which pointer should be moved?

2. Merge Function
The values 1 and 10 are sorted, so moving the blue pointer makes sense, right? But where should we move it?

2. Merge Function
We can't move it to the 40, because then we'll compare 40 with 30 next, and we'll miss 15, which is smaller.

2. Merge Function
Now we are in big trouble:
1. How exactly will the 15 get into the correct position?
2. What happened to our two sorted sub-lists?

2. Merge Function – Catch-22
The only way to efficiently merge sorted sub-lists is to use an extra temp array to do the swapping. However, if you use a linked list, then you can efficiently merge without extra memory. Double however: with linked lists you can't easily jump to the mid-point of each sub-list, so extra iteration is required.
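The linked-list trade-off can be sketched like this (the Node class and helper names are my own): the merge itself re-links nodes with O(1) extra memory, but finding a mid-point would still require a linear scan.

```python
class Node:
    def __init__(self, val, nxt=None):
        self.val, self.nxt = val, nxt

def merge_lists(p, q):
    """Merge two sorted linked lists by re-linking nodes: no extra array."""
    dummy = tail = Node(None)
    while p and q:
        if p.val <= q.val:
            tail.nxt, p = p, p.nxt
        else:
            tail.nxt, q = q, q.nxt
        tail = tail.nxt
    tail.nxt = p or q               # append whichever list remains
    return dummy.nxt

def from_list(vals):
    """Build a linked list from a Python list."""
    head = None
    for v in reversed(vals):
        head = Node(v, head)
    return head

def to_list(node):
    """Flatten a linked list back to a Python list."""
    out = []
    while node:
        out.append(node.val)
        node = node.nxt
    return out

print(to_list(merge_lists(from_list([1, 30, 40]), from_list([10, 12, 15]))))
# [1, 10, 12, 15, 30, 40]
```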

2. Mergesort Conclusion
Lesson: minimizing comparisons is good. But using extra memory, and having to move so much data around, makes Mergesort one of the most inefficient O(n log n) sorting algorithms in practice, no matter how you implement it.

Quicksort

quicksort(a[], i, j) {
    if (i < j) {
        p = partition(a, i, j);
        quicksort(a, i, p-1);
        quicksort(a, p+1, j);
    }
}

partition(a[], i, j) {
    v = a[i];                 // Pivot is the first element
    h = i;                    // Boundary of the "less than pivot" region
    for k = i+1 to j
        if (a[k] < v) {
            h++;
            swap(a[h], a[k]);
        }
    swap(a[i], a[h]);         // Put the pivot into its final position
    return h;
}
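The same first-element-pivot scheme as runnable Python (a sketch translating the pseudocode above):

```python
def quicksort(a, i=0, j=None):
    """In-place quicksort; the pivot is the first element of each range."""
    if j is None:
        j = len(a) - 1
    if i < j:
        p = partition(a, i, j)
        quicksort(a, i, p - 1)
        quicksort(a, p + 1, j)
    return a

def partition(a, i, j):
    v = a[i]                    # pivot
    h = i                       # boundary of the "< pivot" region
    for k in range(i + 1, j + 1):
        if a[k] < v:
            h += 1
            a[h], a[k] = a[k], a[h]
    a[i], a[h] = a[h], a[i]     # pivot lands in its final position
    return h
```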

Quicksort
31 items, Best Case: O(n log n)
log n levels, O(n) work at each level

Quicksort
31 items, Worst Case: O(n^2)
n levels, O(n) work at each level

Quicksort
31 items, Average Case: O(n log n)
log n levels, O(n) work at each level (proof in the textbook)
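Because the pivot is the first element, an already-sorted input is the worst case: every partition is maximally lopsided. This instrumented sketch (helper names my own) counts comparisons for 31 items; sorted input gives 30 + 29 + ... + 1 = 465, while a shuffled input typically does far fewer.

```python
import random

def quicksort_comparisons(a):
    """Count element comparisons made by first-element-pivot quicksort."""
    a, comps = list(a), 0

    def qs(i, j):
        nonlocal comps
        if i >= j:
            return
        v, h = a[i], i
        for k in range(i + 1, j + 1):
            comps += 1          # a[k] vs. the pivot
            if a[k] < v:
                h += 1
                a[h], a[k] = a[k], a[h]
        a[i], a[h] = a[h], a[i]
        qs(i, h - 1)
        qs(h + 1, j)

    qs(0, len(a) - 1)
    return comps

print(quicksort_comparisons(range(31)))                     # sorted input: 465
random.seed(1)
print(quicksort_comparisons(random.sample(range(31), 31)))  # typically far fewer
```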

Quicksort vs. Mergesort
Mergesort: close to the minimum number of comparisons.
– But every comparison requires moving two values: a[i] → c[?] → a[?].
– Mergesort always moves about 2n log n values.
Quicksort: does more comparisons, but fewer moves or swaps.
– Also, once the pivot is placed, it is never moved again, i.e., it is in its correct position.

Important Concept
General Sorting: the only thing you are allowed to do is compare two items at a time.
Special Sorting: it is possible to compare one item with every other item in one operation. How so?
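The slides leave "How so?" as a teaser. One standard answer (my example, not from the slides) is counting sort: using an item's value as an array index effectively positions it relative to every other item in a single operation, with no pairwise comparisons at all.

```python
def counting_sort(a, max_val):
    """Sort non-negative integers <= max_val without comparing pairs."""
    counts = [0] * (max_val + 1)
    for v in a:
        counts[v] += 1          # indexing by value replaces comparison
    out = []
    for v, c in enumerate(counts):
        out.extend([v] * c)     # emit each value as many times as it occurred
    return out

print(counting_sort([4, 1, 3, 1, 0], 4))  # [0, 1, 1, 3, 4]
```

Its O(n + max_val) running time beats the n log n lower bound precisely because it steps outside the comparison-only model.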

Decision Tree for Sorting (3 items a, b, c)
Root: compare a <= b.
– If yes, compare b <= c: if yes, the order is abc; if no, compare a <= c to decide between acb and cab.
– If no, compare a <= c: if yes, the order is bac; if no, compare b <= c to decide between bca and cba.
Each leaf is one of the 3! = 6 permutations: abc, bac, acb, cab, bca, cba.

Decision Tree for Sorting
Each comparison can at best halve the number of permutations still consistent with the answers so far (e.g., 12 → 6 → 3 → 2?). A tree of height h has at most 2^h leaves, so distinguishing all n! permutations requires 2^h >= n!.

Lower Bound on General Sorting
h >= log(n!), where h is the minimum number of comparisons.
log(n!) = Θ(n log n) (well known; 2.3.8).
Thus h = Ω(n log n): no comparison-based sort can do better.
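A quick numerical check (a sketch; helper name my own) of log(n!) = Θ(n log n), comparing log2(n!), the comparison lower bound, against n·log2(n):

```python
import math

def log2_factorial(n):
    """log2(n!) computed as a sum of logs, avoiding huge integers."""
    return sum(math.log2(k) for k in range(2, n + 1))

for n in (10, 100, 1000):
    lower = log2_factorial(n)       # minimum comparisons to sort n items
    print(n, round(lower, 1), round(n * math.log2(n), 1))
```

The two columns grow at the same rate: their ratio stays bounded between constants, which is exactly what the Θ(n log n) claim asserts.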