Updated 29.3.2004



QuickSort

Problem From a given set of n integers, find the missing integer from 0 to n using O(n) queries of type: "what is bit[j] in A[i]?" Note: there are a total of n log n bits, so we are not allowed to read the entire input!

Solution Ask all the n integers what their last bit is, and see whether 0 or 1 occurs less often than it should. That is the last bit of the missing integer! How can we determine the second-to-last bit?

Solution Ask only the roughly n/2 numbers that ended with the correct last bit, and compare against the bit patterns of the numbers from 0 to n which end with that bit. By recursing on the remaining candidate numbers, we get the answer in T(n) = T(n/2) + n = O(n), by the Master Theorem.
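The recursive scheme above can be sketched as follows. This is an illustrative implementation, not from the slides (class and method names are my own): it tracks the surviving candidate values and array items explicitly, reads one bit per surviving item in each round, and halves the candidates each time, for roughly n + n/2 + n/4 + … = O(n) bit reads in total.

```java
import java.util.ArrayList;
import java.util.List;

public class MissingNumber {
    // One "query": read bit j of a number.
    static int bit(int value, int j) { return (value >> j) & 1; }

    public static int findMissing(int[] a) {
        int n = a.length;                    // the values span 0..n, one is missing
        List<Integer> values = new ArrayList<>();
        for (int v = 0; v <= n; v++) values.add(v);
        List<Integer> items = new ArrayList<>();
        for (int x : a) items.add(x);

        for (int j = 0; values.size() > 1; j++) {
            final int bitPos = j;
            long valueZeros = values.stream().filter(v -> bit(v, bitPos) == 0).count();
            long itemZeros  = items.stream().filter(x -> bit(x, bitPos) == 0).count();
            // The side that is short one element contains the missing number.
            final int missingBit = (itemZeros < valueZeros) ? 0 : 1;
            values.removeIf(v -> bit(v, bitPos) != missingBit);
            items.removeIf(x -> bit(x, bitPos) != missingBit);
        }
        return values.get(0);
    }
}
```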

Why is sorting so important? Most of the interesting concepts in the course can be taught in the context of sorting, such as: –Divide and conquer –Randomized algorithms –Lower bounds

Why is sorting so important? One of the reasons why sorting is so important is that after sorting items, other problems become very simple to solve.

Searching Binary search runs on a sorted set in O(log n) time. Searching for an element in an unsorted set takes linear time. This is probably the most important application of sorting.
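As a small illustration of this idea (the class and method names are mine, not from the slides): sort once, then answer membership queries with Arrays.binarySearch in O(log n) each.

```java
import java.util.Arrays;

public class SearchDemo {
    // Sort once (O(n log n)), then answer each query in O(log n).
    public static boolean contains(int[] data, int key) {
        int[] sorted = data.clone();
        Arrays.sort(sorted);
        return Arrays.binarySearch(sorted, key) >= 0;  // negative means "not found"
    }
}
```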

Element Uniqueness Given a set of numbers we want to check if all numbers are unique. Sort the elements and linearly scan all adjacent pairs.
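A sketch of the sort-then-scan approach (illustrative class name): after sorting, any duplicates must be adjacent.

```java
import java.util.Arrays;

public class Uniqueness {
    // Sort, then check adjacent pairs: O(n log n) overall.
    public static boolean allUnique(int[] a) {
        int[] copy = a.clone();
        Arrays.sort(copy);
        for (int i = 1; i < copy.length; i++) {
            if (copy[i] == copy[i - 1]) return false;  // duplicate found
        }
        return true;
    }
}
```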

Closest pairs Given n numbers, find the pair that is closest to each other. After sorting the elements, the closest pair will be next to each other, so a linear scan will do. Related problems…
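For numbers, the same sort-then-scan pattern applies; a sketch (illustrative names, assumes at least two elements):

```java
import java.util.Arrays;

public class ClosestPair {
    // After sorting, the closest pair is adjacent: O(n log n) total.
    // Returns the smallest gap between any two elements.
    public static int minGap(int[] a) {
        int[] copy = a.clone();
        Arrays.sort(copy);
        int best = Integer.MAX_VALUE;
        for (int i = 1; i < copy.length; i++) {
            best = Math.min(best, copy[i] - copy[i - 1]);
        }
        return best;
    }
}
```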

Frequency distribution Which element appears the largest number of times in a set? After sorting, a linear scan will do.
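A sketch of this (illustrative names, assumes a non-empty input): after sorting, the most frequent element is the longest run of equal adjacent elements.

```java
import java.util.Arrays;

public class Mode {
    // Sort, then a linear scan finds the longest run of equal elements.
    public static int mostFrequent(int[] a) {
        int[] copy = a.clone();
        Arrays.sort(copy);
        int best = copy[0], bestCount = 1, run = 1;
        for (int i = 1; i < copy.length; i++) {
            run = (copy[i] == copy[i - 1]) ? run + 1 : 1;
            if (run > bestCount) { bestCount = run; best = copy[i]; }
        }
        return best;
    }
}
```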

Median and Order statistics What is the median of a set of numbers? What is the k-th smallest element? After sorting the elements, the k-th smallest element can be found at index k (and the k-th largest at index n-k+1), in constant time.

Convex hulls Given n points in two dimensions, find the smallest area polygon which contains them all.

Huffman Codes If you are trying to minimize the size of a text file, you want to assign codes of different lengths to different characters, according to the frequency with which each character appears in the text (shorter codes for more frequent characters).

Quicksort Although mergesort runs in O(n log n) time, it is inconvenient to use with arrays, since it requires extra space. In practice, quicksort is the fastest known sorting algorithm, and it uses partition as its main idea.

Quicksort Partitioning places all the elements less than the pivot in the left part of the array, and all elements greater than the pivot in the right part of the array. The pivot fits in the slot between them.

Partition Example – use 10 as a pivot. Note that the pivot element ends up in the correct place in the total order! (Before/after figure omitted.)

Partition First we must select a pivot element Once we have selected a pivot element, we can partition the array in one linear scan, by maintaining three sections of the array: –All elements smaller than the pivot –All elements greater than the pivot –All unexplored elements

Example: pivot element is 10. (The step-by-step array trace from the original slide did not survive transcription.)

Quicksort Partition does at most n swaps and takes linear time. –The pivot element ends up in the position it retains in the final sorted order. –After partitioning, no element moves to the other side of the pivot in the final sorted order. –Thus we can sort the elements to the left of the pivot and to the right of the pivot independently, and recursively!

QuickSort

QuickSort(A, p, r)
  if (p < r) then
    q ← Partition(A, p, r)
    QuickSort(A, p, q)
    QuickSort(A, q+1, r)

Initial call: QuickSort(A, 1, length[A])

QuickSort

public void sort(Comparable[] values) {
    sort(values, 0, values.length - 1);
}

private void sort(Comparable[] values, int from, int to) {
    if (from < to) {
        int pivot = partition(values, from, to);
        sort(values, from, pivot);
        sort(values, pivot + 1, to);
    }
}

private int partition(Comparable[] values, int from, int to) {
    Comparable pivot = values[from];
    int i = from - 1;
    int j = to + 1;
    while (true) {
        do { j--; } while (values[j].compareTo(pivot) > 0);
        do { i++; } while (values[i].compareTo(pivot) < 0);
        if (i < j) {
            Comparable temp = values[i];
            values[i] = values[j];
            values[j] = temp;
        } else {
            return j;
        }
    }
}

Partition The partition method returns the index separating the array, but also has a side effect: it rearranges the elements of the array around the pivot according to their size.


Partition – version 2

public int partition(int[] values, int from, int to) {
    int pivot = values[from];
    int leftWall = from;
    for (int i = from + 1; i <= to; i++) {
        if (values[i] < pivot) {
            leftWall++;
            int temp = values[i];
            values[i] = values[leftWall];
            values[leftWall] = temp;
        }
    }
    int temp = values[from];
    values[from] = values[leftWall];
    values[leftWall] = temp;
    return leftWall;
}


Partition (A[], left, right)
 1. pivot ← left
 2. temp ← right
 3. while temp <> pivot
 4.   if A[min(pivot,temp)] > A[max(pivot,temp)]
 5.     swap(A[pivot], A[temp])
 6.     swap(pivot, temp)
 7.   if temp > pivot
 8.     temp--
 9.   else
10.    temp++
11. return pivot


Time Analysis The running time of quicksort depends on how evenly partition divides the array, which is determined by the chosen pivot element. If partition produces equal-size subarrays, quicksort can be as good as merge sort. If partition fails to divide the array evenly, quicksort can be asymptotically as bad as insertion sort.

Best case Since each element ultimately ends up in the correct position, the algorithm correctly sorts. But how long does it take? The best case for divide-and-conquer algorithms comes when we split the input as evenly as possible. Thus in the best case, each subproblem is of size n/2.

Best case The partition step on each subproblem is linear in its size, so the total partitioning effort on each level is O(n). It takes log(n) levels of perfect partitions to get down to single elements, so the total effort is O(n log n).


Worst case If the pivot is the biggest or smallest element in the array, the subproblems will have sizes 0 and n-1; thus instead of log(n) recursive levels we end up with O(n) levels, and a total sorting time of O(n^2).

Worst case The worst case input for quick sort depends on the way we choose the pivot element. If we choose the first or last element as the pivot, the worst case is when the elements are already sorted!!

Worst case Having the worst case occur on a sorted array is bad, since sorted input is an expected case in many applications. (Insertion sort deals with sorted arrays in linear time.) To eliminate this problem, pick a better pivot: 1. Use a random element of the array as the pivot. 2. Take the median of three elements (first, last, middle) as the pivot. The worst case remains possible, but because it is no longer a natural ordering, it is much less likely to occur.
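A sketch of the median-of-three strategy mentioned above (class and method names are illustrative): pick the middle value of the first, middle, and last elements.

```java
public class Pivots {
    // Median-of-three pivot choice: the middle value of a[from], a[mid], a[to].
    public static int medianOfThree(int[] a, int from, int to) {
        int mid = from + (to - from) / 2;
        int x = a[from], y = a[mid], z = a[to];
        // The median of three values x, y, z.
        return Math.max(Math.min(x, y), Math.min(Math.max(x, y), z));
    }
}
```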

Randomization Quicksort is good on average, but bad on certain worst-case instances. Suppose you pick the pivot element at random: no enemy can select a worst-case input, because every choice of pivot is equally likely to be good! By either picking a random pivot or scrambling the permutation before sorting it, we can say: "With high probability, randomized quicksort runs in O(n log n) time."
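One possible sketch of the random-pivot variant (illustrative, not from the slides): it reuses a Lomuto-style partition with the pivot at the front, after first swapping a randomly chosen element into that position.

```java
import java.util.Random;

public class RandomizedQuickSort {
    private static final Random RNG = new Random();

    public static void sort(int[] a) { sort(a, 0, a.length - 1); }

    private static void sort(int[] a, int from, int to) {
        if (from < to) {
            // Swap a random element into position 'from', then partition
            // around it; no fixed input is a worst case any more.
            int r = from + RNG.nextInt(to - from + 1);
            swap(a, from, r);
            int p = partition(a, from, to);
            sort(a, from, p - 1);
            sort(a, p + 1, to);
        }
    }

    // Lomuto-style partition with the pivot taken from the front.
    private static int partition(int[] a, int from, int to) {
        int pivot = a[from], wall = from;
        for (int i = from + 1; i <= to; i++) {
            if (a[i] < pivot) swap(a, ++wall, i);
        }
        swap(a, from, wall);   // pivot lands in its final sorted position
        return wall;
    }

    private static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }
}
```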

Time Analysis Worst Case: T(n) = T(n-1) + O(n), which solves to O(n^2).

Time Analysis Best Case: T(n) = 2T(n/2) + O(n), which solves to O(n log n) by the Master Theorem.

Time Analysis Average Case: O(n log n) expected time with random pivots.
