LECTURE 40: SELECTION CSC 212 – Data Structures

Selection Problem
 Sequence of Comparable elements available
   Only care that the implementation has O(1) access time
   Elements are unordered within the Sequence
 Finding the smallest & largest elements is easy
 What about if we want the k-th largest element?
 This is a real function used surprisingly often:
   Statistical analyses
   Database lookups
   Ranking teams in a league

Selection Problem
 Could sort the Collection as a first step
   Then just return the k-th element
 But sorting is slow
   Sorting takes at least O(n log n) time
 An ordered Collection is faster at selection time
   Selection becomes a simple O(1) process
   But O(n) insertion time would also result
   Usually this is not a winning tradeoff
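A minimal Java sketch of this sort-first approach (not from the slides; the class and method names are invented for illustration). It returns the k-th smallest element, the convention the quick-select slides below also use, and its cost is dominated by the O(n log n) sort:

    import java.util.Arrays;

    public class SortSelect {
        // Returns the k-th smallest element (k is 1-based) by sorting a copy first.
        // The sort costs O(n log n); the lookup afterwards is O(1).
        public static <T extends Comparable<T>> T selectBySorting(T[] elements, int k) {
            T[] copy = Arrays.copyOf(elements, elements.length);
            Arrays.sort(copy);       // O(n log n)
            return copy[k - 1];      // O(1) once sorted
        }

        public static void main(String[] args) {
            Integer[] data = {7, 4, 9, 6, 2};
            System.out.println(selectBySorting(data, 2)); // prints 4, the 2nd smallest
        }
    }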

Quick-Select
 Works like how binary search finds a specific value
   They come from the same family of algorithms
 Divide-and-conquer algorithms work similarly:
   Divide into easier sub-problems to be solved
   Recursively solve the sub-problems
   Conquer by combining the solutions to the sub-problems

Quick-Select
 Also known as prune-and-search
   Read this as divide-and-conquer
 Prune: pick a pivot & split into 2 Collections
   L will get all elements less than the pivot
   Equal and larger elements go into G
   Keep the pivot element off to the side
 Search: solve only for the Collection containing the solution
   When k  L.size(), continue the search in L
   The pivot is the solution when k = L.size() + 1
   Else, an element in G solves this problem

Quick-Select In Pictures
 Partition around a pivot x: L gets the elements less than x, G gets the equal and larger ones
 Recurse only on the side that must hold the answer:

    if k <= L.size() then
        return quickSelect(k, L)
    else if k == L.size() + 1 then
        return x
    else
        k = k - (L.size() + 1)
        return quickSelect(k, G)
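A compact Java sketch of this prune-and-search recursion (not from the slides; the class name, the use of ArrayLists, and the random pivot choice are assumptions made for illustration). It returns the k-th smallest element, with k counted from 1, and splits exactly as the slide describes: smaller elements go to L, equal and larger elements go to G, and one copy of the pivot is kept aside.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Random;

    public class QuickSelect {
        private static final Random RAND = new Random();

        // Returns the k-th smallest element of the collection (k is 1-based).
        public static <T extends Comparable<T>> T quickSelect(List<T> c, int k) {
            // Prune: pick a pivot at random and split into L (smaller) and G (equal or larger).
            T pivot = c.get(RAND.nextInt(c.size()));
            List<T> smaller = new ArrayList<>();
            List<T> greater = new ArrayList<>();
            boolean pivotSkipped = false;
            for (T e : c) {
                int cmp = e.compareTo(pivot);
                if (cmp < 0) {
                    smaller.add(e);
                } else if (cmp == 0 && !pivotSkipped) {
                    pivotSkipped = true;       // keep one copy of the pivot off to the side
                } else {
                    greater.add(e);            // equal and larger elements go into G
                }
            }
            // Search: recurse only into the collection that must hold the answer.
            if (k <= smaller.size()) {
                return quickSelect(smaller, k);
            } else if (k == smaller.size() + 1) {
                return pivot;
            } else {
                return quickSelect(greater, k - (smaller.size() + 1));
            }
        }

        public static void main(String[] args) {
            List<Integer> data = List.of(7, 4, 9, 6, 2, 5, 1, 8, 3);
            System.out.println(quickSelect(data, 5)); // prints 5, the median
        }
    }

Choosing the pivot uniformly at random is what makes the expected-time analysis on the following slides apply.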

Quick-Select Visualization
 Draw the Collection and k for each recursive call
 (Figure: a trace of the recursive calls, with k = 5, then k = 2, then k = 2, then k = 1; the final call works on C = (7 6 5).)

Quick-Select Running Time
 Each call of the algorithm takes O(s) time, where s is the size of that call's Collection
   (picking the pivot and splitting into L & G touches each element once)
 So the worst-case running time is O(n^2)
 We would expect the running time to be O(n), however
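Where the O(n^2) worst case comes from (a derivation that is not on the slide): if every pivot happens to be the smallest or largest remaining element, each call prunes only one element, so

    T(n) <= T(n - 1) + b * n
         <= b * (n + (n - 1) + ... + 1)
         =  b * n * (n + 1) / 2
         =  O(n^2)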

Expected Running Time  Recursive call for Collection of size s  Good call: L & G each have less than ¾ * s elements  Bad call: More than ¾ * s elements for L or G (Note: they cannot both be larger than ¾* s) Good call Bad call

Expected Running Time
 How often can we expect to make a good call?
 Ultimately it all depends on the selection of the pivot
   ½ of the possible pivots create a good split, since any pivot whose rank lies in the middle half of the Collection works
   So with a random guess, we get a good call 50% of the time
 (Figure: the middle half of the possible pivots are good pivots; the outer quarters are bad pivots.)

Expected Running Time

Probabilities
 Probability Fact #1:
   The expected number of tosses of a fair coin until the first head appears is 2
 Probability Fact #2:
   Expectations are additive:
    E(X + Y) = E(X) + E(Y)
    E(c * X) = c * E(X)
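Why Fact #1 holds (a short derivation not on the original slide): let E be the expected number of tosses until the first head. With probability ½ the first toss is a head (1 toss total); with probability ½ it is a tail, costing 1 toss and leaving us back where we started. So

    E = ½ * 1 + ½ * (1 + E)   =>   E = 1 + ½ * E   =>   E = 2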

More Big-Oh
Let T(n) = expected execution time

    T(n) <= b * n * (expected # of calls before a good call) + T(¾ * n)
    T(n) <= b * n * 2 + T(¾ * n)
    T(n) <= b * n * 2 + b * (¾ * n) * 2 + T((¾)^2 * n)
    T(n) <= b * n * 2 + b * (¾ * n) * 2 + b * ((¾)^2 * n) * 2 + …
    T(n) <= O(n) + O(n) + O(n) + …

… then a mathematical miracle occurs…

    T(n) = O(n)
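The "miracle" is just a geometric series (a step the slide skips): each level's cost is ¾ of the previous level's, so the whole sum is at most a constant multiple of the first term:

    T(n) <= 2 * b * n * (1 + ¾ + (¾)^2 + …) = 2 * b * n * 1 / (1 - ¾) = 8 * b * n = O(n)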

Before Next Lecture…
 Finish the week #15 assignment
   Due on Friday at 5 PM