CS 1114: Sorting and selection (part one) Prof. Graeme Bailey (notes modified from Noah Snavely, Spring 2009)


Where we are so far

Finding the lightstick:
– Attempt 1: Bounding box (not so great)
– Attempt 2: Centroid (isn't much better)
– Attempt 3: Trimmed mean
  Seems promising, but how do we compute it quickly?
  The obvious way doesn't seem so great… but do we really know this?

How do we define fast?

We want to think about this issue in a way that doesn't depend on either:
A. Getting really lucky input
B. Happening to have really fast hardware

Recap from last time

We looked at the "trimmed mean" problem for locating the lightstick:
– Remove 5% of points on all sides, find the centroid

This is a version of a more general problem:
– Finding the kth largest element in an array
– Also called the "selection" problem

We considered an algorithm that repeatedly removes the largest element.
– How fast is this algorithm?

A more general version of trimmed mean

Given an array of n numbers, find the kth largest number in the array.

Strategy (sketched in code below):
– Remove the biggest number
– Do this k times
– The answer is the last number you found
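One way to write this strategy in MATLAB (a minimal sketch; the function name kth_largest_slow and the use of max with element deletion are my own choices, not from the slides):

function [ answer ] = kth_largest_slow(A, k)
% Find the kth largest number in A by repeatedly removing
% the current biggest element, k times in total.
for i = 1:k
    [answer, idx] = max(A);   % current biggest value and where it lives
    A(idx) = [];              % remove it from the array
end
% After k removals, the last value removed is the kth largest.
end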

How fast is this algorithm?

An elegant answer exists. You will learn it in later CS courses:
– But I'm going to steal their thunder and explain the basic idea to you here
– It's called "big-O notation"

Two basic principles:
– Think about the average / worst case (don't depend on luck)
– Think in a hardware-independent way (don't depend on the chip!)

Performance of our algorithm

What value of k is the worst case?
– k = n? (no – we can easily fix this case by looking for the smallest element instead)
– k = n/2 (this is the real worst case)

How much work will we do in the worst case (k = n/2)?
1. Examine n elements to find the biggest
2. Examine n – 1 elements to find the biggest
… keep going …
n/2. Examine n/2 elements to find the biggest

How much work is this?

How many elements will we examine in total?
n + (n – 1) + (n – 2) + … + n/2 = ?

We don't really care about the exact answer:
– The sum has about n/2 terms
– It's bigger than (n/2)²
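To spell out that bound: each of the roughly n/2 terms in the sum is at least n/2, so the total is at least (n/2)·(n/2) = n²/4 – which grows in proportion to n².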

How much work is this?

The amount of work grows in proportion to n².
We say that this algorithm is O(n²).

[ Blackboard discussion of O(g(n)) and o(g(n)) ]
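For reference, one standard way to state those definitions informally: f(n) is O(g(n)) if there are constants c > 0 and n0 such that f(n) ≤ c·g(n) for all n ≥ n0; f(n) is o(g(n)) if f(n)/g(n) tends to 0 as n grows.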

Classes of algorithm speed

Constant time algorithms, O(1)
– Do not depend on the input size
– Example: find the first element

Linear time algorithms, O(n)
– Constant amount of work for every input item
– Example: find the largest element

Quadratic time algorithms, O(n²)
– Linear amount of work for every input item
– Example: repeatedly removing the max element (see the sketches of all three classes below)

Different hardware only affects the parameters (i.e., the slope of the line).
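Tiny MATLAB sketches of the three classes (illustrative only; the array A and all variable names here are mine, not from the slides):

A = rand(1, 1000);          % any array will do for illustration

% O(1): constant time -- look at one element, no matter how big A is
first = A(1);

% O(n): linear time -- one pass over A to find the largest element
biggest = A(1);
for i = 2:length(A)
    if A(i) > biggest
        biggest = A(i);
    end
end

% O(n^2): quadratic time -- remove the max about n/2 times;
% each removal scans everything that is left
B = A;
for i = 1:floor(length(A)/2)
    [biggest, idx] = max(B);
    B(idx) = [];
end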

How to do selection better?

If our input were sorted, we could do better:
– Given 100 numbers in increasing order, we can easily pick out the 5th biggest or smallest (see the sketch below)

Very important principle! (encapsulation)
– Divide your problem into pieces
  One person (or group) can provide sort
  The other person can use sort
– As long as both agree on what sort does, they can work independently
– Can even "upgrade" to a faster sort
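A one-line version of that idea, assuming someone hands us a working sort (here MATLAB's built-in sort stands in for it; the array and k are just for illustration):

A = [31 41 59 26 53 58 97];   % illustrative data
k = 3;

sorted = sort(A);                     % ascending order
kth_smallest = sorted(k);             % kth from the left
kth_largest  = sorted(end - k + 1);   % kth from the right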

How to sort?

Sorting is an ancient problem, by the standards of CS:
– The first important "computer" sort was used for the 1890 census, by Hollerith (the 1880 census took 8 years, 1890 took just one)

There are many sorting algorithms.

How to sort?

Given an array of numbers: [ ]

How can we produce a sorted array? [ ]

How to sort?

A concrete version of the problem:
– Suppose I want to sort all actors by height
– How do I do this?

Sorting, 1st attempt

Idea: Given n actors,
1. Find the shortest actor (D. DeVito), put him first
2. Find the shortest actor in the remaining group, put him/her second
… Repeat …
n. Find the shortest actor in the remaining group (one left), put him/her last

Sorting, 1st attempt

What does this remind you of?
This is called selection sort.
After round k, the first k entries are sorted.

Algorithm 1
1. Find the shortest actor, put him first
2. Find the shortest actor in the remaining group, put him/her second
… Repeat …
n. Find the shortest actor in the remaining group, put him/her last

Selection sort – pseudocode

function [ A ] = selection_sort(A)
% Returns a sorted version of array A
% by applying selection sort
% Uses in-place sorting
n = length(A);
for i = 1:n
    % Find the smallest element in A(i:n)
    % Swap that element with something (what?)
end

Filling in the gaps

% Find the smallest element in A(i:n)

We pretty much know how to do this (a more robust variant is sketched below):

m = 10000;        % assumes every element is smaller than 10000
m_index = -1;
for j = i:n
    if A(j) < m
        m = A(j);
        m_index = j;
    end
end

[ ]  % After round 1, m = 6, m_index = 4
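A note on the starting value: m = 10000 only works if every element really is below 10000. A safer sketch (my variation, not from the slides) starts from the first element of the range, or lets MATLAB's built-in min do the scan:

% Variant 1: start from the first element of A(i:n) instead of a magic constant
m = A(i);
m_index = i;
for j = i+1:n
    if A(j) < m
        m = A(j);
        m_index = j;
    end
end

% Variant 2: use the built-in min
[m, offset] = min(A(i:n));
m_index = offset + i - 1;   % convert position within A(i:n) back to an index into A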

Filling in the gaps

% Swap the smallest element with something
% Swap element A(m_index) with A(i)

A first attempt loses the original value of A(i):

A(i) = A(m_index);
A(m_index) = A(i);    % too late – A(i) was already overwritten

Using a temporary variable fixes it:

tmp = A(i);
A(i) = A(m_index);
A(m_index) = tmp;

Putting it all together

function [ A ] = selection_sort(A)
% Returns a sorted version of array A
len = length(A);
for i = 1:len
    % Find the smallest element in A(i:len)
    m = 10000;        % assumes all elements are below 10000
    m_index = -1;
    for j = i:len
        if A(j) < m
            m = A(j);
            m_index = j;
        end
    end
    % Swap element A(m_index) with A(i)
    tmp = A(i);
    A(i) = A(m_index);
    A(m_index) = tmp;
end
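A quick usage check (the numbers are purely illustrative, not the example from the slides, and stay below the 10000 sentinel):

A = selection_sort([5 3 8 1 2]);   % A is now [1 2 3 5 8]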

Example of selection sort

[ ]  [ ]  [ ]  [ ]  [ ]
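As a round-by-round illustration (using the same made-up array as the usage check above, not the slide's original example):

[5 3 8 1 2]   start
[1 3 8 5 2]   round 1: smallest (1) swapped into position 1
[1 2 8 5 3]   round 2: smallest of the rest (2) swapped into position 2
[1 2 3 5 8]   round 3: smallest of the rest (3) swapped into position 3
[1 2 3 5 8]   rounds 4 and 5: remaining elements are already in place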

Speed of selection sort

Let n be the size of the array.

How fast is selection sort? O(1)? O(n)? O(n²)?

How many comparisons (<) does it do?
– First iteration: n comparisons
– Second iteration: n – 1 comparisons
– …
– nth iteration: 1 comparison

Speed of selection sort

Total number of comparisons:
n + (n – 1) + (n – 2) + … + 1

Work grows in proportion to n², so selection sort is O(n²).
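To make that precise: n + (n – 1) + … + 1 = n(n + 1)/2 = n²/2 + n/2, and the n²/2 term dominates as n grows – exactly what O(n²) captures.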

Is this the best we can do?

Wait and see!!!!

Don't forget to spend time in the lab getting help on hw 1 – we have a really fun exercise coming up for the next homework.