Lecture 16: Quickest Sorting CS150: Computer Science


Lecture 16: Quickest Sorting
CS150: Computer Science, University of Virginia
David Evans
http://www.cs.virginia.edu/evans

Exam 1 Reminders
- Review session is tonight at 6:30 in Olsson 228E.
- I have office hours after class today and Thursday at 3:30.
- Kinga will be in Small Hall Friday morning, 10-11:30am.
- If you have topics you want me to review in Friday's class, email me.

Last class: Sorting Cost

(define (best-first-sort lst cf)
  (if (null? lst)
      lst
      (let ((best (find-best lst cf)))
        (cons best (best-first-sort (delete lst best) cf)))))

(define (find-best lst cf)
  (if (= 1 (length lst))
      (car lst)
      (pick-better cf (car lst) (find-best (cdr lst) cf))))

The running time of best-first-sort is in Θ(n²) where n is the number of elements in the input list. This is wrong!
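
The helper procedures pick-better and delete are applied above but not defined on this slide. A minimal sketch of what they might look like, consistent with how find-best and best-first-sort use them (these particular definitions are assumptions, not the lecture's own code):

(define (pick-better cf a b)
  ;; assumed definition, not shown on the slides:
  ;; return whichever of a and b the comparison procedure cf prefers
  (if (cf a b) a b))

(define (delete lst el)
  ;; assumed definition, not shown on the slides:
  ;; return lst with the first element equal to el removed
  (if (null? lst)
      null
      (if (equal? (car lst) el)
          (cdr lst)
          (cons (car lst) (delete (cdr lst) el)))))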

Length

(define (length lst)
  (if (null? lst)
      0
      (+ 1 (length (cdr lst)))))

The running time of length is in Θ(n) where n is the number of elements in the input list.
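
A quick check of why the time is linear (a hypothetical interaction):

> (length (list 7 8 9))
3

This involves four applications of length: one for each element, plus one for the empty list at the end.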

find-best Cost

(define (find-best lst cf)
  (if (= 1 (length lst))
      (car lst)
      (pick-better cf (car lst) (find-best (cdr lst) cf))))

Assumptions: cf is a constant-time procedure, and n is the number of elements in the input list. There are n recursive applications of find-best, and each one involves an application of (length lst), which is in Θ(n). The running time of find-best (using length) is in Θ(n²) where n is the number of elements in the input list.

Sorting Cost

(define (best-first-sort lst cf)
  (if (null? lst)
      lst
      (let ((best (find-best lst cf)))
        (cons best (best-first-sort (delete lst best) cf)))))

(define (find-best lst cf)
  (if (= 1 (length lst))
      (car lst)
      (pick-better cf (car lst) (find-best (cdr lst) cf))))

The running time of best-first-sort is in Θ(n³) where n is the number of elements in the input list. This is right (but very inefficient)!

best-first-sort

(define (best-first-sort lst cf)
  (if (null? lst)
      lst
      (let ((best (find-best lst cf)))
        (cons best (best-first-sort (delete lst best) cf)))))

(define (find-best lst cf)
  (if (null? (cdr lst))
      (car lst)
      (pick-better cf (car lst) (find-best (cdr lst) cf))))

The running time of best-first-sort is in Θ(n²) where n is the number of elements in the input list. This is right!

Last class: insert-sort

(define (insert-sort lst cf)
  (if (null? lst)
      null
      (insert-one (car lst) (insert-sort (cdr lst) cf) cf)))

(define (insert-one el lst cf)
  (if (null? lst)
      (list el)
      (if (cf el (car lst))
          (cons el lst)
          (cons (car lst) (insert-one el (cdr lst) cf)))))

Assuming cf is a constant-time procedure, insert-sort has running time in Θ(n²) where n is the number of elements in the input list.
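
A quick check of the behavior (a hypothetical interaction, following the argument order of the definitions above and using < as the comparison procedure):

> (insert-sort (list 5 1 4 2) <)
(1 2 4 5)
> (insert-one 3 (list 1 2 4 5) <)
(1 2 3 4 5)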

Can we do better?

(insert-one < 88 (list 1 2 3 5 6 23 63 77 89 90))

Suppose we had procedures (first-half lst) and (second-half lst) that quickly divided the list into two halves?
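
For concreteness, this is the behavior we are imagining for these (so far hypothetical) procedures on the example list; how an odd-length list would be split is an assumption:

> (first-half (list 1 2 3 5 6 23 63 77 89 90))
(1 2 3 5 6)
> (second-half (list 1 2 3 5 6 23 63 77 89 90))
(23 63 77 89 90)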

insert-one using halves

(define (insert-one el lst cf)
  (if (null? lst)
      (list el)
      (if (null? (cdr lst))
          (if (cf el (car lst))
              (cons el lst)
              (list (car lst) el))
          (let ((front (first-half lst))
                (back (second-half lst)))
            (if (cf el (car back))
                (append (insert-one el front cf) back)
                (append front (insert-one el back cf)))))))

Evaluating insert-one

> (insert-one < 3 (list 1 2 4 5 7))
|(insert-one #<procedure:traced-<> 3 (1 2 4 5 7))
| (< 3 1)
| #f
| (< 3 5)
| #t
| (insert-one #<procedure:traced-<> 3 (1 2 4))
| |(< 3 1)
| |#f
| |(< 3 4)
| |#t
| |(insert-one #<procedure:traced-<> 3 (1 2))
| | (< 3 1)
| | #f
| | (< 3 2)
| | #f
| | (insert-one #<procedure:traced-<> 3 (2))
| | |(< 3 2)
| | |#f
| | (2 3)
| |(1 2 3)
| (1 2 3 4)
|(1 2 3 4 5 7)
(1 2 3 4 5 7)

(Same insert-one definition as on the previous slide.)

Every time we call insert-one, the length of the list is approximately halved!

How much work is insert-one?

(Same insert-one definition as above.)

Each time we call insert-one, the size of lst halves. So, doubling the size of the list only increases the number of calls by 1.

List Size    Number of insert-one applications
1            1
2            2
4            3
8            4
16           5
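
Another way to see this (a standard recurrence argument, not spelled out on the slide): each application of insert-one does a constant amount of work c plus one recursive application on a list about half as long, so the running time satisfies

T(n) = T(n/2) + c

which unrolls to roughly c · log₂ n + c. The number of applications therefore grows logarithmically with the length of the list.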

Remembering Logarithms

log_b n = x  means  b^x = n

What is log₂ 1024?
What is log₁₀ 1024?
Is log₁₀ n in Θ(log₂ n)?
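
Worked answers (easy to check, and consistent with the change-of-base rule on the next slide):

log₂ 1024 = 10, since 2^10 = 1024.
log₁₀ 1024 ≈ 3.01, since 10^3 = 1000.
Yes: log₁₀ n = (1 / log₂ 10) · log₂ n ≈ 0.301 · log₂ n, a constant multiple of log₂ n, so log₁₀ n is in Θ(log₂ n).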

Changing Bases

log_b n = (1 / log_k b) · log_k n

If k and b are constants, this factor is a constant, so:

Θ(log₂ n) = Θ(log₁₀ n) = Θ(log n)

There is no need to include a constant base within asymptotic operators.

Number of Applications

Assuming the list is well-balanced, the number of applications of insert-one is in Θ(log n) where n is the number of elements in the input list.

insert-sort

(define (insert-sort lst cf)
  (if (null? lst)
      null
      (insert-one (car lst) (insert-sort (cdr lst) cf) cf)))

(Using the halving insert-one definition above.)

insert-sort using halves would have running time in Θ(n log n) if we had first-half, second-half, and append procedures that run in constant time.

Orders of Growth

Is there a fast first-half procedure?

No! (At least not on lists.)

To produce the first half of a list of length n, we need to cdr down the first n/2 elements. So, first-half has running time in Θ(n).
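
A sketch of what first-half and second-half might look like on lists (assumed implementations, not the lecture's code; the helper names first-n and after-n are made up). Both walk roughly half the list, which is why they are in Θ(n):

;; assumed helper definitions, not from the lecture
(define (first-n lst n)
  ;; the first n elements of lst
  (if (or (= n 0) (null? lst))
      null
      (cons (car lst) (first-n (cdr lst) (- n 1)))))

(define (after-n lst n)
  ;; lst without its first n elements
  (if (or (= n 0) (null? lst))
      lst
      (after-n (cdr lst) (- n 1))))

(define (first-half lst)
  (first-n lst (quotient (length lst) 2)))

(define (second-half lst)
  (after-n lst (quotient (length lst) 2)))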

Making it faster

We need to either:
1. Reduce the number of applications of insert-one in insert-sort. Impossible: we need to consider each element.
2. Reduce the number of applications of insert-one in insert-one. Unlikely: each application already halves the list.
3. Reduce the time for each application of insert-one. We would need to make first-half, second-half, and append faster than Θ(n).

Happy Bird Tree by Jon Morgan and Nadine Natour

Sorted Binary Trees

(Diagram: a node holding element el, with a left subtree containing all elements x such that (cf x el) is true, and a right subtree containing all elements x such that (cf x el) is false.)

Tree Example

(Diagram: a sorted binary tree with cf: <, containing the elements 1, 2, 3, 4, 5, 7, and 8.)

Representing Trees

(define (make-tree left el right)   ; left and right are trees (null is a tree)
  (cons el (cons left right)))

(define (tree-element tree)         ; tree must be a non-null tree
  (car tree))

(define (tree-left tree)            ; tree must be a non-null tree
  (car (cdr tree)))

(define (tree-right tree)           ; tree must be a non-null tree
  (cdr (cdr tree)))
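
A quick illustration of how the constructor and selectors fit together (a hypothetical interaction):

> (define t (make-tree (make-tree null 1 null) 2 null))
> (tree-element t)
2
> (tree-element (tree-left t))
1
> (tree-right t)
()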

Representing Trees

(make-tree (make-tree (make-tree null 1 null)
                      2
                      null)
           5
           (make-tree null 8 null))

(Diagram: the tree with 5 at the root, 2 as its left child, 1 as the left child of 2, and 8 as the right child of 5.)

insert-one-tree

(define (insert-one-tree cf el tree)
  (if (null? tree)
      (make-tree null el null)
      (if (cf el (tree-element tree))
          (make-tree (insert-one-tree cf el (tree-left tree))
                     (tree-element tree)
                     (tree-right tree))
          (make-tree (tree-left tree)
                     (tree-element tree)
                     (insert-one-tree cf el (tree-right tree))))))

If the tree is null, make a new tree with el as its element and no left or right trees. Otherwise, decide whether el belongs in the left or right subtree, insert it into that subtree, and leave the other subtree unchanged.
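
A quick check of the behavior (a hypothetical interaction, using the tree selectors defined earlier):

> (define t (insert-one-tree < 5 null))
> (tree-element t)
5
> (define t2 (insert-one-tree < 2 t))
> (tree-element (tree-left t2))
2
> (tree-right t2)
()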

How much work is insert-one-tree?

(Same insert-one-tree definition as on the previous slide.)

Each time we call insert-one-tree, the size of the tree approximately halves (if it is well balanced). Each application is constant time. The running time of insert-one-tree is in Θ(log n) where n is the number of elements in the input tree, which must be well-balanced.

insert-sort-helper

(define (insert-sort-helper cf lst)
  (if (null? lst)
      null
      (insert-one-tree cf (car lst)
                       (insert-sort-helper cf (cdr lst)))))

No change (other than using insert-one-tree)… but it evaluates to a tree, not a list!

(((() 1 ()) 2 ()) 5 (() 8 ()))
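
To finish the sort we still need to turn the tree back into a sorted list. That step is not shown on this slide; a minimal sketch of what it might look like, as an in-order walk using the tree selectors defined earlier (the name extract-elements is an assumption):

(define (extract-elements tree)
  ;; assumed helper, not shown in this transcript:
  ;; walk the tree in order: left subtree, then the element, then right subtree
  (if (null? tree)
      null
      (append (extract-elements (tree-left tree))
              (cons (tree-element tree)
                    (extract-elements (tree-right tree))))))

A full sort could then be expressed as (extract-elements (insert-sort-helper cf lst)).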

Charge
- Exam 1 is out Friday, due Monday.
- Exam review: Wednesday, 6:30 in Olsson 228E.