1 Lecture 16: Lists and vectors Binary search, Sorting.


1 Lecture 16: Lists and vectors Binary search, Sorting

2 How can we implement a pair? Each pair consists of two consecutive memory locations. The first holds a pointer to the car; the second holds a pointer to the cdr. Data is typed, so we know whether we have a pointer or not.
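The two-slot picture above can be sketched in Python (a sketch only; the names cons/car/cdr mirror the Scheme primitives, and a two-element list stands in for the two consecutive memory locations):

```python
def cons(a, d):
    # A pair is two adjacent "slots": one for the car, one for the cdr.
    return [a, d]

def car(p):
    return p[0]

def cdr(p):
    return p[1]

# The list (1 2 3) as a chain of pairs, terminated by None:
lst = cons(1, cons(2, cons(3, None)))
print(car(lst))        # 1
print(car(cdr(lst)))   # 2
```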

3 Vectors – storing lists differently We have a pointer to the address base where the list starts, and we assume each element takes the same amount of space (say 1). The first element is at address base, the second element is at address base+1, and the k'th element is at address base+(k-1). So we can directly access the k'th element of the list.

4 Lists vs. Vectors On the one hand: we can directly access the k'th element of a vector in O(1) time, while we need k operations to access the k'th element of a list. On the other hand: lists are more flexible. We can add/delete elements in between, reverse the order of pointers, shuffle them, … Conclusion: different structures suit different purposes.
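The O(k) vs. O(1) access costs can be illustrated with a small Python sketch (illustrative only: tuples stand in for cons cells, and a Python list stands in for a vector; list_ref counts the pointer-chasing steps):

```python
def cons(a, d):
    return (a, d)

def list_ref(pair, k):
    # Reaching index k requires k pointer-chasing steps: O(k).
    steps = 0
    while k > 0:
        pair, k, steps = pair[1], k - 1, steps + 1
    return pair[0], steps

lst = cons(10, cons(20, cons(30, cons(40, None))))
vec = [10, 20, 30, 40]     # a vector: one address computation, base + k

print(list_ref(lst, 3))    # (40, 3) -- 3 steps to reach the element
print(vec[3])              # 40      -- direct access, O(1)
```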

5 The vector interface.
Methods:
  (vector? obj)           ; returns true if obj is a vector, false otherwise
  (vector-length vec)     ; returns the length of the vector vec
  (vector-ref vec k)      ; returns the element of vec with index k;
                          ; k should be between 0 and length-1
Constructors:
  (vector ...)            ; create a vector from 0 or more elements
  (make-vector k)         ; create a vector of length k
  (make-vector k init)    ; same, but each element is initialized to init
Mutator:
  (vector-set! vec k elt) ; stores elt at index k of the vector vec;
                          ; k should be between 0 and length-1

6 Examples
(define vec1 (vector 9 7 4 'anna))
vec1 ==> #(9 7 4 anna)
(vector-length vec1) ==> 4
(vector-ref vec1 1) ==> 7
(vector-set! vec1 2 'foo)
vec1 ==> #(9 7 foo anna)
(define vec2 (make-vector 4 '()))
vec2 ==> #(() () () ())
(vector-set! vec2 2 (vector-ref vec1 2))
vec2 ==> #(() () foo ())
(define list_of_two_vecs (list vec1 vec2))
(vector? (car list_of_two_vecs)) ==> #t

7 Binary search We are given: a sorted vector vec, and an element elm. We want to know whether elm is in vec. Example: we want to check whether elm = "Goliath the Philistine" appears in the telephone book.

8 Searching for Goliath Impatient people don't have time to think about the problem: they start reading the phone book from the beginning, until they fall asleep. Alternatively, we can use the fact that the phone book is sorted: we first look for the letter G, then for the letter o, and so on. But how do we find where the G's start?

9 Binary search We open the phone book in the middle, and we adaptively decide in which half to continue the search; then we solve recursively. In general: compare elm to the element x at the middle of the vector. If elm = x, we have found elm. If elm < x, we recursively search the first half of the vector. If elm > x, we recursively search the second half of the vector.

10 Complexity With one comparison we reduce the search space to half its size; therefore the run time is Θ(log n). We require: an order relation on the elements; the input vector must be sorted; random access to the vector.
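The Θ(log n) bound can be checked empirically with a Python sketch that counts comparisons (illustrative only; the iterative form is equivalent to the recursive search on the next slide):

```python
def bin_search(vec, elm):
    """Return (index or None, number of comparisons made)."""
    left, right, comps = 0, len(vec) - 1, 0
    while left <= right:
        mid = (left + right) // 2
        comps += 1
        if vec[mid] == elm:
            return mid, comps
        elif elm < vec[mid]:
            right = mid - 1     # continue in the first half
        else:
            left = mid + 1      # continue in the second half
    return None, comps

# Searching among n = 1024 elements takes about log2(1024) = 10 halvings:
idx, comps = bin_search(list(range(1024)), 1023)
print(idx, comps)   # 1023 11
```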

11 Implementation
(define (bin-search vec elm)
  (define (search left right)
    (if (> left right)
        nil
        (let* ((mid (average left right))
               (mid-item (vector-ref vec mid)))
          (cond ((= elm mid-item) mid)
                ((< elm mid-item) (search left (- mid 1)))
                (else (search (+ mid 1) right))))))
  (search 0 (- (vector-length vec) 1)))

(define (average x y)
  (round (/ (+ x y) 2)))

12 Sorting Input: a collection of compound data elements; each element has a key from some ordered domain. In our examples there will be just keys. Output: the same elements, ordered according to the keys in increasing order. As we saw, sorting can sometimes be useful, e.g., for searching in a phone book. But how do we create the phone book?

13 Bubble sort Basic step: the algorithm compares neighbors, and exchanges them when they are out of order. A round: go through the elements from last to first, performing this check on each pair of neighbors. Run n rounds. Correctness: after the first round, the smallest element is first; after the second round, the first two elements are correct; and so on.

14 Bubble sort [figure: example run showing the first and second rounds]

15 Implementation of Bubble sort
(define (bubble-sort vec)
  (define n (vector-length vec))
  (define (iter i)
    (define (bubble j)
      (if (>= j i)
          (let ((prev (vector-ref vec (- j 1)))
                (cur  (vector-ref vec j)))
            (cond ((> prev cur)
                   (vector-set! vec (- j 1) cur)
                   (vector-set! vec j prev)))
            (bubble (- j 1)))))
    (cond ((< i n)
           (bubble (- n 1))
           (iter (+ i 1)))))
  (iter 1))

16 The complexity of Bubble sort Number of comparisons is: (n-1) + (n-2) + … + 1 = n*(n-1)/2 = Θ(n²). To sort the Tel-Aviv phone book, we would need about (10^7)^2 = 10^14 operations. By the time the phone book is ready we might need a new one.
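The n*(n-1)/2 count can be verified with a Python sketch of the same algorithm, instrumented to count comparisons (illustrative only; indices follow the Scheme version: round i bubbles from the last pair down to position i):

```python
import random

def bubble_sort(vec):
    """Sort vec in place; return the number of comparisons made."""
    n, comps = len(vec), 0
    for i in range(1, n):                    # round i
        for j in range(n - 1, i - 1, -1):    # bubble from the end down to i
            comps += 1
            if vec[j - 1] > vec[j]:
                vec[j - 1], vec[j] = vec[j], vec[j - 1]
    return comps

data = list(range(100))
random.shuffle(data)
comps = bubble_sort(data)
print(comps, data == list(range(100)))   # 4950 True  (100*99/2 = 4950)
```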

17 Another approach Start with an empty list, and always maintain a sorted (sub)list. Each time, insert another element at the right location in the list. If each insert took O(log n), sorting would take O(n log n). Problem: with vectors we can find the right place in O(log n), but we cannot insert in the middle; with lists we can insert in the middle, but we cannot find the right location fast. The approach can be made to work using more sophisticated data structures (e.g., balanced search trees).
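The "maintain a sorted sublist" idea can be sketched in Python with the standard-library bisect module; note that this exhibits exactly the problem described above: bisect finds the position in O(log n), but inserting into an array still shifts elements, so each insert is O(n) overall:

```python
import bisect

def insertion_sort(items):
    out = []                   # the sorted sublist, initially empty
    for x in items:
        # Binary search for the position: O(log n).
        # The insert itself shifts elements: O(n) in an array.
        bisect.insort(out, x)
    return out

print(insertion_sort([5, 2, 8, 1]))   # [1, 2, 5, 8]
```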

18 Divide and Conquer Facts of life: solving a 5,000 piece puzzle is difficult. Solving a 1,000 piece puzzle is doable. Solving a 500 piece puzzle is easy. Solving a 1 piece puzzle is trivial. One way to solve a 5,000 piece puzzle: find a way to divide the 5,000 pieces into five groups that can be solved separately; solve each separately, and combine the five solutions to get a solution for the whole puzzle. For example: all the pieces on the border have a straight edge, so take them alone and build the border; all the pieces of the windmill have brown in them.

19 Divide and conquer If we: 1. know how to break a problem into smaller problems, such that 2. given solutions to the smaller problems we can easily find a solution to the bigger problem, then we get an algorithm: 1. divide, and solve the smaller pieces recursively; 2. solve the big problem using the solutions to the smaller problems.

20 Quicksort The algorithm: 1. Choose some element, called the pivot. 2. Split the elements into three groups: smaller than the pivot, equal to the pivot, larger than the pivot. 3. Recursively sort the smaller and the larger groups independently.

21 Quicksort (Cont.)
(define (quicksort lst)
  (if (null? lst)
      nil
      (let ((pivot (car lst)))
        (let ((low  (filter (lambda (x) (< x pivot)) lst))
              (high (filter (lambda (x) (> x pivot)) lst))
              (same (filter (lambda (x) (= x pivot)) lst)))
          (append (append (quicksort low) same)
                  (quicksort high))))))

22 Quicksort -- brief analysis T(n) <= cn + T(m) + T(n-m-1), where m is the number of elements smaller than the pivot. [figure: recursion tree – a problem of size n splits into subproblems of sizes m and n-m-1; each level of the tree costs at most cn.] So the running time is O(d*n), where d is the depth of the tree.

23 Quicksort -- brief analysis Best case: when at each stage m is about n/2, i.e., the algorithm partitions the elements into two sets of about equal size. This gives depth O(log n) and time T(n) = O(n log n). Worst case: when m=0 (or m=n-1), giving depth n and run time T(n) = Θ(n²). Average case: average over what? How do we pick the pivot? Picking it "correctly", the algorithm is fast "on average".
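The best/worst-case depths can be observed directly with a Python sketch that measures the recursion depth of the partition scheme above, taking the first element as pivot (illustrative only):

```python
import random

def qs_depth(lst):
    """Recursion depth of quicksort when the pivot is the first element."""
    if not lst:
        return 0
    pivot = lst[0]
    low  = [x for x in lst if x < pivot]
    high = [x for x in lst if x > pivot]
    return 1 + max(qs_depth(low), qs_depth(high))

n = 512
print(qs_depth(list(range(n))))          # 512: sorted input hits the worst case, depth n
print(qs_depth(random.sample(range(n), n)))  # random input: typically a few dozen, O(log n)
```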

24 Merge Sort Another divide and conquer algorithm. Input: a list with n elements. Algorithm: 1. Divide the list into the first half and the second half. 2. Recursively sort each half. 3. Merge the two sorted lists.

25 Merge Input: two sorted lists (of possibly different lengths). Output: one sorted list, with the same set of elements as the union of the two lists. Algorithm: put a pointer at the beginning of each list. At each step pick the smaller of the two pointed-to elements, output it, and advance that pointer to the next element in its list.

26 Implementing merge
(define (merge list1 list2)
  (cond ((null? list1) list2)
        ((null? list2) list1)
        ((< (car list1) (car list2))
         (cons (car list1) (merge (cdr list1) list2)))
        (else
         (cons (car list2) (merge list1 (cdr list2))))))
Complexity: linear in the size of the lists.
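The slides give merge but leave the top-level merge sort implicit; a Python sketch of the whole algorithm, mirroring the Scheme merge above (illustrative only):

```python
def merge(l1, l2):
    # Pick the smaller head at each step, as in the slide's merge.
    if not l1:
        return l2
    if not l2:
        return l1
    if l1[0] < l2[0]:
        return [l1[0]] + merge(l1[1:], l2)
    return [l2[0]] + merge(l1, l2[1:])

def merge_sort(lst):
    # Split into halves, sort each recursively, merge the results.
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))

print(merge_sort([4, 1, 3, 2]))   # [1, 2, 3, 4]
```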

27 Running time Worst case: T(n) = 2T(n/2) + cn, with T(2) = c.
T(n) = 2 (2T(n/4) + cn/2) + cn
     = 4 T(n/4) + cn + cn
     = 8 T(n/8) + cn + cn + cn
     = 2^k T(n/2^k) + k*cn
     = c n log(n)

28 Can we do better? Theorem: any sorting algorithm that is based on comparisons requires Ω(n log n) comparisons. Proof idea: there are n! possible orderings, and we need to distinguish them all. Thus we need about log(n!) comparisons, and that's Θ(n log n). Making it precise: well, not now.
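The step from log(n!) to Θ(n log n) can be made slightly more concrete (a sketch of the standard bound, keeping only the larger half of the factors for the lower direction):

```latex
\log_2(n!) \;=\; \sum_{k=1}^{n} \log_2 k
           \;\ge\; \sum_{k=n/2}^{n} \log_2 k
           \;\ge\; \frac{n}{2}\,\log_2\frac{n}{2} \;=\; \Omega(n \log n),
\qquad
\log_2(n!) \;\le\; n \log_2 n .
```

Together the two inequalities give log(n!) = Θ(n log n).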

29 Q: We want to sort 52 cards, and we don't like the log(52) factor. Can we sort cards faster? Yes! But not based on comparisons. Radix sort (this simple variant is usually called counting sort): Input: n elements, each between 0 and s. Output: the elements sorted. Algorithm: create a vector of length s+1, initialized with 0 everywhere. Go over the input, and for each element, if its value is k, increase vec[k] by 1. Complexity: O(n+s).

30 Implementing Radix Sort.
(define (radix-sort lst m)
  ; sorts a list of elements between 0..m
  (define vec (make-vector (+ m 1) 0))
  (define (help-func sub-list)
    (cond ((null? sub-list) 'done)
          (else
           (let* ((elm (car sub-list))
                  (val (vector-ref vec elm)))
             (vector-set! vec elm (+ val 1))
             (help-func (cdr sub-list))))))
  (help-func lst)
  vec)

(radix-sort (list 9 3 7 3) 10) ==> #(0 0 0 2 0 0 0 1 0 1 0)
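Note that the Scheme code above returns only the counts vector; producing the sorted output requires one more pass that reads the counts back out. A Python sketch of the full idea (illustrative only):

```python
def counting_sort(lst, m):
    """Sort a list of integers in 0..m in O(n + m) time."""
    # Histogram pass: vec[k] = number of occurrences of value k.
    vec = [0] * (m + 1)
    for elm in lst:
        vec[elm] += 1
    # Expansion pass: emit each value k, vec[k] times, in increasing order.
    out = []
    for k in range(m + 1):
        out.extend([k] * vec[k])
    return out

print(counting_sort([9, 3, 7, 3], 10))   # [3, 3, 7, 9]
```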