
Sorting

How fast can we sort? All the sorting algorithms we have seen so far (e.g., insertion sort, merge sort, quicksort, heapsort) are comparison sorts: they use only comparisons to determine the relative order of elements. The best worst-case running time we have seen for comparison sorting is O(n lg n). Is O(n lg n) the best we can do? Decision trees can help us answer this question.

Decision-tree example Sort 〈 a1, a2, …, an 〉. Each internal node is labeled i:j for i, j ∈ {1, 2, …, n}. The left subtree shows the subsequent comparisons if ai ≤ aj; the right subtree shows the subsequent comparisons if ai > aj.


Sort 〈 a1, a2, a3 〉 = 〈 9, 4, 6 〉: each leaf contains a permutation 〈 π(1), π(2), …, π(n) 〉 indicating that the ordering aπ(1) ≤ aπ(2) ≤ … ≤ aπ(n) has been established.

Decision-tree model A decision tree can model the execution of any comparison sort: one tree for each input size n (the tree is specific to a given n; it is not generic). View the algorithm as splitting whenever it compares two elements. The tree contains the comparisons along all possible instruction traces. The running time of the algorithm = the length of the path taken. Worst-case running time = height of the tree.

Lower bound for decision-tree sorting Theorem. Any decision tree that can sort n elements must have height Ω(n lg n). Proof. The tree must contain ≥ n! leaves, since there are n! possible permutations. A height-h binary tree has ≤ 2^h leaves. Thus n! ≤ 2^h, so h ≥ lg(n!) (since lg is monotonically increasing) ≥ lg((n/e)^n) (Stirling's formula) = n lg n − n lg e = Ω(n lg n), since lg e is a constant. (A function f is monotonically increasing if x ≤ y implies f(x) ≤ f(y); that is, f preserves order.)
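Written out as a single chain of bounds (this merely restates the derivation above in display form):

```latex
h \;\ge\; \lg(n!) \;\ge\; \lg\!\left(\left(\tfrac{n}{e}\right)^{\!n}\right)
  \;=\; n\lg n - n\lg e \;=\; \Omega(n\lg n)
```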

Lower bound for comparison sorting Corollary Heapsort and merge sort are asymptotically optimal comparison sorting algorithms.

Counting Sort CountingSort(A, B, k) 1. for i = 1 to k 2. do C[i] ← 0 3. for j = 1 to n 4. do C[A[j]] ← C[A[j]] + 1 5. for i = 2 to k 6. do C[i] ← C[i] + C[i−1] 7. for j = n downto 1 8. do B[C[A[j]]] ← A[j] 9. C[A[j]] ← C[A[j]] − 1 What will be the running time? The two loops over i (lines 1–2 and 5–6) take O(k) time; the two loops over j (lines 3–4 and 7–9) take O(n) time.
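As a concrete illustration, here is a runnable Python version of the same procedure (a sketch, not from the slides; it assumes the keys are integers in the range 1..k):

```python
def counting_sort(A, k):
    """Sort a list A of integers in the range 1..k; return a new sorted list B."""
    n = len(A)
    C = [0] * (k + 1)            # C[i] counts keys equal to i (index 0 unused)
    for key in A:                # count occurrences of each key: O(n)
        C[key] += 1
    for i in range(2, k + 1):    # prefix sums: C[i] = number of keys <= i: O(k)
        C[i] += C[i - 1]
    B = [None] * n
    for key in reversed(A):      # place right-to-left so the sort is stable: O(n)
        B[C[key] - 1] = key      # C[key] is the 1-based final position of this key
        C[key] -= 1
    return B

print(counting_sort([4, 1, 3, 4, 3], k=4))   # [1, 3, 3, 4, 4]
```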

Analysis

Running time If k = O(n), then counting sort takes Θ(n) time. But didn't we just show that sorting takes Ω(n lg n) time? Answer: comparison sorting takes Ω(n lg n) time, and counting sort is not a comparison sort. In fact, not a single comparison between elements occurs!

Why don’t we always use counting sort? Because its running time depends on the range k of the elements. Could we use counting sort to sort 32-bit integers? Why or why not? Answer: no, k would be too large (2^32 = 4,294,967,296).

Stable sorting Counting sort is a stable sort: it preserves the input order among equal elements.
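A small illustration of stability (not from the slides), using Python's built-in sort, which is also stable, on (key, satellite) records:

```python
# Hypothetical records: elements with equal keys keep their original relative order.
records = [(3, "a"), (1, "b"), (3, "c"), (1, "d")]
print(sorted(records, key=lambda r: r[0]))
# [(1, 'b'), (1, 'd'), (3, 'a'), (3, 'c')]  -- 'b' stays before 'd', 'a' stays before 'c'
```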

Dynamic sets Sets that can grow, shrink, or otherwise change over time. Two types of operations: – queries return information about the set – modifying operations change the set. Common queries: Search, Minimum, Maximum, Successor, Predecessor. Common modifying operations: Insert, Delete.

Dictionary Stores a set S of elements, each with an associated key (an integer value). Operations: Search(S, k): return a pointer to an element x in S with key[x] = k, or NIL if no such element exists. Insert(S, x): insert element x into S, that is, S ← S ∪ {x}. Delete(S, x): remove element x from S.

Implementing a dictionary
                Search      Insert    Delete
linked list     Θ(n)        Θ(1)      Θ(1)
sorted array    Θ(log n)    Θ(n)      Θ(n)
hash table      Θ(1)        Θ(1)      Θ(1)
Hash table: running times are average times and assume (simple) uniform hashing and a large enough table (for example, of size 2n).
Today: Binary search trees.

Binary search trees Binary search trees are an important data structure for dynamic sets. They can be used as both a dictionary and a priority queue. They accomplish many dynamic-set operations in O(h) time, where h = height of tree.

Binary search trees The root of T is denoted by root[T]; an empty tree has root[T] = NIL. Internal nodes have four fields: key (and possibly other satellite data); left: points to the left child; right: points to the right child; p: points to the parent, with p[root[T]] = NIL.
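In code, a node with exactly these fields could look like the following Python sketch (the class and names here are illustrative assumptions, not part of the lecture):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    key: int                         # the search key (satellite data could be added)
    left: Optional["Node"] = None    # left child, or None for NIL
    right: Optional["Node"] = None   # right child, or None for NIL
    p: Optional["Node"] = None       # parent; the root's parent is None

root: Optional[Node] = None          # an empty tree: root[T] = NIL
```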

Binary search trees Keys are stored only in internal nodes! (There are binary search tree variants that store keys only in the leaves …)

Binary search trees a binary tree is – a leaf or – a root node x with a binary tree as its left and/or right child Binary-search-tree property – if y is in the left subtree of x, then key[y] ≤ key[x] – if y is in the right subtree of x, then key[y] ≥ key[x] Keys don’t have to be unique …

Binary search trees (Figure: two example trees satisfying the binary-search-tree property, one of height h = 2 and one of height h = 4.) Binary-search-tree property – if y is in the left subtree of x, then key[y] ≤ key[x] – if y is in the right subtree of x, then key[y] ≥ key[x]

Inorder tree walk binary search trees are recursive structures ➨ recursive algorithms often work well Example: print all keys in order using an inorder tree walk InorderTreeWalk(x) 1. if x ≠ NIL 2. then InorderTreeWalk(left[x]) 3. print key[x] 4. InorderTreeWalk(right[x]) Correctness: follows by induction from the binary search tree property Running time? Intuitively, O(n) time for a tree with n nodes, since we visit and print each node once

Inorder tree walk InorderTreeWalk(x) 1. if x ≠ NIL 2. then InorderTreeWalk(left[x]) 3. print key[x] 4. InorderTreeWalk(right[x]) Theorem If x is the root of an n-node subtree, then the call InorderTreeWalk(x) takes Θ(n) time. Proof: – the walk takes a small, constant amount of time on an empty subtree ➨ T(0) = c for some positive constant c – for n > 0, assume the left subtree has k nodes and the right subtree n−k−1 ➨ T(n) = T(k) + T(n−k−1) + d for some positive constant d; use the substitution method to show T(n) = (c+d)n + c ■
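The same walk as runnable Python (a sketch; the minimal Node class is included only so the example is self-contained):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def inorder_tree_walk(x):
    """Print all keys of the subtree rooted at x in nondecreasing order."""
    if x is not None:                  # None plays the role of NIL
        inorder_tree_walk(x.left)      # smaller keys first
        print(x.key)
        inorder_tree_walk(x.right)     # then the larger keys

# Tiny example tree with root 5 and children 3 and 7: prints 3, 5, 7
inorder_tree_walk(Node(5, Node(3), Node(7)))
```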

Querying a binary search tree TreeSearch(x, k) 1. if x = NIL or k = key[x] 2. then return x 3. if k < key[x] 4. then return TreeSearch(left[x], k) 5. else return TreeSearch(right[x], k) Initial call: TreeSearch(root[T], k) – e.g. TreeSearch(root[T], 4), TreeSearch(root[T], 2) Running time: Θ(length of the search path), worst case Θ(h).

Querying a binary search tree – iteratively TreeSearch(x, k) 1. if x = NIL or k = key[x] 2. then return x 3. if k < key[x] 4. then return TreeSearch(left[x], k) 5. else return TreeSearch(right[x], k) IterativeTreeSearch(x, k) 1. while x ≠ NIL and k ≠ key[x] 2. do if k < key[x] 3. then x ← left[x] 4. else x ← right[x] 5. return x The iterative version is more efficient on most computers.
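A Python sketch of the iterative search (illustrative, with a minimal Node class so it runs on its own):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def tree_search(x, k):
    """Return the node with key k in the subtree rooted at x, or None if absent."""
    while x is not None and k != x.key:
        x = x.left if k < x.key else x.right   # BST property decides the direction
    return x

root = Node(6, Node(3, Node(2), Node(4)), Node(8))
print(tree_search(root, 4).key)   # 4
print(tree_search(root, 5))       # None
```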

Minimum and maximum The binary-search-tree property guarantees that – the minimum key is located in the leftmost node – the maximum key is located in the rightmost node TreeMinimum(x) 1. while left[x] ≠ NIL 2. do x ← left[x] 3. return x TreeMaximum(x) 1. while right[x] ≠ NIL 2. do x ← right[x] 3. return x

Minimum and maximum The binary-search-tree property guarantees that – the minimum key is located in the leftmost node – the maximum key is located in the rightmost node TreeMinimum(x) 1. while left[x] ≠ NIL 2. do x ← left[x] 3. return x TreeMaximum(x) 1. while right[x] ≠ NIL 2. do x ← right[x] 3. return x Running time? Both procedures visit nodes on a downward path from the root ➨ O(h) time.
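Both procedures translate directly into Python (again a self-contained sketch):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def tree_minimum(x):
    while x.left is not None:    # keep following left children
        x = x.left
    return x

def tree_maximum(x):
    while x.right is not None:   # keep following right children
        x = x.right
    return x

root = Node(6, Node(3, Node(2), Node(4)), Node(8))
print(tree_minimum(root).key, tree_maximum(root).key)   # 2 8
```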

Successor and predecessor Assume that all keys are distinct Successor of a node x node y such that key[y] is the smallest key > key[x] We can find y based entirely on the tree structure, no key comparisons are necessary … if x has the largest key, then we say x’s successor is NIL

Successor and predecessor Successor of a node x: the node y such that key[y] is the smallest key > key[x]. Two cases: 1. x has a non-empty right subtree ➨ x’s successor is the minimum in x’s right subtree 2. x has an empty right subtree ➨ x’s successor y is the node that x is the predecessor of (x is the maximum in y’s left subtree). As long as we move up the tree through right-child links (i.e., the node we come from is a right child), we are visiting smaller keys; the first ancestor reached through a left-child link is the successor …

Successor and predecessor TreeSuccessor(x) 1. if right[x] ≠ NIL 2. then return TreeMinimum(right[x]) 3. y ← p[x] 4. while y ≠ NIL and x = right[y] 5. do x ← y 6. y ← p[x] 7. return y – Successor of 15? – Successor of 6? – Successor of 4?

Successor and predecessor TreeSuccessor(x) 1. if right[x] ≠ NIL 2. then return TreeMinimum(right[x]) 3. y ← p[x] 4. while y ≠ NIL and x = right[y] 5. do x ← y 6. y ← p[x] 7. return y Running time? O(h)
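A Python sketch of the successor procedure; it needs parent pointers, which are set by hand in the tiny example below (illustrative only):

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right, self.p = key, None, None, None

def tree_minimum(x):
    while x.left is not None:
        x = x.left
    return x

def tree_successor(x):
    """Return the node with the smallest key greater than x.key, or None."""
    if x.right is not None:                 # case 1: minimum of the right subtree
        return tree_minimum(x.right)
    y = x.p                                 # case 2: climb until we leave a left subtree
    while y is not None and x is y.right:
        x, y = y, y.p
    return y

# Example tree: 3 <- 5 -> 7, parent pointers set by hand.
r, a, b = Node(5), Node(3), Node(7)
r.left, r.right, a.p, b.p = a, b, r, r
print(tree_successor(a).key)   # 5
print(tree_successor(r).key)   # 7
print(tree_successor(b))       # None
```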

Insertion TreeInsert(T, z) 1. y ← NIL 2. x ← root[T] 3. while x ≠ NIL 4. do y ← x 5. if key[z] < key[x] 6. then x ← left[x] 7. else x ← right[x] 8. p[z] ← y 9. if y = NIL 10. then root[T] ← z 11. else if key[z] < key[y] 12. then left[y] ← z 13. else right[y] ← z To insert a value v, insert a node z with key[z] = v, left[z] = NIL, and right[z] = NIL; traverse the tree down to find the correct position for z.

Insertion TreeInsert(T, z) 1. y ← NIL 2. x ← root[T] 3. while x ≠ NIL 4. do y ← x 5. if key[z] < key[x] 6. then x ← left[x] 7. else x ← right[x] 8. p[z] ← y 9. if y = NIL 10. then root[T] ← z 11. else if key[z] < key[y] 12. then left[y] ← z 13. else right[y] ← z Running time? O(h)
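A Python sketch of the insertion procedure; the tree is represented as a dict holding the root so it can be updated in place (an illustrative choice, not from the slides):

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right, self.p = key, None, None, None

def tree_insert(T, z):
    """Insert node z into the tree T = {"root": ...}."""
    y, x = None, T["root"]
    while x is not None:             # walk down to find z's position
        y = x
        x = x.left if z.key < x.key else x.right
    z.p = y
    if y is None:                    # the tree was empty
        T["root"] = z
    elif z.key < y.key:
        y.left = z
    else:
        y.right = z

T = {"root": None}
for v in [6, 3, 8, 4]:
    tree_insert(T, Node(v))          # builds the tree with root 6, children 3 and 8, and 4 under 3
```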

Deletion We want to delete node z. TreeDelete has three cases: z has no children ➨ delete z by having z’s parent point to NIL instead of to z.

Deletion We want to delete node z. TreeDelete has three cases: z has no children ➨ delete z by having z’s parent point to NIL instead of to z. z has one child ➨ delete z by having z’s parent point to z’s child instead of to z.

Deletion We want to delete node z. TreeDelete has three cases: z has no children ➨ delete z by having z’s parent point to NIL instead of to z. z has one child ➨ delete z by having z’s parent point to z’s child instead of to z. z has two children ➨ z’s successor y has either no child or one child ➨ delete y from the tree and replace z’s key and satellite data with y’s.

Deletion We want to delete node z. TreeDelete has three cases: z has no children ➨ delete z by having z’s parent point to NIL instead of to z. z has one child ➨ delete z by having z’s parent point to z’s child instead of to z. z has two children ➨ z’s successor y has either no child or one child ➨ delete y from the tree and replace z’s key and satellite data with y’s. Running time? O(h)
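A Python sketch of the three-case deletion just described (the classic CLRS 2nd-edition scheme: splice out z or its successor and, in the two-child case, copy the successor's key into z). It reuses the dict-as-tree representation from the insertion sketch:

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right, self.p = key, None, None, None

def tree_minimum(x):
    while x.left is not None:
        x = x.left
    return x

def tree_delete(T, z):
    """Delete node z from the tree T = {"root": ...}."""
    if z.left is None or z.right is None:
        y = z                          # cases 1 and 2: splice out z itself
    else:
        y = tree_minimum(z.right)      # case 3: splice out z's successor (it has no left child)
    x = y.left if y.left is not None else y.right   # y's only child, or None
    if x is not None:
        x.p = y.p
    if y.p is None:                    # y was the root
        T["root"] = x
    elif y is y.p.left:
        y.p.left = x
    else:
        y.p.right = x
    if y is not z:                     # copy the successor's key (and satellite data) into z
        z.key = y.key
```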

Minimizing the running time All operations can be executed in time proportional to the height h of the tree (instead of proportional to the number n of nodes in the tree). Worst case: h = Θ(n). Solution: guarantee a small height (balance the tree) ➨ h = Θ(log n). Next week: several schemes to balance binary search trees.

Balanced binary search trees Advantages of balanced binary search trees: – over linked lists: efficient search in Θ(log n) time – over sorted arrays: efficient insertion and deletion in Θ(log n) time – over hash tables: successor and predecessor can be found efficiently, in Θ(log n) time

Implementing a dictionary
                      Search      Insert      Delete
linked list           Θ(n)        Θ(1)        Θ(1)
sorted array          Θ(log n)    Θ(n)        Θ(n)
hash table            Θ(1)        Θ(1)        Θ(1)
binary search tree    Θ(height)   Θ(height)   Θ(height)
Balanced binary search trees ➨ height is Θ(log n), so Search, Insert, and Delete all run in Θ(log n) time.