Sorting by Placement and Shift, by Sergi Elizalde and Peter Winkler. Presented by 周于荃 (B95902098, CS senior).



ABSTRACT & BASIC DEFINITION

Abstract We study a sorting procedure in which the final destination (home) of each item is known, much like hand-sorting a stack of files.

Abstract We want to show that in the worst case, the algorithm terminates after at most 2^(n-1) - 1 steps.

Basic definition Let π be a permutation of n items π(1), ..., π(n). Any number i with π(i) ≠ i may be "placed" into its proper position; the numbers in the positions between i and π(i) are shifted up or down as necessary.
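One way to model the placement operation is the following Python sketch (the function name, list representation, and 1-indexing convention are my own, not from the paper):

```python
def place(perm, i):
    """Place the element at position i (1-indexed) into its home:
    remove it and reinsert it at the position equal to its value.
    The entries between the two positions shift by one as a result."""
    v = perm[i - 1]
    if v == i:
        raise ValueError("element is already in its home position")
    rest = perm[:i - 1] + perm[i:]            # remove the element
    return rest[:v - 1] + [v] + rest[v - 1:]  # reinsert at its home

# Examples:
print(place([3, 1, 2], 1))  # the 3 goes home: [1, 2, 3]
print(place([2, 1, 3], 1))  # the 2 goes home, shifting the 1: [1, 2, 3]
```

Note that a single placement can dislodge numbers that were previously at home, which is what makes the worst-case analysis nontrivial.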


We call this operation “Homing”.

FAST HOMING

Fast homing A placement of either the least or the greatest number not currently in its home will be called extremal. After an extremal placement, the number placed will never subsequently be dislodged from its home.

Fast homing - Theorem 4.1 Any algorithm that always places the smallest or largest available number will terminate in at most n - 1 steps. The proof is immediate: this is exactly how one hand-sorts a stack of files.

Fast homing - Theorem 4.1 The precise number of steps required is the smallest j such that the items which belong in positions j+1, j+2, ..., n are already in the correct order.
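The extremal strategy of Theorem 4.1 is easy to simulate. A self-contained sketch (all names are mine):

```python
def extremal_steps(perm):
    """Repeatedly place the smallest number not at home and count
    the placements. By Theorem 4.1 at most n - 1 are needed."""
    perm = list(perm)
    steps = 0
    while any(v != i for i, v in enumerate(perm, 1)):
        v = min(v for i, v in enumerate(perm, 1) if v != i)
        i = perm.index(v)         # 0-based current position of v
        del perm[i]               # remove v ...
        perm.insert(v - 1, v)     # ... and reinsert it at its home
        steps += 1
    return steps

print(extremal_steps([4, 3, 2, 1]))  # reverse permutation: 3 steps (= n - 1)
```

The key invariant is that the smallest misplaced value always sits to the right of its home, so placing it never disturbs the values already homed below it.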

Fast homing - Theorem 4.2 The expected number of steps required by random homing from π with n items is at most (n(n+1) - 2)/4. Proof: by direct calculation (omitted here). We say a permutation is in stage k when k of the extremal numbers are home. Example: 1,2,3,7,4,6,5,8,9 is in stage 5 (1, 2, 3 and 8, 9 are home).

Fast homing - Theorem 4.3 Let k be the length of the longest increasing subsequence of π. Then no sequence of fewer than n - k placements can sort π. Proof sketch: each placement can lengthen the longest increasing subsequence by at most one, and the identity has one of length n.
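The bound of Theorem 4.3 can be computed with the standard patience-sorting algorithm for the longest increasing subsequence. A sketch (function names are mine):

```python
import bisect

def lis_length(perm):
    """Length of the longest increasing subsequence, in O(n log n)."""
    tails = []  # tails[j] = smallest tail of an increasing subsequence of length j+1
    for v in perm:
        j = bisect.bisect_left(tails, v)
        if j == len(tails):
            tails.append(v)
        else:
            tails[j] = v
    return len(tails)

def placement_lower_bound(perm):
    """No fewer than n - k placements can sort perm (Theorem 4.3)."""
    return len(perm) - lis_length(perm)

print(placement_lower_bound([5, 4, 3, 2, 1]))  # 4: the reverse needs n - 1 steps
```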

Fast homing - Corollary 4.1 The reverse permutation n, ..., 1 requires n - 1 steps. An easy consequence of Theorem 4.3, since its longest increasing subsequence has length 1.

Fast homing - Theorem 4.4 The reverse permutation is the only permutation requiring n - 1 steps. Proved by induction on n.

SLOW HOMING

Slow homing What if we home with deliberately wasteful steps? Consider the permutation 2, 3, 4, ..., n-1, n, 1. Sorting it takes 2^(n-1) - 1 placements if we always place the left-most number not at home, a pattern reminiscent of the Tower of Hanoi.
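This exponential behavior can be checked by simulation. A sketch of the left-most strategy on 2, 3, ..., n, 1 (names are mine):

```python
def leftmost_steps(n):
    """Home the permutation 2, 3, ..., n, 1, always placing the
    left-most number not at home, and count the placements."""
    perm = list(range(2, n + 1)) + [1]
    steps = 0
    while any(v != i for i, v in enumerate(perm, 1)):
        i, v = next((i, v) for i, v in enumerate(perm, 1) if v != i)
        del perm[i - 1]        # remove the left-most misplaced number ...
        perm.insert(v - 1, v)  # ... and reinsert it at its home
        steps += 1
    return steps

for n in range(2, 7):
    print(n, leftmost_steps(n))  # 2^(n-1) - 1 steps: 1, 3, 7, 15, 31
```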

Slow homing Larson conjectured that 2^(n-1) - 1 is the maximum. Indeed, although many other, more complex permutations can also support 2^(n-1) - 1 steps, none permits more.

Slow homing – Theorem 5.1 Homing always terminates in at most 2^(n-1) - 1 steps. The proof requires several lemmas and a backward (reverse) analysis.

Slow homing – Evicting Evicting is the reverse of homing: choose a number which is at home and displace it. Our goal is to prove that, beginning with the identity permutation 1, 2, ..., n, at most 2^(n-1) - 1 displacements are possible. The proof is by induction on n: the claim is trivial when n = 1; suppose it holds for all k < n.
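For small n, the claim can be verified by exhaustive search over displacement sequences (the search terminates because, by the theorem being proved, no infinite displacement sequence exists). A sketch (names are mine):

```python
from functools import lru_cache

def evictions(perm):
    """All permutations reachable from perm by one displacement:
    take a value sitting in its home position and reinsert it
    anywhere else, shifting the entries in between."""
    n = len(perm)
    for i, v in enumerate(perm, 1):
        if v == i:  # v is at home: evict it
            rest = perm[:i - 1] + perm[i:]
            for j in range(n):
                if j != i - 1:
                    yield rest[:j] + (v,) + rest[j:]

@lru_cache(maxsize=None)
def longest_chain(perm):
    """Length of the longest displacement sequence starting from perm."""
    return max((1 + longest_chain(q) for q in evictions(perm)), default=0)

for n in range(1, 6):
    print(n, longest_chain(tuple(range(1, n + 1))))  # 0, 1, 3, 7, 15
```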

Slow homing – Lemma 5.1 After 2^(n-2) displacements, both 1 and n have been displaced and will never be displaced again. That they are never displaced again is clear when we view displacement as the reverse of homing: an extremal number, once homed, is never dislodged.

Slow homing – code and weight We associate with each permutation π a code α(π), and with each code α a weight w = w(α). The code is a sequence α = (a_2, a_3, ..., a_{n-1}) of length |α| = n - 2 over the alphabet {+, -, 0}.

Slow homing – code and weight a_i = "+" if π^{-1}(i) > i; a_i = "-" if π^{-1}(i) < i; a_i = "0" if π^{-1}(i) = i.

Slow homing – code and weight The weight w(α) is defined for codes of all lengths by recursion. If a_i = 0 for every i, we put w(α) = 0. Otherwise, for each i: if a_i = "-", let d_i = i - 2; if a_i = "+", let d_i = n - 1 - i. Thus d_i is the number of symbols to the left of a "-" or to the right of a "+".


Let i be the index maximizing d_i, and let α[i] be the code of length |α| - 1 obtained by deleting the i-th entry of α. Then w(α) = w(α[i]) + 2^{d_i}.
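Both the code and the weight are straightforward to compute. A sketch (names are mine; ties among the d_i are broken by taking the first maximum, which by Corollary 5.1 does not affect the result):

```python
def code_of(perm):
    """The code alpha(pi): entry a_i (for i = 2..n-1) records whether
    the current position of i is above (+), below (-), or at (0) i."""
    pos = {v: p for p, v in enumerate(perm, 1)}  # pos[i] = pi^{-1}(i)
    return ''.join('+' if pos[i] > i else '-' if pos[i] < i else '0'
                   for i in range(2, len(perm)))

def weight(code):
    """w(alpha): repeatedly strip the nonzero entry with the largest d
    (symbols to the left of a '-', to the right of a '+'), adding 2^d."""
    code, total = list(code), 0
    while any(c != '0' for c in code):
        ds = [(i if c == '-' else len(code) - 1 - i) if c != '0' else -1
              for i, c in enumerate(code)]
        i = ds.index(max(ds))
        total += 2 ** ds[i]
        del code[i]
    return total

print(weight('0000'))  # 0 (the minimum, Lemma 5.2)
print(weight('++--'))  # 15 = 2^4 - 1 (the maximum, Lemma 5.2)
```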


Slow homing – Lemma 5.2 The minimum of w(α) over codes α of length k is 0, attained by the all-0 code, and the maximum is 2^k - 1, attained by codes of the form +^p -^q.

Slow homing – Lemma 5.3 Let α = β +^p γ -^q δ, where |β| = |δ|, β contains no "+", δ contains no "-", and γ neither begins with "+" nor ends with "-". Then w(α) = w(βγδ) + 2^(p+|γ|+q+|β|) - 2^(|γ|+|β|). Proof: by direct calculation.

Slow homing – Corollary 5.1 The definition of the weight of a code does not depend on how ties are broken when d_i = d_j. Proof: by Lemma 5.3.

Slow homing – Lemma 5.4 For any codes γ and δ, where γ has no "+", w(γδ0) ≤ w(γδ) + 2^(|δ|) - 1. Proof: by direct calculation.

Slow homing – Lemma 5.5 Let α be any code, and β the result of changing some a_i = 0 to b_i = "+" or b_i = "-". Then w(β) > w(α).

Slow homing – Proof of Lemma 5.5 The derivations of w(β) and w(α) are the same until b_i is stripped. Let β' and α' be the corresponding codes at that point (just before b_i is stripped). We can write β' as γδb_iε, where γ contains no "+", ε contains no "-", and |ε| ≤ |γ|.

Slow homing – Proof of Lemma 5.5 w(β') = w(γδε) + 2^(|γ|+|δ|) = w(γδ0^|ε|) + w(ε) + 2^(|γ|+|δ|) ≥ w(γδ0^(|ε|+1)) - (2^(|δ|+|ε|) - 1) + w(ε) + 2^(|γ|+|δ|) (by Lemma 5.4) > w(γδ0^(|ε|+1)) + w(ε) = w(γδ0ε) = w(α').

Slow homing – Lemma 5.6 Let π be any permutation of {1, ..., n} in which π(1) ≠ 1 and π(n) ≠ n, and let π' be the result of applying some displacement to π. Then w(α(π')) > w(α(π)).

Slow homing – Proof of Lemma 5.6 A displacement changes the code entry of the displaced number from 0 to "+" or "-". Assume it changes to "-". The displacement may also change some "-" entries to 0, or some 0 entries to "+". We need not worry about the 0 => "+" changes, since Lemma 5.5 shows they only increase the weight.

Slow homing – Proof of Lemma 5.6 The entries that change from "-" to 0 belong to numbers less than i. In the extreme case, we may assume that all the "-" entries below i change to 0. Let j be the position of the right-most "-" below i, and let 2^k be the contribution of the "-" in position i to the computation of w(α').

Slow homing – Proof of Lemma 5.6 If any "+" entries between a_j and a_i are stripped after the "-" in position i in α', then their contribution to w(α') is less than their contribution to w(α). Let t be the number of such "+" entries. Their total contribution to w(α') is at most 2^(k-1) + 2^(k-2) + ... + 2^(k-t) = 2^(k-t)(2^t - 1), and the difference between their contributions to w(α) and w(α') is at most 2^(k-t)(2^t - 1) as well.

Slow homing – Proof of Lemma 5.6 The total contribution to w(α) of the "-" entries below a_i is at most 2^(k-t) - 1, since each adds a different power of 2 less than 2^(k-t). We conclude that w(α') ≥ w(α) + 2^k - (2^(k-t) - 1) - 2^(k-t)(2^t - 1) > w(α).

Slow homing – Conclusion Theorem 5.1 is an easy consequence of the preceding lemmas, in particular Lemma 5.6. We conclude that homing always terminates in at most 2^(n-1) - 1 steps.

COUNTING BAD CASES

Counting bad cases With the above analysis, we can determine exactly which permutations admit a worst case of 2^(n-1) - 1 steps.

Counting bad cases Let M_n be the set of worst-case permutations. Let the height h(π) be the maximum number of steps homing can take from π (equivalently, the maximum number of displacements from the identity to π). Let τ_n denote the permutation n, 2, 3, ..., n-1, 1.

Counting bad cases – Lemma 6.1 h(τ_n) = 2^(n-2). Proof: place n first; that leaves the permutation 2, 3, ..., n-1, 1. From the analysis above, h(2, 3, ..., n-1, 1) = 2^(n-2) - 1, so h(τ_n) = 1 + 2^(n-2) - 1 = 2^(n-2).

Counting bad cases – Lemma 6.2 For any permutation with code α = +^i 0^k -^j, there is a sequence of 2^(k-1) displacements that ends in a permutation with code +^i 0^(k-1) -^(j+1). Moreover, all the displacements in the sequence are uniquely determined, except possibly the last one.

Counting bad cases – Lemma 6.2 Proof: by induction on k.

Counting bad cases Since w(+^i 0^(k-1) -^(j+1)) = w(+^i 0^k -^j) + 2^(k-1), and by Lemma 6.2 the passage from +^i 0^k -^j to +^i 0^(k-1) -^(j+1) takes 2^(k-1) displacements, the weight must increase by exactly one at each step. This can only hold when the sequence of displacements is the one described above.

Counting bad cases - Firing We call the above sequence of displacements "firing" i+k+1 "to the left". If the last displacement is into position i-1, we call it the "shortest firing". Symmetrically, there are firings to the right.

Counting bad cases – Lemma 6.3 A permutation belongs to M_n if and only if it can be obtained from τ_n by successively applying n-2 left and right firings.

Counting bad cases – Lemma 6.3 (<=) We have α(τ_n) = 0^(n-2). The first firing transforms this code into 0^(n-3)- using 2^(n-3) displacements; the second firing uses 2^(n-4), and so on. After n-2 firings, the total number of displacements comes to 2^(n-2) - 1, ending with a permutation σ whose code is +^k -^(n-2-k).

Counting bad cases – Lemma 6.3 By Lemma 6.1, h(σ) ≥ 2^(n-2) + 2^(n-2) - 1 = 2^(n-1) - 1. By Theorem 5.1 this is an equality, so σ belongs to M_n.

Counting bad cases – Lemma 6.3 (=>) By Lemma 6.1, any permutation of height 2^(n-1) - 1 must be obtained from τ_n in 2^(n-2) - 1 displacements. By Lemma 5.6, each displacement must increase the weight by exactly one.

Counting bad cases – Lemma 6.3 If the first displacement on τ_n introduces a "-" into the code, then the first 2^(n-3) displacements must constitute a left firing; otherwise there would be a jump in the weight increase. Then the next 2^(n-4) displacements constitute the second firing, and so on.

Counting bad cases - Corollary 6.1 For n ≥ 2, |M_n| ≤ (n-1)!. This follows from Lemma 6.3, together with the fact that there are n+1 choices for the last step of a firing.

Counting bad cases – Proposition 6.1 |M_n| ≥ 2^(n-2). We show that if we start from τ_n and perform only shortest firings, then no permutation is obtained in more than one way.
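For small n, the set M_n can be computed exhaustively, confirming both bounds. A sketch (names are mine; the height is computed as the longest displacement chain from the identity):

```python
def evictions(perm):
    """One displacement: move a value that is at home to any other
    position, shifting the entries in between."""
    n = len(perm)
    for i, v in enumerate(perm, 1):
        if v == i:
            rest = perm[:i - 1] + perm[i:]
            for j in range(n):
                if j != i - 1:
                    yield rest[:j] + (v,) + rest[j:]

def heights(n):
    """h(pi) = length of the longest displacement sequence from the
    identity ending at pi, found by relaxing until a fixpoint."""
    h = {tuple(range(1, n + 1)): 0}
    changed = True
    while changed:
        changed = False
        for p, hp in list(h.items()):
            for q in evictions(p):
                if h.get(q, -1) < hp + 1:
                    h[q] = hp + 1
                    changed = True
    return h

for n in (3, 4):
    h = heights(n)
    worst = [p for p, v in h.items() if v == 2 ** (n - 1) - 1]
    print(n, len(worst))  # e.g. |M_3| = 2; in general 2^(n-2) <= |M_n| <= (n-1)!
```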


MY PRESENTATION IS OVER. THANK YOU.