MS 101: Algorithms Instructor Neelima Gupta

TOC: Lower Bounds (adversary arguments)

Lower Bounds

Theorem 5.1: Any algorithm that finds the min and max of n keys by comparing keys must do at least 3n/2 - 2 comparisons in the worst case.

Proof outline:
- Assume the keys are distinct.
- Let x denote the max and y the min.
- To declare x the max, we must know that every key other than x has lost some comparison: n - 1 units of information are required.
- Likewise, to declare y the min, we must know that every key other than y has won some comparison: another n - 1 units are required.
- Total: 2n - 2 units of information are required to declare that x is the max and y is the min.
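This bound is tight. Below is a minimal Python sketch (our code, not from the slides) of the standard pairwise algorithm, which uses 3n/2 - 2 comparisons for even n: each fresh pair is compared once, then only the pair winner is compared with the current max and the pair loser with the current min.

    def min_and_max(keys):
        """Return (min, max) of a non-empty list in about 3n/2 comparisons."""
        n = len(keys)
        if n % 2:                       # odd n: seed both with the first key
            lo = hi = keys[0]
            rest = keys[1:]
        else:                           # even n: seed with the first pair (1 comparison)
            a, b = keys[0], keys[1]
            lo, hi = (a, b) if a < b else (b, a)
            rest = keys[2:]
        for i in range(0, len(rest), 2):
            a, b = rest[i], rest[i + 1]
            small, big = (a, b) if a < b else (b, a)  # 1 comparison per pair
            if small < lo:                            # 1 comparison against current min
                lo = small
            if big > hi:                              # 1 comparison against current max
                hi = big
        return lo, hi

For even n this makes 1 + 3(n - 2)/2 = 3n/2 - 2 comparisons, matching the lower bound exactly.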

Adversary: devises a strategy that gives away the minimum information.

Key status   Meaning
W            Has won at least one comparison and never lost
L            Has lost at least one comparison and never won
WL           Has won and lost at least one comparison
N            Has not yet participated in a comparison

Status of keys x and y       Adversary chooses                 New status   Units of new information
N, N                         x > y                             W, L         2
W, N or WL, N                x > y                             ..., L       1
L, N                         x < y                             ..., W       1
W, W                         x > y                             ..., WL      1
L, L                         x > y                             WL, ...      1
W, L / WL, L / W, WL         x > y                             no change    0
WL, WL                       consistent with assigned values   no change    0
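A minimal Python sketch of this adversary (our code, not the instructor's; the class name MinMaxAdversary and the fresh-value "ladders" are one arbitrary but consistent policy for choosing values):

    class MinMaxAdversary:
        def __init__(self, keys):
            self.status = {k: "N" for k in keys}   # N / W / L / WL, as in the table
            self.value = {}                        # values assigned so far
            self._hi, self._lo = 100, 100          # ladders for fresh high/low values
            self.units = 0                         # units of information given away

        def _fresh_hi(self):                       # larger than everything assigned
            self._hi += 10
            return self._hi

        def _fresh_lo(self):                       # smaller than everything assigned
            self._lo -= 10
            return self._lo

        def compare(self, x, y):
            """Answer a comparison query; return the key declared larger."""
            sx, sy = self.status[x], self.status[y]
            if sx == "N" and sy == "N":            # N,N -> W,L: gives away 2 units
                self.value[x], self.value[y] = self._fresh_hi(), self._fresh_lo()
                self.status[x], self.status[y] = "W", "L"
                self.units += 2
                return x
            if sx == "N":                          # mirror so any N key comes second
                return self.compare(y, x)
            if sy == "N":                          # W/L/WL vs N: gives away 1 unit
                self.units += 1
                if sx == "L":                      # L,N: the new key must win
                    self.value[y] = self._fresh_hi()
                    self.status[y] = "W"
                    return y
                self.value[y] = self._fresh_lo()   # W,N or WL,N: the new key loses
                self.status[y] = "L"
                return x
            if sx == "WL" and sy == "WL":          # WL,WL: agree with assigned values
                return x if self.value[x] > self.value[y] else y
            rank = {"W": 2, "WL": 1, "L": 0}       # stronger status wins; ties keep x
            win, lose = (x, y) if rank[sx] >= rank[sy] else (y, x)
            if sx == sy:                           # W,W or L,L: gives away 1 unit
                self.status[lose if sx == "W" else win] = "WL"
                self.units += 1
            # keep values consistent: a W key may be raised (it never lost),
            # an L key may be lowered (it never won)
            if self.value[win] <= self.value[lose]:
                if self.status[win] == "W":
                    self.value[win] = self._fresh_hi()
                else:
                    self.value[lose] = self._fresh_lo()
            return win

Replaying the eight comparisons of Table 5.3 below through this sketch gives away exactly 2n - 2 = 10 units of information and ends with x3 as the only key that never lost and x4 as the only key that never won (the concrete values differ from the table, since the value policy here is ours).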

Make sure that the input constructed is consistent with all the answers given.

Example: constructing an input using the adversary's rules (Table 5.3).

Comparison   x1        x2        x3        x4       x5       x6
Initial      N, *      N, *      N, *      N, *     N, *     N, *
x1, x2       W, 20     L, 10
x1, x5       W, 20                                  L, 5
x3, x4                           W, 15     L, 8
x3, x6                           W, 15                       L, 12
x3, x1       WL, 20              W, 25
x2, x4                 WL, 10              L, 8
x5, x6                                              WL, 5    L, 3
x6, x4                                     L, 2              WL, 3

Key values: x1 = 20, x2 = 10, x3 = 25, x4 = 2, x5 = 5, x6 = 3.
MAX = x3, MIN = x4.
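The requirement from the previous slide can be checked mechanically: the final values must agree with every answer the adversary gave. A quick self-contained Python check (our script; the names are hypothetical):

    # Replay the eight answers from Table 5.3 against the final values.
    values = {"x1": 20, "x2": 10, "x3": 25, "x4": 2, "x5": 5, "x6": 3}
    outcomes = [                               # (winner, loser), in the order asked
        ("x1", "x2"), ("x1", "x5"), ("x3", "x4"), ("x3", "x6"),
        ("x3", "x1"), ("x2", "x4"), ("x5", "x6"), ("x6", "x4"),
    ]
    for win, lose in outcomes:
        assert values[win] > values[lose]      # every answer holds for the final input
    assert max(values, key=values.get) == "x3" # MAX
    assert min(values, key=values.get) == "x4" # MIN
    print("all 8 answers are consistent with the final input")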

Explanation:
1. The first column shows a sequence of comparisons that might be carried out by some algorithm.
2. The remaining columns show the status and value assigned to each key by the adversary.
3. Each row after the first comparison contains only the entries relevant to the current comparison.
4. In the 5th comparison, when x3 and x1 are compared, the adversary increases the value of x3 because x3 is supposed to win.
5. In the 7th and 8th comparisons, it lowers the values of x6 and x4, consistent with its rules.
6. After the first five comparisons, every key except x3 has lost at least once, so x3 is the max.
7. After all the comparisons, x4 is the only key that has never won, so it is the min.

Result: in this example the algorithm did eight comparisons; the worst-case lower bound for six keys is (3/2)*6 - 2 = 7.

Proof of Theorem 5.1

We have to show that the adversary's rules force any algorithm to do at least 3n/2 - 2 comparisons to collect 2n - 2 units of information.

The algorithm gains two units of information only when it compares two keys that have not appeared in any previous comparison (status N, N); every other comparison yields at most one unit.

Suppose n is even: at most n/2 comparisons involve two unseen keys, so the algorithm gains at most n units of information from them.

Proof of Theorem 5.1 (cont.)

The algorithm still needs n - 2 additional units of information, so it must do at least n - 2 more comparisons. Thus, to collect 2n - 2 units, it must do at least n/2 + n - 2 = 3n/2 - 2 comparisons.

For odd n: at most (n - 1)/2 comparisons involve two unseen keys, and the same counting gives a total of at least 3n/2 - 3/2 comparisons.
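Both parities can be packed into one counting inequality (our restatement of the argument above). If the algorithm makes C comparisons, of which e compare two previously unseen keys, then e <= floor(n/2) and

    \text{units gained} \;\le\; 2e + (C - e) \;=\; C + e \;\le\; C + \lfloor n/2 \rfloor
    \quad\Longrightarrow\quad
    C \;\ge\; 2n - 2 - \lfloor n/2 \rfloor \;=\; \lceil 3n/2 \rceil - 2,

which is 3n/2 - 2 for even n and 3n/2 - 3/2 for odd n, matching the two cases above.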


Lower Bound for Largest and Second Largest

Theorem: Any algorithm that finds the second largest in a list of n keys by comparing keys must do at least n + ceil(log n) - 2 comparisons in the worst case.

If we can show that the max participates in at least ceil(log n) comparisons, we are done, because:
- Any algorithm that computes the second largest must also compute the largest. Why? To know that x is the second largest, it must know that x is not the largest, i.e., that x loses at least one comparison (in fact exactly one, since if x lost two or more comparisons it could not be the second largest). The unique key that x loses to must then be the largest.
- If only one key loses directly to the max, that key must be the second largest. But what if more than one key loses to the max?
- Then any algorithm must do at least n - 1 comparisons to establish the max (every other key must lose at least once), and it must also compare the direct "losers to the max" with each other to find the second largest; a matching algorithm is sketched below.
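This bound is achievable: a knockout tournament finds the max in n - 1 comparisons while the max meets only ceil(log n) keys, so finding the best of its direct victims costs at most ceil(log n) - 1 more. A minimal Python sketch (our code; the name second_largest is not from the slides):

    def second_largest(keys):
        """Return the second largest of >= 2 keys in n + ceil(log2 n) - 2 comparisons."""
        losers_to = {k: [] for k in keys}      # keys that lost directly to k
        alive = list(keys)
        while len(alive) > 1:                  # knockout tournament: n - 1 comparisons
            survivors = []
            for i in range(0, len(alive) - 1, 2):
                a, b = alive[i], alive[i + 1]
                win, lose = (a, b) if a > b else (b, a)
                losers_to[win].append(lose)
                survivors.append(win)
            if len(alive) % 2:                 # odd key out gets a bye this round
                survivors.append(alive[-1])
            alive = survivors
        champion = alive[0]
        # the second largest lost only to the champion, so it is the best of the
        # champion's <= ceil(log2 n) direct victims: <= ceil(log2 n) - 1 comparisons
        return max(losers_to[champion])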

Adversary: assign a weight w(x) to each key x. Initially, w(x) = 1 for all x.

The adversary modifies the weights according to the following strategy.

Case                 Adversary response                 New weights
w(x) > w(y)          x > y                              w(x) <- w(x) + w(y), w(y) <- 0
w(x) = w(y) > 0      x > y                              w(x) <- w(x) + w(y), w(y) <- 0
w(y) > w(x)          y > x                              w(y) <- w(x) + w(y), w(x) <- 0
w(x) = w(y) = 0      consistent with previous results   no change
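A minimal Python sketch of this weight adversary (our code; the class name WeightAdversary and the history-based tie-breaking for the zero-zero case are ours):

    class WeightAdversary:
        def __init__(self, keys):
            self.w = {k: 1 for k in keys}      # weights; they always sum to n
            self.history = []                  # (winner, loser) answers given so far

        def _forced(self, a, b):
            """True if earlier answers already force a > b (transitively)."""
            stack, seen = [a], set()
            while stack:
                u = stack.pop()
                if u == b:
                    return True
                if u not in seen:
                    seen.add(u)
                    stack.extend(lo for hi, lo in self.history if hi == u)
            return False

        def compare(self, x, y):
            wx, wy = self.w[x], self.w[y]
            if wx == 0 and wy == 0:
                # w(x) = w(y) = 0: any answer consistent with the past will do
                win, lose = (y, x) if self._forced(y, x) else (x, y)
            elif wx >= wy:                     # w(x) > w(y), or the tie w(x) = w(y) > 0
                self.w[x], self.w[y] = wx + wy, 0
                win, lose = x, y
            else:                              # w(y) > w(x)
                self.w[y], self.w[x] = wx + wy, 0
                win, lose = y, x
            self.history.append((win, lose))
            return win

Making the heavier key win is the design point: the adversary never lets a lighter key defeat a heavier one, which is exactly what guarantees w(y) <= w_(k-1) in the proof below.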

Observations

A key has lost a comparison iff its weight is zero; in particular, once a key has lost a comparison, its weight never becomes non-zero again.

Proof: When a key y loses a comparison, its weight becomes zero. From then on, in any comparison with a key x we have w(x) >= 0 = w(y), so the adversary never lets y win, and w(y) never changes.

Observations

The sum of the weights is always n: this holds initially and is preserved by every weight update.

When the algorithm stops, only one key can have a non-zero weight. Otherwise there would be at least two keys (say a and b) that never lost a comparison, and the adversary could give arbitrarily high and distinct values to them:
- If the algorithm declares a the largest and b the second largest, the adversary can choose the values the other way round, making the algorithm's choice of largest and second largest incorrect.

Claim: In any algorithm that computes the largest, the largest key must be compared with at least ceil(log n) other keys in the worst case.

Proof: Let x be the key that has non-zero weight when the algorithm stops. Then clearly x is the largest. We will show that x has directly won against at least ceil(log n) distinct keys.

Let w_k = w(x) be the weight of x after the k-th comparison that x wins against a previously undefeated key. We claim that w_k <= 2 w_(k-1):
- If x wins against previously defeated keys in between, its weight does not change (their weight is zero).
- So w_k = w_(k-1) + w(y), where y is the previously undefeated key.
- Clearly w(y) <= w_(k-1), for otherwise the adversary would have declared y the winner.
- Thus w_k <= 2 w_(k-1).

Let K be the number of comparisons x wins against previously undefeated keys. When the algorithm stops, x holds all the weight, so n = w_K <= 2^K w_0 = 2^K. Thus K >= log n, i.e., K >= ceil(log n).

Assignment 5: Generate a worst-case input for the largest and second largest problem using the adversary strategy discussed here.