
1
Ulam's Game and Universal Communications Using Feedback
Ofer Shayevitz, June 2006

2
Introduction to Ulam's Game
Are you familiar with this game?
How many yes/no questions are needed to separate 1000 objects?
M objects require log2(M) questions (rounded up).
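As a quick sanity check of the counting argument, here is a minimal sketch; the function name is mine, not from the talk:

```python
import math

def questions_needed(M):
    """Minimum number of yes/no questions that separate M objects
    when no lies are allowed: each answer halves the candidate set."""
    return math.ceil(math.log2(M))
```

For the slide's example, `questions_needed(1000)` gives 10, since 2^9 = 512 < 1000 <= 1024 = 2^10.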

3
What Happens When We Lie?
Separate two objects, one lie allowed: precisely three questions are required!
Separate M objects, one lie allowed: 2 log2(M) + 1 questions are sufficient. But we can do better…
It was shown [Pelc87] that the minimal number of questions is the least positive integer n satisfying 2^n >= M(n + 1).
M objects, L lies: very difficult!
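A short sketch of Pelc's condition. The exact form 2^n >= M(n + 1) is the classical sphere-packing condition for one lie as I recall it, not a quote from the slides, so treat it as an assumption; it does reproduce the "precisely three questions" answer for two objects:

```python
def pelc_questions(M):
    """Least n with 2**n >= M * (n + 1): the sphere-packing condition
    for one lie (exact form of the condition assumed, not from the talk)."""
    n = 1
    while 2 ** n < M * (n + 1):
        n += 1
    return n
```

For example, `pelc_questions(2)` returns 3, matching the slide, and `pelc_questions(1000)` returns 14, noticeably better than the naive 2*10 + 1 = 21 questions.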

4
Ulam's Game as a Problem of Reliable Communications
Alice (Transmitter), Bob (Receiver), Charlie (Adversary)
Forward channel from Alice to Bob; feedback channel from Bob back to Alice.

5
Communication Rate Defined
Alice transmits one of M possible messages; each yes/no answer carries 1 bit.
M messages correspond to log2(M) bits.
The channel can be used n times (seconds).
Charlie can lie a fraction p of the time: no more than np lies (errors).
Define the communication rate R = log2(M) / n.

6
Channel Capacity Defined
An (M, n) transmission scheme: an agreed procedure of questions and answers between Alice and Bob.
A reliable scheme: after n seconds the message is correctly decoded by Bob.
If for every n there is a reliable (M, n) scheme with rate R, we say R is achievable.
Capacity C(p): the maximal achievable rate.
C(0) = ?

7
Capacity Behavior
Claim: two messages can always be correctly decoded for p < ½.
Proof: the message is S in {1, 2}. Alice says Yes n times for S = 1 and No n times for S = 2.
How will Bob decode? Using a majority rule, which is always correct, since Charlie can flip fewer than n/2 answers.
Rate for two messages: R = log2(2) / n = 1/n, which tends to zero.
Corollary: one can transmit at rate zero for p < ½ (even without feedback…).
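The majority-rule claim can be checked exhaustively for a small n; a minimal sketch (names are mine):

```python
from itertools import combinations

def decode_majority(answers):
    """Bob's rule for the two-message scheme: message 1 if most answers
    are 'yes' (1), message 2 otherwise."""
    ones = sum(answers)
    return 1 if ones > len(answers) - ones else 2

# Exhaustive check for n = 7: with fewer than n/2 lies, every corruption
# of Alice's all-'yes' transmission (message 1) still decodes correctly.
n = 7
always_correct = all(
    decode_majority([0 if i in flipped else 1 for i in range(n)]) == 1
    for k in range((n - 1) // 2 + 1)
    for flipped in combinations(range(n), k)
)
```

With at most 3 of 7 answers flipped, at least 4 'yes' answers survive, so the majority is always right.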

8
Capacity Behavior
Claim: C(p) = 0 for p >= 1/3.
Proof: no reliable three-message scheme exists, so rate > 0 is not achievable.
Assume p = 1/3 and n = 3E + 1 seconds, so Charlie may lie at most E times.
The message is S in {1, 2, 3}.
General strategy: ask whether S = 1, 2 or 3.
Bob counts the negative votes against each possible message.
S collects a vote exactly when Charlie lies, so S ends with at most E votes.
Optimal decision: Bob chooses the message with the fewest votes (why?).
Success: only S has E (about n/3) votes or fewer (why?).

9
Capacity Behavior – Cont.
Charlie's strategy: cause two messages to end with E votes or fewer.
First, vote against a single message; once a message accumulates E + 1 votes it is out of the race.
If that never happens, all messages have E votes or fewer…
Then, always vote against the message with the fewest votes.
Result: Charlie always votes against only one competitive message.

10
Capacity Behavior – Cont.
Count the total number of votes against the competitive messages:
Before the 3rd message was knocked out, both competitive messages had no more than E votes each.
After that, they remain balanced and their sum cannot exceed 2E.
Conclusion: both messages end with no more than E votes each, so Bob cannot separate them! QED

11
Capacity Bounds [Berlekamp64]
The entropy function: H(p) = -p log2(p) - (1 - p) log2(1 - p).

12
Our Result

13
When the fraction of lies is unknown in advance, the classical capacity is zero. But we can get a positive rate!

14
Results Properties
No need to know the fraction of lies (errors) in advance.
Constructive – a specific transmission scheme is introduced.
Variable rate – better channel, higher rate.
Attains the optimal rate (not elaborated).
Penalty – negligible error probability, which goes to zero with increasing n.
Key idea – randomization to mislead Charlie.

15
Taking a Hard Turn…

16
Message Point Representation
A message is a bit-stream b1, b2, b3, …
It can also be represented by a point. Start with the unit interval [0, 1).
If b1 = 0 take [0, ½); otherwise take [½, 1).
Assume b1 = 0: if b2 = 0 take [0, ¼); otherwise take [¼, ½).
The finite bit-stream b1, b2, …, bk is represented by a binary interval of length 2^-k.
The infinite bit-stream is represented by a message point ω = 0.b1b2b3…
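The interval construction above can be sketched directly (the function name is mine):

```python
def binary_interval(bits):
    """Binary interval [l, l + 2**-k) encoding the finite bit-stream b1..bk."""
    left, width = 0.0, 1.0
    for b in bits:
        width /= 2          # each bit halves the current interval
        if b == 1:
            left += width   # b = 1 keeps the right half
    return left, left + width
```

For instance, the bit-stream 1, 0, 1 maps to the interval [0.625, 0.75) of length 2^-3, i.e. the point 0.101 in binary is 0.625.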

17
Transmission of a Message Point
First assume no lies (errors).
The message point can be any point in [0, 1).
If ω < ½, Alice transmits a zero; otherwise she transmits a one.
Say Bob now knows ω resides in [0, ½).
If ω is in [0, ¼), Alice transmits another zero; if ω is in [¼, ½), she transmits a one.
In fact, Alice simply transmits the message bits…
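The noiseless transmission rule is just the binary expansion of ω; a minimal sketch (names mine):

```python
def message_bits(omega, k):
    """First k transmitted bits for message point omega in [0,1): at each
    step Alice reports which half of the current interval contains omega."""
    low, high, out = 0.0, 1.0, []
    for _ in range(k):
        mid = (low + high) / 2
        if omega < mid:
            out.append(0)
            high = mid
        else:
            out.append(1)
            low = mid
    return out
```

So for ω = 0.625 the first three transmitted bits are 1, 0, 1, exactly the bits of its binary expansion.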

18
Now with Lies…
Let p be the precise fraction of lies.
Assumption I: we know p (and also p < ½).
If ω < ½, Alice transmits a zero; otherwise a one.
Bob thinks ω is more likely to be in [0, ½), but [½, 1) is also possible…
How can that notion be quantified?
What should Alice transmit next?

19
Message Point Density
We define a density function over the unit interval.
The density function describes our level of confidence (at time k) in the various possible message point positions.
We require it to integrate to one for all k.
Alice steers Bob in the direction of ω; Bob gradually zooms in on ω.
Based on a scheme for a different setting by [Horstein63].

20
Start with a uniform density; a0 is its median point.

21
The density given the first received bit; a1 is its median point.

22
The density given the two received bits; a2 is its median point.

23
The density given the three received bits; a3 is its median point.

24
Hopefully, after a long time, the density concentrates around the message point ω…
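A minimal sketch of the median-splitting scheme of the last few slides, assuming p is known (Assumption I) and, for clarity, a run in which Charlie happens not to lie; all names here are mine, not from the talk:

```python
def median_point(pieces):
    """Point splitting a piecewise-constant density (list of (left, right,
    density) triples covering [0,1)) into two halves of mass 1/2."""
    acc = 0.0
    for left, right, dens in pieces:
        mass = (right - left) * dens
        if acc + mass >= 0.5:
            return left + (0.5 - acc) / dens
        acc += mass
    return pieces[-1][1]

def bayes_update(pieces, a, bit, p):
    """Bob's update: boost the side of the median a that the received bit
    points to by 2(1-p), shrink the other side by 2p (total mass stays 1)."""
    out = []
    for left, right, dens in pieces:
        for lo, hi in ((left, min(right, a)), (max(left, a), right)):
            if hi > lo:
                boosted = (lo >= a) == (bit == 1)
                out.append((lo, hi, dens * (2 * (1 - p) if boosted else 2 * p)))
    return out

omega, p = 0.6180339887, 0.1     # message point and (assumed known) lie fraction
pieces = [(0.0, 1.0, 1.0)]       # start with a uniform density
for _ in range(40):
    a = median_point(pieces)
    bit = 1 if omega >= a else 0          # Alice steers Bob toward omega
    pieces = bayes_update(pieces, a, bit, p)   # noiseless run: bit arrives intact
```

After 40 steps the median of the posterior essentially pins down ω, illustrating how Bob "gradually zooms in".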

25
Things to Be Noted…
After k iterations there are k + 1 intervals, within each of which the density is constant.
ω lies in one of them, the message interval.
The density on the message interval is multiplied by 2p if an error occurred at time k, and by 2(1 - p) otherwise.
There are exactly np errors, therefore the final density on the message interval is (2p)^(np) * (2(1 - p))^(n(1 - p)) = 2^(n(1 - H(p))).

26
Another Assumption
We assumed we know p (Assumption I).
Assumption II: Bob knows the message interval when transmission ends…
These assumptions will be removed later.
If the message interval size is 2^-L, then its probability mass is 2^-L * 2^(n(1 - H(p))) <= 1.

27
Transmission Rate
The message interval has size 2^-L, so L bits can be decoded.
The bit rate is therefore at least L/n >= 1 - H(p), as required.
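The rate claim can be written out as one short derivation, reconstructed from the quantities named on the last few slides (so the exact bookkeeping is my reading of the deck, not a quote):

```latex
f_n(\text{message interval})
  = (2p)^{np}\,\bigl(2(1-p)\bigr)^{n(1-p)}
  = 2^{\,n(1-H(p))},
\qquad H(p) = -p\log_2 p - (1-p)\log_2(1-p).
```

Since the density integrates to one, the message interval of length 2^-L has mass 2^-L * 2^(n(1-H(p))) <= 1, hence L >= n(1 - H(p)) and R = L/n >= 1 - H(p).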

28
Assumption I – Removed
p is unknown. But, via the feedback, Alice knows the noise sequence, hence knows p at the end!
Idea: use an estimate of p based on what Alice has observed so far.
Define a noise sequence: z_i = 1 if Charlie lied at time i, and z_i = 0 otherwise.
A reasonable estimate is the noise sequence's empirical probability: the fraction of ones seen so far.
A bias is needed for uniform convergence.

29
This probability estimate is the KT estimate [KrichevskyTrofimov81].
Using the KT estimate, the preceding analysis goes through with p replaced by its current estimate.
By the properties of the KT estimate, the resulting rate still tends to 1 - H(p).
So asymptotically, we lose nothing!
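The KT estimator itself is one line; this sketch uses the standard add-1/2 form of Krichevsky and Trofimov (the function name is mine):

```python
def kt_estimate(noise_bits):
    """Krichevsky-Trofimov (add-1/2) estimate of the lie probability given
    the noise sequence so far; the 1/2 is the bias the slide mentions as
    needed for uniform convergence, and it keeps the estimate in (0, 1)."""
    return (sum(noise_bits) + 0.5) / (len(noise_bits) + 1)
```

Before any observation the estimate is ½, and it never reaches 0 or 1 exactly, which is what makes the per-step density multipliers 2p and 2(1-p) well behaved.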

30
Assumption I* – Added…
We made an absurd assumption here. Did you notice?
Bob (the receiver) must know the estimate as well!
That is equivalent to knowing the noise sequence…
Assumption I*: the estimate is updated only once per B seconds (still needs explaining…).
B = B(n) is called the block size and may depend on n.
It can be shown that the resulting rate loss vanishes provided the block size is chosen suitably as a function of n.

31
Update Information (UI)
Consider one block of B seconds.
UI elements:
The number of ones in the noise sequence in the last block: B + 1 options, about log2(B) bits.
The current message interval: at most n + 1 options, about log2(n) bits.
Alice must provide Bob with the UI once per block.
The UI is thus only a logarithmic number of bits per B seconds.
Therefore, the UI rate tends to zero (key point!!)
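The "key point" is just that a logarithmic cost per block has vanishing rate; a tiny numeric sketch (the specific block sizes are illustrative):

```python
import math

def ui_rate(B):
    """Rate of update information that costs on the order of log2(B)
    bits per block of B channel uses."""
    return math.log2(B) / B

# The UI rate shrinks as the block size grows.
rates = [ui_rate(B) for B in (8, 64, 512, 4096)]
```

Already at B = 4096 the UI consumes under 0.3% of the channel uses.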

32
If Alice can reliably convey the UI to Bob, then we are done!

33
Reliable UI – Is That Possible?
Old problem: Charlie may corrupt the UI…
Is this different from the original problem? Yes: the UI rate approaches zero!
Remember, rate zero can be attained for p < ½!
Solution outline:
Random positions within each block are agreed upon via feedback.
Bob estimates whether p is below or above ½ in each block: Alice transmits all zeros over random positions, and Bob finds the fraction of ones received.
Alice transmits the UI over random positions in each block, repeating each UI bit several times.
Bob decodes each bit by a majority/minority rule.
Bad blocks (p close to ½) are thrown away.
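The majority/minority rule can be sketched as follows; the decoder name and the threshold of exactly ½ are my simplifications of the outline above:

```python
def decode_ui_bit(received, p_hat):
    """Decode one UI bit sent as repeated copies over the agreed random
    positions: majority rule when Bob's per-block estimate p_hat says
    Charlie lies less than half the time, minority rule when he lies
    more than half the time (blocks with p_hat near 1/2 are discarded)."""
    ones = sum(received)
    majority = 1 if 2 * ones > len(received) else 0
    return majority if p_hat < 0.5 else 1 - majority
```

For example, if Alice repeats a 1 five times and Charlie flips four copies, the received word is mostly zeros, yet a minority rule with p_hat = 0.8 still recovers the 1.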

34
Reliable UI – Cont.
Penalty: a bad estimate causes an error!
It can be shown that the error probability tends to zero.
Throwing away bad blocks makes the rate random.
The probability of throwing away a good block is small.
A rate approaching 1 - H(p) is attained with high probability.

35
Summary
Ulam's game introduced.
Analogy to communications with an adversary and feedback.
Classical results presented.
Can do much better with randomization!
Higher rate; rate adaptive to channel (Charlie) behavior.
Penalty – vanishing error probability.

36
Further Results
Much higher rates are possible using structure in the noise sequence (Charlie's strategy).
Example: assume Charlie lies and tells the truth alternately. The empirical fraction of lies is then ½, so our scheme attains rate zero. But Alice can notice this stupid strategy! Alice can lie on purpose to cancel Charlie's lies.
Related to universal prediction and universal compression (Lempel–Ziv) of individual sequences.
Generalizations to multiple-choice questions.

37
Thank You!
