
1 Rateless Coding with Feedback. Andrew Hagedorn, Sachin Agarwal, David Starobinski, and Ari Trachtenberg. Department of ECE, Boston University, MA, USA. IEEE INFOCOM 2009.

2 Outline: 1. Introduction, 2. Problem Definition, 3. Shifted LT (SLT) Codes, 4. Experimental Results, 5. Conclusion.

3-8 Partial Information: Transmission Channel with Erasures (animated figure: a transmitter sends input symbols over an erasure channel; the receiver obtains only a subset of them as received symbols).

9 Partial Information: multiple receivers may have different erasures (figure: a transmitter broadcasting to Receivers 1, 2, and 3). Given that multiple receivers hold different partial information, how can all of them be updated to full information efficiently over a broadcast channel?

10 Partial Information: multiple mobile devices may have outdated information, e.g. (a) mobile databases, (b) sensor network information aggregation, (c) RSS updates for devices (figure: a broadcaster pushes the latest version of the information to Mobile devices 1, 2, and 3).

11 Problem Definition: Given an encoding host with k input symbols and a decoding host with n out of the k input symbols, the goal is to efficiently determine the remaining k - n input symbols at the decoding host. The encoding host has no information about which k - n input symbols are missing at the decoding host, and different decoding hosts may be missing different input symbols. Efficiency has two aspects: (1) communication complexity: the information transmitted from the encoding host to the decoding host should be close in size to the missing k - n input symbols; (2) computational complexity: the algorithm must be computationally tractable.

12 Contribution of this paper: show that a small amount of feedback, whereby receivers periodically inform the broadcasting source about the number of successfully decoded input packets, can lead to major communication, memory, and energy usage gains through a judicious modification of the encoding procedure.

13 Rateless Codes - Encoding: used for content distribution over error-prone channels. Edges are chosen at random according to a probability density function. From the k input symbols (A, B, C in the figure) the encoder produces at least k encoded symbols, e.g. E1 = A+B, E2 = B, E3 = A+B+C, E4 = A+C.
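As a rough illustration of the encoding step (our own sketch, not the authors' code), a minimal LT-style encoder in Python; the byte-valued symbols and the uniform degree sampler are placeholders standing in for the robust soliton distribution introduced later.

```python
import random

def lt_encode_symbol(input_symbols, sample_degree):
    """Produce one LT-encoded symbol: draw a degree d, pick d distinct input
    symbols uniformly at random (the random edges), and XOR them together.
    Returns the encoded value together with its neighbor indices."""
    k = len(input_symbols)
    d = max(1, min(sample_degree(), k))      # degree drawn from the code's distribution
    neighbors = random.sample(range(k), d)   # random choice of edges
    value = 0
    for i in neighbors:
        value ^= input_symbols[i]            # XOR the chosen input symbols
    return value, neighbors

# Toy usage mirroring the slide: input symbols A, B, C (as byte values) and a
# placeholder uniform degree sampler.
symbols = [0x41, 0x42, 0x43]
encoded = [lt_encode_symbol(symbols, lambda: random.randint(1, 3)) for _ in range(4)]
```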

14 Rateless Codes - Decoding: used for content distribution over error-prone channels. The received encoded symbols (E1 = A+B, E2 = B, E3 = A+B+C, E4 = A+C) form a system of linear equations over the k input symbols, which is solved by Gaussian elimination or belief propagation. Irrespective of which encoded symbols are lost in the communication channel, as long as sufficiently many encoded symbols are received, decoding recovers all k input symbols.
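For the belief-propagation option, a small peeling decoder in the same toy setting (again our own sketch, not the paper's implementation); the optional `known` argument anticipates the partial-information case discussed later.

```python
def bp_decode(k, encoded, known=None):
    """Peeling (belief propagation) decoder sketch.
    `encoded` is a list of (value, neighbor_indices) pairs produced by an
    LT-style encoder; `known` optionally maps already-known input indices to
    their values. Returns a dict of recovered input symbols (may be partial)."""
    recovered = dict(known or {})
    work = [(v, set(nb)) for v, nb in encoded]
    progress = True
    while progress:
        progress = False
        for idx, (v, nb) in enumerate(work):
            # Strip out neighbors that are already recovered.
            for j in list(nb):
                if j in recovered:
                    v ^= recovered[j]
                    nb.discard(j)
            work[idx] = (v, nb)
            if len(nb) == 1:                 # degree-1 symbol: release it
                j = next(iter(nb))
                if j not in recovered:
                    recovered[j] = v
                    progress = True
                nb.clear()
    return recovered
```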

15 Decoding Using Belief Propagation: the decoding host collects k + ε encoded symbols and decodes them into the k input symbols; some of the received encoded symbols are redundant.

16 Digital Fountain Codes: LT Codes. (1) A class of rateless erasure codes invented by Michael Luby [1]. (2) Computationally practical (as compared to random linear codes). (3) Fast decoding algorithm based on belief propagation instead of Gaussian elimination. (4) Form the outer code for Raptor codes [2], which have linear decoding computational complexity. (5) Designed for the case when no input symbols are available at the decoding host initially. Asymptotic properties (assuming a constant probability of decoding failure δ): the expected number of encoded symbols required for successful decoding is k + O(√k ln²(k/δ)), and the expected decoding computational complexity is O(k ln(k/δ)) symbol operations, where k is the number of input symbols. [1] Michael Luby, "LT codes", in The 43rd Annual IEEE Symposium on Foundations of Computer Science, 2002, pp. 271-282. [2] Amin Shokrollahi, "Raptor codes", IEEE Transactions on Information Theory, vol. 52, no. 6, 2006, pp. 2551-2567.

17 Digital Fountain Codes: LT Codes' Robust Soliton Probability Distribution. Under the robust soliton distribution μ_k, the probability of an encoded symbol having degree d is μ_k(d). The distribution has the property of releasing degree-1 symbols at a controlled, near-constant rate throughout the decoding process. (Figure: LT code distribution, k = 1000, c = 0.01, δ = 0.5.)
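The robust soliton distribution itself is standard (Luby 2002); a direct Python transcription follows, with the slide's c and δ as defaults. The helper name is ours.

```python
import math

def robust_soliton(k, c=0.01, delta=0.5):
    """Robust soliton distribution mu_k(d), d = 1..k, returned as a list."""
    R = c * math.log(k / delta) * math.sqrt(k)
    # Ideal soliton component rho(d)
    rho = [0.0] * (k + 1)
    rho[1] = 1.0 / k
    for d in range(2, k + 1):
        rho[d] = 1.0 / (d * (d - 1))
    # Tail-and-spike component tau(d)
    tau = [0.0] * (k + 1)
    spike = max(1, int(round(k / R)))
    for d in range(1, min(spike, k + 1)):
        tau[d] = R / (d * k)
    if spike <= k:
        tau[spike] = R * math.log(R / delta) / k
    beta = sum(rho[d] + tau[d] for d in range(1, k + 1))
    return [(rho[d] + tau[d]) / beta for d in range(1, k + 1)]  # mu_k(1..k)
```

A degree can then be drawn with `random.choices(range(1, k + 1), weights=robust_soliton(k))[0]` and plugged into the encoder sketch above.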

18 Real-Time Oblivious Erasure Correcting. Amos Beimel, Shlomi Dolev, and Noam Singer, IEEE Information Theory Workshop 2004, San Antonio, Texas. [3] Amos Beimel, Shlomi Dolev, and Noam Singer, "RT oblivious erasure correcting", IEEE/ACM Trans. Netw., vol. 15, no. 6, pp. 1321-1332, 2007.

19 Traditional Erasure Codes: the sender encodes a k-symbol message into n > k symbols and transmits them over a channel with erasures; the receiver decodes the original k-symbol message from any set of at least k received symbols. Rateless codes: n can be ∞.

20 Motivation. Problem: channels with high loss rates, expensive feedback channels, weak receiving devices. Current solutions: ARQ requires large amounts of feedback; erasure codes need only a single feedback message but have higher encoding/decoding complexity. Our goal: combine their benefits.

21 Real-Time Codes. Complexity goals: fast symbol generation, efficient message decoding, and decoding effort balanced over the entire transmission. Decoding rate: the rate at which symbols are decoded.

22 Protocol Description. Encoder: calculate the degree d from the latest feedback, randomly pick d input symbols, XOR them, and transmit the encoded symbol. Decoder: check whether exactly 1 of the chosen symbols is still missing; if so, decode the missing symbol, otherwise dump the encoded symbol. Feedback: transmit the number r of decoded symbols back to the encoder. (Figure shows successive rounds with d = 3, r = 4, d = 4.)
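A sketch of these protocol steps in Python; the degree rule used here (roughly k/(k - r)) is a stand-in of ours and not the exact schedule of Beimel et al.

```python
import random

def rt_encode(input_symbols, r):
    """One RT-style encoding round: choose a degree from the last reported
    count r of decoded symbols, pick that many input symbols, XOR them."""
    k = len(input_symbols)
    d = max(1, min(k, round(k / max(1, k - r))))   # assumed degree rule, not the paper's
    neighbors = random.sample(range(k), d)
    value = 0
    for i in neighbors:
        value ^= input_symbols[i]
    return value, neighbors

def rt_decode_step(value, neighbors, recovered):
    """Decoder side: if exactly one neighbor is still unknown, decode it;
    otherwise discard the encoded symbol (no buffering, as on the slide)."""
    unknown = [j for j in neighbors if j not in recovered]
    if len(unknown) == 1:
        v = value
        for j in neighbors:
            if j in recovered:
                v ^= recovered[j]
        recovered[unknown[0]] = v
        return True      # a feedback message would now report r = len(recovered)
    return False
```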

23 Conclusions of RT Codes: a combined approach between ARQ and erasure codes, with low memory overhead and low feedback (O(√k) messages).

24 Inefficiency of LT Codes for our Problem: the decoding host collects k + ε encoded symbols and decodes them, but since n out of the k input symbols are already known a priori at the decoding host, many of the encoded symbols are redundant.

25 Inefficiency of LT Codes for our Problem. The number of these redundant encoded symbols grows with the ratio of input symbols known at the decoder (n) to the total number of input symbols (k). If n input symbols are known a priori, then an additional LT-encoded symbol provides no new information to the decoding host with a probability that quickly approaches 1 as n → k.
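This effect is easy to see numerically. The sketch below (our own illustration, not the paper's derivation) estimates by simulation the chance that a freshly encoded symbol touches only input symbols the decoder already knows, which makes it redundant.

```python
import random

def redundancy_probability(k, n, degree_sampler, trials=100_000):
    """Monte-Carlo estimate of the probability that a new encoded symbol is
    composed entirely of input symbols already known at the decoder, when
    n of the k input symbols are known a priori."""
    known = set(range(n))        # by symmetry, which n symbols are known is arbitrary
    redundant = 0
    for _ in range(trials):
        d = min(degree_sampler(), k)
        neighbors = random.sample(range(k), d)
        if all(j in known for j in neighbors):
            redundant += 1
    return redundant / trials
```

With degrees drawn from the robust soliton sketched earlier and n close to k, the estimate tends toward 1, matching the claim above.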

26 Intuitive Fix. The n known input symbols serve the function of degree-1 encoded symbols, disproportionately skewing the degree distribution for LT encoding. We therefore propose to shift the Robust Soliton distribution to the right in order to compensate for these additional, functionally degree-1 symbols. Questions: (1) How? (2) By how much?

27 Shifted Code Construction. Definition: the shifted robust soliton distribution μ_{k,n} assigns to degree i = round(j · k/(k - n)) the probability μ_{k-n}(j), for j = 1, ..., k - n, and probability 0 to all other degrees, where k is the number of input symbols in the system, n is the number of input symbols already known at the decoder, and round(·) rounds to the nearest integer. Intuition: the n known input symbols at the decoding host reduce the degree of each encoding symbol by an expected fraction n/k, so each target degree is scaled up by k/(k - n) before encoding.
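Read literally, that definition can be transcribed as follows, reusing the `robust_soliton` helper from the earlier sketch; treat this as our reconstruction of the construction, not reference code from the paper.

```python
def shifted_robust_soliton(k, n, c=0.01, delta=0.5):
    """Shifted robust soliton sketch: take mu_{k-n} over the k - n unknown
    input symbols and move each degree j up to round(j * k / (k - n))."""
    base = robust_soliton(k - n, c, delta)        # mu_{k-n}(1..k-n), defined earlier
    shifted = [0.0] * (k + 1)
    for j, p in enumerate(base, start=1):
        i = min(int(round(j * k / (k - n))), k)   # stretched (shifted) degree
        shifted[i] += p
    return shifted[1:]                             # mu_{k,n}(1..k)
```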

28 Shifted Code Distribution. (Figure: LT code distribution and the proposed Shifted code distribution, with parameters k = 1000, c = 0.01, δ = 0.5; the number of input symbols known at the decoding host is set to n = 900 for the Shifted code distribution.) Under the shifted code distribution, encoded symbols of certain degrees occur with probability 0.

29 Shifted Code - Communication Complexity. Lemma IV.2: a decoder that knows n of the k input symbols needs (k - n) + O(√(k - n) · ln²((k - n)/δ)) encoding symbols under the shifted distribution to decode all k input symbols with probability at least 1 - δ. Proof sketch: once the n known input symbols are removed from the decoding graph, the encoded symbols involve only the k - n unknown input symbols, and the expression follows from Luby's analysis.
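Dropping the constants hidden by the O(·) notation, the following rough comparison shows how the leading-order symbol counts scale for plain LT (over all k symbols) versus the shifted code (over only the k - n unknowns); the numbers are indicative of scaling only, not exact values from the paper.

```python
import math

def lt_symbols_leading_order(m, delta=0.5):
    # Leading-order count m + sqrt(m) * ln(m / delta)^2; O(.) constants dropped.
    return m + math.sqrt(m) * math.log(m / delta) ** 2

k, n, delta = 1000, 900, 0.5
full_lt = lt_symbols_leading_order(k, delta)         # decode all k from scratch
shifted_lt = lt_symbols_leading_order(k - n, delta)  # Lemma IV.2: only k - n unknowns matter
print(full_lt, shifted_lt)
```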

30 Shifted Code - Average Degree of an Encoded Symbol. Lemma IV.3: the average degree of an encoding node under the μ_{k,n} distribution is O((k/(k - n)) · ln((k - n)/δ)). Proof sketch: this follows from the definitions, since a node with degree d in the μ_{k-n} distribution corresponds to a node with degree roughly d · k/(k - n) in the shifted code distribution, and, from Luby's analysis, the average degree of an LT encoded symbol over m input symbols is O(ln(m/δ)).

31 Shifted Codes - Computational Complexity. Lemma IV.4: for a fixed δ, the expected number of edges E removed from the decoding graph upon knowledge of n input symbols at the decoding host is E = O(n ln(k - n)). Theorem IV.5: for a fixed probability of decoding failure δ, the number of operations needed to decode using a shifted LT code (SLT) is O(k ln(k - n)). Proofs are described in: S. Agarwal, A. Hagedorn and A. Trachtenberg, "Rateless Codes Under Partial Information", Information Theory and Applications Workshop, UCSD, San Diego, 2008.

32 Heuristics for practical implementation. 1) Non-uniform restriction on feedback. In practice, most input symbols are decoded only after n surpasses a threshold n_NU = αk, 0 ≤ α ≤ 1. A feedback message containing the most recent value of n is sent only when the average degree of an encoded symbol has changed by a fixed amount since the previous feedback. When n < n_NU, the average degree of an encoding symbol increases only slowly (by the amount given in equation (4) of the paper).

33 Heuristics for practical implementation (continued). We limit the feedback to every time the average degree changes by √(log k) from its value at the previous feedback, leading to approximately (1/2)√k feedbacks (obtained by dividing (4) by √(log k)). When n ≥ n_NU, the heuristic sends at most √k additional feedbacks, one each time the degree changes by at least √(log k). This heuristic therefore sends O(√k) feedbacks as n increases from 0 to k, matching the feedback of RT codes. A sketch of this rule appears below.
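The non-uniform rule can be sketched without knowing equation (4) explicitly, by recomputing the average degree directly from the `shifted_robust_soliton` helper above each time n changes; the class below is our reconstruction of the heuristic (natural log assumed for √(log k)), not the paper's implementation.

```python
import math

class NonUniformFeedback:
    """Non-uniform restriction sketch: feed back the current n whenever the
    average encoding degree has changed by at least sqrt(log k) since the
    previous feedback."""
    def __init__(self, k, c=0.01, delta=0.5):
        self.k, self.c, self.delta = k, c, delta
        self.threshold = math.sqrt(math.log(k))      # natural log assumed here
        self.last_avg = self._avg_degree(0)

    def _avg_degree(self, n):
        dist = shifted_robust_soliton(self.k, n, self.c, self.delta)
        return sum(d * p for d, p in enumerate(dist, start=1))

    def maybe_feedback(self, n_decoded):
        avg = self._avg_degree(min(n_decoded, self.k - 1))   # keep k - n >= 1
        if abs(avg - self.last_avg) >= self.threshold:
            self.last_avg = avg
            return n_decoded      # value of n to report to the encoder
        return None               # stay silent this round
```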

34 Heuristics for practical implementation. 2) Uniform restriction on feedback. The current value of n is communicated back to the encoder every time n increases by √k, resulting in √k feedbacks as n increases from 0 to k (see the sketch below). This heuristic has the advantage of not congesting the feedback channel toward the end of decoding, unlike RT codes and the non-uniform restriction on feedback.
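The uniform restriction is simple enough to state directly in code; this sketch (names ours) assumes the decoder checks the rule once per received packet.

```python
import math

class UniformFeedback:
    """Uniform restriction on feedback: report the current n each time it has
    grown by sqrt(k) since the last report, so about sqrt(k) messages are sent
    as n goes from 0 to k."""
    def __init__(self, k):
        self.step = math.sqrt(k)
        self.last_reported = 0

    def maybe_feedback(self, n_decoded):
        if n_decoded - self.last_reported >= self.step:
            self.last_reported = n_decoded
            return n_decoded      # value of n to send back to the encoder
        return None               # stay silent this round
```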

35 Fig. 1. Feedback strategies for uniform and non-uniform restrictions on Shifted LT and RT codes. Each circle qualitatively corresponds to a situation in which the current value of n is fed back to the encoder.

36 Simulation Results. Parameters: c = 0.9 and δ = 0.1. In each round of the simulation an encoded packet is generated and transmitted, and decoding is attempted at the decoder on the received packet (as well as on any packets stored in memory). If an input symbol is recovered, then feedback is sent as dictated by each code.
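A round-based loop in the spirit of this description, wiring together the earlier `rt_encode`/`rt_decode_step` and `UniformFeedback` sketches; the structure and names are our own scaffolding, not the paper's simulator or mote setup.

```python
import random

def simulate_rounds(input_symbols, encoder, decoder_step, feedback_policy,
                    loss_rate=0.05, max_rounds=100_000):
    """Each round: generate one encoded packet, possibly erase it on the
    forward channel, attempt decoding, and send feedback when allowed."""
    k = len(input_symbols)
    recovered = {}
    r_reported = 0                 # last value of n/r the encoder heard via feedback
    sent = 0
    for _ in range(max_rounds):
        value, neighbors = encoder(input_symbols, r_reported)
        sent += 1
        if random.random() < loss_rate:     # erasure channel drops the packet
            continue
        decoder_step(value, neighbors, recovered)
        fb = feedback_policy.maybe_feedback(len(recovered))
        if fb is not None:
            r_reported = fb                 # feedback updates the encoder's view
        if len(recovered) == k:
            break
    return sent

# Example: symbols = list(range(100)); simulate_rounds(symbols, rt_encode,
# rt_decode_step, UniformFeedback(len(symbols)))
```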

37 Simulation Results. For k = 500, Shifted LT codes require on average 59% less redundancy than RT codes and 21% less redundancy than LT codes (averaged over 100 trials). The feedback channel communication complexity of Shifted LT codes is greater than that of either RT codes or LT codes: while RT codes are limited by the changes in their degree and LT codes transmit no feedback, the Shifted LT code transmits feedback every time it recovers one or more input symbols.

38 Memory usage. Fig. 2. Memory usage at the decoder as a function of the number of transmitted symbols.

39 Number of encoded symbols required. Fig. 3. Number of encoded symbols required to disseminate all k input symbols.

40 Number of feedback messages sent. Fig. 4. The number of feedback messages sent for the different codes for an increasing number of input symbols k. The "Shifted LT - no restriction" variant transmits too many (O(k)) feedbacks and has been left out of this figure.

41 Number of encoded symbols needed. Fig. 5. The number of encoded symbols needed to decode 100 input symbols, as a function of the feedback channel rate.

42 Number of encoded symbols needed. Fig. 6. The number of encoded symbols needed to decode 100 input symbols, as a function of the feedback channel loss rate. The forward channel loss rate is fixed at 5%.

43 Fig. 7. The number of encoded symbols needed to decode 100 input symbols at 50 receiving nodes, for various forward channel packet loss probabilities.

44 Computational load on the motes. Fig. 8. The amount of time required to decode a randomly chosen encoded packet, as a function of the number of encoded symbols already transmitted. Setup: 2 TelosB motes, with one mote serving a single page (consisting of multiple packets) to the other mote.

45 Total number of packets transmitted. Fig. 9. The total number of packets transmitted on the forward and feedback channels in order to disseminate a 5-page program to 10 motes using variants of the Deluge over-the-air programming protocol. Setup: 11 motes, one of which broadcasts the 5 pages in its memory (11.5 KB in total) to the 10 other motes. All feedback channels from the 10 motes to the broadcaster were set to a 5% packet loss rate, and the forward channel loss rates were varied from 0% to 9%.

46 Total energy used. Fig. 10. Total energy used by all the motes for communication and decoding during the dissemination of a 5-page program using a variant of the Deluge over-the-air programming protocol.

47 Conclusion. Shifted LT codes provide an easily implemented improvement over existing rateless codes (LT codes and RT codes). The corresponding improvements in communication complexity, energy usage, and, in certain cases, memory requirements are even starker in a broadcast setting.

48 References
[3] Amos Beimel, Shlomi Dolev, and Noam Singer, "RT oblivious erasure correcting", IEEE/ACM Trans. Netw., vol. 15, no. 6, pp. 1321-1332, 2007.
[4] J. W. Hui and D. Culler, "The dynamic behavior of a data dissemination protocol for network programming at scale", in SenSys '04, Baltimore, Maryland, USA, Nov. 2004.
[10] A. Hagedorn, D. Starobinski, and A. Trachtenberg, "Rateless Deluge: over-the-air programming of wireless sensor networks using random linear codes", in IPSN '08: Proceedings of the 7th International Conference on Information Processing in Sensor Networks, 2008.
[11] M. Rossi, G. Zanca, L. Stabellini, R. Crepaldi, A. F. Harris, and M. Zorzi, "SYNAPSE: a network reprogramming protocol for wireless sensor networks using fountain codes", in SECON '08: Proceedings of the IEEE Conference on Sensor, Mesh and Ad Hoc Communications and Networks, 2008.
[13] S. Kokalj-Filipovic, P. Spasojevic, E. Soljanin, and R. Yates, "ARQ with doped fountain decoding", in ISSSTA '08: International Symposium on Spread Spectrum Techniques and Applications, 2008.
[14] S. Agarwal, A. Hagedorn, and A. Trachtenberg, "Rateless codes under partial information", in ITA '08: Information Theory and Applications Workshop, 2008.
[17] Phil Levis, "TOSSIM: accurate and scalable simulation of entire TinyOS applications", in SenSys '03: Proceedings of the First ACM Conference on Embedded Networked Sensor Systems, 2003.
Weiyao Xiao, Sachin Agarwal, David Starobinski, and Ari Trachtenberg, "Reliable wireless broadcasting with near-zero feedback", IEEE INFOCOM 2010, pp. 2543-2551.

