
1 LT-AF Codes: LT Codes with Alternating Feedback. Ali Talari and Nazanin Rahnavard, Oklahoma State University. IEEE ISIT (International Symposium on Information Theory) 2013.

2 Outline: Introduction; LT-AF codes; Simulation results; Conclusions.

3 Introduction. Conventional rateless codes require only a single feedback, issued by the decoder to inform the encoder of a successful LT decoding; in LT codes, the available feedback channel therefore remains underutilized during the transmission. As the data-block length decreases, the performance of LT codes significantly deteriorates. Existing work has proposed to employ feedback to inform the encoder of: the number of successfully decoded input symbols; a suitable input symbol to be sent to enhance the decoding; or the indices of some recovered input symbols.

4 Introduction. We propose LT codes with alternating feedback (LT-AF codes), which considerably improve the performance of LT codes at short block lengths when the belief-propagation decoder is in use. The decoder alternately issues two types of feedback, based on the dependencies of the still-undecoded received output symbols and on the number of decoded input symbols. In contrast to existing work, we design LT-AF codes under a realistic feedback-channel assumption, where the feedback channel can have an unknown or varying erasure rate ε_fb ∈ [0, 1).

5 LT-AF codes. γ_succ is the coding overhead required for successful decoding with high probability; that is, γ_succ × k coded symbols are enough to decode the k input symbols with high probability. γ is the coding overhead received so far (while the transmission is in progress), 0 < γ < γ_succ, so γ × k is the number of output symbols received at the decoder. In LT-AF coding we exploit the feedback channel to obtain a much smaller γ_succ for finite k.
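As a concrete reading of these definitions (the numbers are only illustrative, taken from the γ_succ values annotated in the results figure later): with k = 1000 input symbols and γ_succ = 1.14, about 1.14 × 1000 = 1140 received output symbols suffice for successful decoding with high probability, and a decoder that has so far received 800 output symbols is at overhead γ = 800/1000 = 0.8.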

6 LT-AF codes. Ω_{k,n}(·) denotes the degree distribution of LT-AF codes for a data block of length k when n input symbols have already been recovered at the decoder. We adopt the idea of SLT codes [6] and propose to shift Ω_{k,n}(·) based on n. The decoder issues the first type of feedback, referred to as fb1, which is used to keep the encoder updated with the current value of n. As an acknowledgment, the encoder generates an output symbol of degree one (containing a randomly selected input symbol). [6] A. Hagedorn, S. Agarwal, D. Starobinski, and A. Trachtenberg, "Rateless coding with feedback," IEEE INFOCOM 2009, pp. 1791–1799, 2009. http://www.powercam.cc/slide/24647
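A minimal encoder-side sketch of this acknowledgment behavior (and of the fb2 reply described on later slides). The class and field names are hypothetical, and the shifting of Ω_{k,n}(·) is only indicated by updating n:

import random
from dataclasses import dataclass

@dataclass
class Feedback:
    kind: str            # "fb1" or "fb2"
    n: int = 0           # fb1: number of input symbols recovered so far at the decoder
    requested: int = -1  # fb2: index of the input symbol the decoder asks for

class LTAFEncoder:
    def __init__(self, data):
        self.data = data         # the k input symbols
        self.k = len(data)
        self.n = 0               # decoder's last reported n; Omega_{k,n} is shifted based on it

    def on_feedback(self, fb):
        """Return the degree-one output symbol sent in reaction to a received feedback."""
        if fb.kind == "fb1":
            self.n = fb.n                    # update the encoder's view of n
            idx = random.randrange(self.k)   # fb1 acknowledgment: a randomly selected input symbol
        else:                                # fb2: the decoder requested a specific input symbol
            idx = fb.requested
        return (idx, self.data[idx])         # (index, value) pair, i.e. a degree-one output symbol

# e.g.: LTAFEncoder(list("abcdefgh")).on_feedback(Feedback("fb1", n=3))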

7 Inefficiency of LT Codes for our Problem. [Figure: an encoder streams slightly more than k encoded symbols to the decoding host, which decodes the input symbols.] n out of the k input symbols are known a priori at the decoding host, so many of the encoded symbols are redundant. http://www.powercam.cc/slide/24647

8 Inefficiency of LT Codes for our Problem. The number of these redundant encoded symbols grows with the ratio of input symbols known at the decoder (n) to the total number of input symbols (k). If n input symbols are known a priori, then an additional LT-encoded symbol provides no new information to the decoding host with a probability that quickly approaches 1 as n → k. http://www.powercam.cc/slide/24647
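To make the missing expression concrete (this is a reconstruction from the surrounding text, not necessarily the exact formula on the original slide): if an encoded symbol of degree d picks its d neighbors uniformly at random without replacement, then all d of its neighbors fall inside the n already-known input symbols, and the symbol provides no new information, with probability at least C(n,d)/C(k,d) ≈ (n/k)^d. Averaged over the degree distribution, this probability tends to 1 as n → k.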

9 Intuitive Fix. The n known input symbols serve the function of degree-one encoded symbols, disproportionately skewing the degree distribution seen by LT encoding. We therefore propose to shift the Robust Soliton distribution to the right in order to compensate for these additional, functionally degree-one symbols. Questions: 1) How? 2) By how much? http://www.powercam.cc/slide/24647

10 Shifted Code Construction. Definition: the shifted robust soliton distribution is a right-shifted version of the robust soliton distribution (its exact expression is given in [6]), where k is the number of input symbols in the system, n is the number of input symbols already known at the decoder, and round(·) rounds to the nearest integer. Intuition: the n known input symbols at the decoding host reduce the degree of each encoding symbol by an expected fraction n/k. http://www.powercam.cc/slide/24647
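A hedged reconstruction of that definition, following the intuition just stated and my reading of [6] (the original slide's exact notation may differ): let μ_{k−n}(·) denote the robust soliton distribution over the k − n still-unknown input symbols; the shifted robust soliton distribution then places probability μ_{k−n}(j) on degree round(j × k/(k − n)) for j = 1, …, k − n, and probability 0 on every other degree. In other words, degrees are stretched by the factor k/(k − n), so that after the expected reduction by the fraction n/k the effective degrees again follow the robust soliton distribution for the k − n unknown symbols.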

11 Shifted Code Distribution. [Figure: the LT code distribution and the proposed Shifted code distribution, with parameters k = 1000, c = 0.01, δ = 0.5; the number of known input symbols at the decoding host is set to n = 900 for the Shifted code distribution.] The probabilities of occurrence of encoded symbols of some degrees are 0 under the shifted code distribution. http://www.powercam.cc/slide/24647
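A small numerical sketch of the same setup: it builds the robust soliton distribution and a shifted distribution for k = 1000, c = 0.01, δ = 0.5, n = 900. The shifted form follows the reconstruction sketched after the previous slide, so the output is illustrative rather than a reproduction of the figure.

import math

def robust_soliton(k, c, delta):
    """Return the robust soliton distribution as a dict degree -> probability."""
    R = c * math.log(k / delta) * math.sqrt(k)
    rho = {d: (1.0 / k if d == 1 else 1.0 / (d * (d - 1))) for d in range(1, k + 1)}
    tau = {}
    spike = int(round(k / R))
    for d in range(1, k + 1):
        if d < spike:
            tau[d] = R / (d * k)
        elif d == spike:
            tau[d] = R * math.log(R / delta) / k
        else:
            tau[d] = 0.0
    Z = sum(rho[d] + tau[d] for d in range(1, k + 1))   # normalization constant
    return {d: (rho[d] + tau[d]) / Z for d in range(1, k + 1)}

def shifted_robust_soliton(k, n, c, delta):
    """Shift mu_{k-n} to the right by the factor k/(k-n) (reconstruction of the SLT idea [6])."""
    base = robust_soliton(k - n, c, delta)
    shifted = {d: 0.0 for d in range(1, k + 1)}
    for j, p in base.items():
        shifted[int(round(j * k / (k - n)))] += p
    return shifted

mu = shifted_robust_soliton(1000, 900, 0.01, 0.5)
print(sum(1 for p in mu.values() if p == 0.0), "degrees have probability 0")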

12 LT-AF codes. We propose to employ the Ideal Soliton distribution in LT-AF coding, and to exploit the feedback channel to request a suitable input symbol (which arrives as an output symbol of degree one). The decoder requests desired input symbols employing the second type of feedback, referred to as fb2. The encoder generates a degree-one output symbol if and only if it has received a fb1 or a fb2.
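For reference, the standard Ideal Soliton distribution over k input symbols is ρ(1) = 1/k and ρ(d) = 1/(d(d − 1)) for d = 2, …, k, so its mass is concentrated on low degrees. As the next slides state, LT-AF removes the degree-one mass (Ω_{k,n}(1) = 0) and instead produces degree-one output symbols only in response to a fb1 or a fb2.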

13 LT-AF codes. The lack of arrival of a degree-one output symbol at the decoder after issuing a fb1 or fb2 clearly indicates a feedback loss. Thus all feedback packet losses are identified by the decoder, and a feedback retransmission is performed. Consequently, the decoding recovery rate of LT-AF codes does not considerably degrade even at high feedback-channel loss rates ε_fb ∈ [0, 1). A degree-one output symbol generated at the encoder after a fb1 contains a randomly selected input symbol, whereas after a fb2 it contains the requested input symbol (selected by the decoder).
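A minimal decoder-side sketch of this loss-detection rule, under the assumption that the decoder treats the very next received output symbol as the expected acknowledgment and retransmits otherwise (the paper's exact timing rule is not spelled out on the slide):

class FeedbackTracker:
    """Decoder-side detection of lost feedbacks (sketch)."""
    def __init__(self, send_feedback):
        self.send_feedback = send_feedback   # callback that transmits a feedback packet
        self.pending = None                  # feedback awaiting its degree-one acknowledgment

    def issue(self, fb):
        self.pending = fb
        self.send_feedback(fb)

    def on_output_symbol(self, degree):
        # A degree-one symbol is only ever generated in response to a fb1/fb2,
        # so its arrival acts as an acknowledgment of the pending feedback.
        if self.pending is not None:
            if degree == 1:
                self.pending = None                   # the feedback got through
            else:
                self.send_feedback(self.pending)      # assume the feedback was lost; retransmit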

14 LT-AF codes. Let Ω_{k,n}(x) = Σ_d Ω_{k,n}(d) x^d, where Ω_{k,n}(d) is the probability of selecting degree d to generate an LT-AF output symbol. Since we do not allow the encoder to generate any degree-one symbol on its own, we set Ω_{k,n}(1) = 0.

15 LT-AF codes. The average degree of a check node (output symbol) generated employing the Ω_{k,n}(·) distribution is β_{k,n} = Σ_d d · Ω_{k,n}(d).

16 Generating fb1. The encoder is not always aware of n unless its knowledge of n is updated by a fb1. Initially, the encoder assumes n = 0 and employs the degree distribution Ω_{k,0}(·) to generate output symbols. Let n_r be the most recent value of n reported by a fb1. We propose that the decoder generate a fb1 whenever the average degree of Ω_{k,n}(·) has increased by at least a fixed amount relative to Ω_{k,n_r}(·). n_i denotes the threshold such that the i-th fb1 is generated once n ≥ n_i.
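A sketch of how the thresholds n_i could be computed, assuming the trigger is an increase of the average degree of Ω_{k,n}(·) by at least some fixed amount delta; both delta and the toy average-degree curve below are assumptions for illustration:

import math

def fb1_thresholds(k, avg_degree, delta=1.0):
    """Return the values n_1 < n_2 < ... at which the decoder issues a fb1.

    avg_degree(k, n) must return the mean degree of Omega_{k,n}; delta is the
    (assumed) minimum increase in average degree that triggers a new fb1."""
    thresholds = []
    n_r = 0                                   # last reported n
    for n in range(1, k):
        if avg_degree(k, n) - avg_degree(k, n_r) >= delta:
            thresholds.append(n)
            n_r = n
    return thresholds

# toy example with a made-up average-degree curve (for illustration only):
print(fb1_thresholds(100, lambda k, n: math.log(k / (k - n))))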

17 Generating fb1. [Figure not reproduced in the transcript.]

18 Generating fb1. [Figure: fb1 thresholds n_i versus k.] For k = 10^2, the first and second fb1's are issued at n ≥ 39 and n ≥ 58; for k = 10^4, they are issued at n ≥ 2740 and n ≥ 4346. The ratio n_i / k decreases as k increases.

19 Generating fb2. Transmission of fb2's before γ = 1 does not considerably contribute to the decoding progress, so we propose to generate fb2's only after γ surpasses 1. To keep the fb2's uniformly spread out and to avoid feedback-channel congestion, an LT-AF decoder issues a fb2 on the reception of every (ln k)-th output symbol.
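A sketch of this schedule; the exact rounding of ln k and the choice of counting from the (k+1)-th received symbol are assumptions:

import math

def should_issue_fb2(num_received, k):
    """Decide, upon receiving the num_received-th output symbol, whether to send a fb2.

    fb2's start only once the overhead gamma = num_received / k exceeds 1, and are then
    issued on every (ln k)-th received symbol (the rounding here is an assumption)."""
    if num_received <= k:                     # gamma <= 1: no fb2 yet
        return False
    interval = max(1, round(math.log(k)))
    return (num_received - k) % interval == 0

# example: with k = 1000, ln k is about 6.9, so a fb2 goes out roughly every 7th symbol after the 1000th
print([m for m in range(1000, 1030) if should_issue_fb2(m, 1000)])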

20 Generating fb2. 1) Generating fb2 based on the Variable node with Maximum Degree (VMD): the decoder issues a fb2 requesting the variable node with the highest degree in its current decoding graph G. VMD greedily removes the largest possible number of edges from G and decreases the degree of many check nodes. Notation: V_un is the set of remaining undecoded variable nodes; C_buff is the set of buffered check nodes with a degree higher than one; the ripple is the set of check nodes of degree 1.
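A minimal sketch of the VMD rule, assuming the decoder keeps, for every buffered check node, the set of still-undecoded variable nodes it is connected to. The node names match the example graph on the next slides, but the edge sets below are illustrative, not the paper's exact figure:

def vmd_request(v_un, c_buff):
    """Pick the undecoded variable node with the highest degree in the decoding graph.

    v_un:   set of undecoded variable-node ids
    c_buff: dict check-node id -> set of variable-node ids it is connected to"""
    degree = {v: 0 for v in v_un}
    for neighbors in c_buff.values():
        for v in neighbors:
            if v in degree:
                degree[v] += 1
    return max(degree, key=degree.get)    # the variable node whose recovery removes the most edges

# toy graph loosely modeled on Fig. 2:
c_buff = {"c1": {"v1", "v5"}, "c2": {"v2", "v5", "v6"}, "c3": {"v3", "v5"},
          "c4": {"v4", "v5", "v7"}, "c5": {"v5", "v6"}, "c6": {"v1", "v5"}, "c7": {"v5", "v7"}}
print(vmd_request({"v1", "v2", "v3", "v4", "v5", "v6", "v7"}, c_buff))   # -> v5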

21 Generating fb2. In the example graph (Fig. 2), VMD would request v5. On the arrival of c8, containing only v5, the value of v5 becomes known. This removes all the edges emanating from v5 to the other check nodes and reduces some of them to degree 1 (they are added to the ripple); in particular, c7 is added to the ripple, which recovers v7 in the next decoding iteration. Fig. 2: the bipartite graph representing the input and the output symbols of an LT-AF code in the buffer of a decoder; graph G at the decoder at γ = 1 for k = 7.

22 [Figure: the decoding graph G before and after the arrival of c8, with variable nodes v1 to v7 and check nodes c1 to c8.]

23 [Figure: the graph after v5 is recovered.] The ripple: {c7}; C_buff = {c1, c2, ..., c6}; V_un = {v1, v2, v3, v4, v6}.

24 Generating fb2. 2) Generating fb2 based on Full Variable node Decoding (FVD): a more complex way to generate a fb2 is to run a dummy decoding for every unrecovered input symbol; the single input symbol whose delivery would result in the highest number of decoded input symbols is the one requested by the fb2. FVD has a much higher complexity than VMD.
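A sketch of the FVD rule under the same representation as the VMD sketch above (a dict mapping buffered check nodes to their variable-node neighbors); the dummy decoding is a plain peeling pass and is illustrative rather than the paper's exact procedure:

def peel(c_buff, known):
    """Run peeling/BP: repeatedly use degree-one check nodes to recover variables."""
    c_buff = {c: set(vs) - known for c, vs in c_buff.items()}
    recovered = set(known)
    progress = True
    while progress:
        progress = False
        for c, vs in c_buff.items():
            if len(vs) == 1:                      # a degree-one check node (in the ripple)
                v = next(iter(vs))
                recovered.add(v)
                progress = True
                for vs2 in c_buff.values():       # remove the recovered variable everywhere
                    vs2.discard(v)
    return recovered

def fvd_request(v_un, c_buff):
    """Request the variable whose delivery unlocks the most peeling-decoder progress."""
    best_v, best_gain = None, -1
    for v in v_un:                                # dummy decoding for each candidate symbol
        gain = len(peel(c_buff, {v}))
        if gain > best_gain:
            best_v, best_gain = v, gain
    return best_v
# ties are broken arbitrarily; FVD scans all of v_un, hence its higher cost compared to VMD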

25 Simulation Results. Our results are obtained with the Monte Carlo method, by averaging over the results of at least 10^7 numerical simulations. We report the decoding bit error rate (BER), i.e., the average ratio of unrecovered input symbols to the total number of input symbols, 1 − E[n/k], for k = 1000. We set c = 0.9 and δ = 0.1 for SLT and LT codes, as proposed in [6]. [6] A. Hagedorn, S. Agarwal, D. Starobinski, and A. Trachtenberg, "Rateless coding with feedback," IEEE INFOCOM 2009, pp. 1791–1799, 2009.

26 LT-AF Decoding Error Rate and Runtime. [Figure: decoding error rate versus received overhead γ; the annotated overheads for successful decoding are γ_succ = 1.09, γ_succ = 1.14, and γ_succ = 1.31.]

27 LT-AF Decoding Error Rate and Runtime. In LT-AF coding the average degree of output symbols is much higher than that of regular LT codes, which causes a higher encoding/decoding complexity.

28 Number of Feedbacks. Not only do LT-AF codes decrease the coding overhead required for successful decoding, γ_succ, they also need a slightly smaller number of feedbacks than SLT codes. The total number of feedbacks is much smaller than γ_succ × k.

29 Robustness to Erasure in the Feedback Channel. Assume that the loss rate of the feedback channel is ε_fb = 0.9 (unknown to both encoder and decoder), so 90% of the feedbacks are lost in transmission. To the best of our knowledge, robustness against feedback loss had not been considered in any existing work, which significantly distinguishes LT-AF codes. Fig. 5: effect of 90% feedback loss on the performance of SLT and LT-AF codes employing VMD. Note that the curves representing the LT-AF codes' performance for ε_fb = 0 and ε_fb = 0.9 fully overlap for all γ.

30 Conclusions. We proposed LT-AF codes, LT codes with two types of feedback, which alleviate the poor performance of LT codes at short data-block lengths. We showed that LT-AF codes require a lower coding overhead for successful decoding and a smaller number of feedbacks.

31 References
[6] A. Hagedorn, S. Agarwal, D. Starobinski, and A. Trachtenberg, "Rateless coding with feedback," IEEE INFOCOM 2009, pp. 1791–1799, 2009.
[7] A. Kamra, V. Misra, J. Feldman, and D. Rubenstein, "Growth codes: Maximizing sensor network data persistence," SIGCOMM Computer Communication Review, vol. 36, no. 4, pp. 255–266, 2006.
[8] A. Beimel, S. Dolev, and N. Singer, "RT oblivious erasure correcting," IEEE/ACM Transactions on Networking, vol. 15, pp. 1321–1332, Dec. 2007.
[9] S. Kokalj-Filipovic, P. Spasojevic, E. Soljanin, and R. Yates, "ARQ with doped fountain decoding," ISSSTA, pp. 780–784, Aug. 2008.
[10] J. Sørensen, P. Popovski, and J. Østergaard, "On the role of feedback in LT codes," arXiv preprint arXiv:1012.2673, 2010.
[11] J. H. Sorensen, T. Koike-Akino, and P. Orlik, "Rateless feedback codes," IEEE ISIT, pp. 1767–1771, 2012.

