
1 Compression with Side Information using Turbo Codes. Anne Aaron and Bernd Girod, Information Systems Laboratory, Stanford University. Data Compression Conference, April 3, 2002.

2 Overview: Introduction; Turbo Coder and Decoder; Compression of Binary Sequences; Extension to Continuous-Valued Sequences; Joint Source-Channel Coding; Conclusion.

3 Distributed Source Coding. Two statistically dependent sources X and Y are compressed by separate encoders and reconstructed by a joint decoder. Slepian-Wolf theorem: lossless recovery is possible whenever R_X ≥ H(X|Y), R_Y ≥ H(Y|X), and R_X + R_Y ≥ H(X,Y).

4 Research Problem. Motivation: the Slepian-Wolf theorem states that statistically dependent signals can be compressed in a distributed manner to the same total rate as a system that compresses them jointly. Objective: design practical codes that achieve compression close to the Slepian-Wolf bound.
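For the binary-symmetric dependency used in the talk's simulations (P(Xi ≠ Yi) = p), the corner point of the Slepian-Wolf region can be evaluated directly. A minimal Python sketch; the helper names are ours, not from the presentation:

```python
import math

def h2(p):
    """Binary entropy in bits; h2(0) = h2(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def slepian_wolf_corner(p):
    """Asymmetric corner of the Slepian-Wolf region for uniform binary
    X, Y with crossover probability p = P(X != Y): send Y at
    R_Y = H(Y) = 1 bit and X at R_X = H(X|Y) = h2(p) bits."""
    return h2(p), 1.0

r_x, r_y = slepian_wolf_corner(0.11)   # r_x is roughly 0.5 bit per bit
```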

5 Asymmetric Scenario: Compression with Side Information. The statistically dependent sources X and Y are encoded separately; compression techniques that send Y at a rate close to H(Y) are well known, so the interesting problem is encoding X with Y available only at the decoder. Some type of switching between the roles of X and Y can achieve more symmetric rates.

6 Our Approach: Turbo Codes. Turbo codes were developed for channel coding and perform close to the Shannon channel capacity limit (Berrou et al., 1993). Similar work: Garcia-Frias and Zhao, 2001 (University of Delaware); Bajcsy and Mitran, 2001 (McGill University).

7 System Set-up. X and Y are i.i.d. binary sequences X1 X2 … XL and Y1 Y2 … YL with equally probable ones and zeros. Xi is independent of Yj for i ≠ j but dependent on Yi, and the dependency is described by the pmf P(x|y). Y is sent at rate R_Y ≥ H(Y) and is available as side information at the decoder.
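This dependency model is easy to simulate: draw Y uniformly and flip each bit independently with probability p to obtain X. A sketch (function name and seed are ours):

```python
import random

def correlated_pair(L, p, seed=0):
    """i.i.d. uniform bits Y; X differs from Y in each position
    independently with probability p, so P(X_i != Y_i) = p."""
    rng = random.Random(seed)
    y = [rng.randint(0, 1) for _ in range(L)]
    x = [b ^ (rng.random() < p) for b in y]
    return x, y

x, y = correlated_pair(10000, 0.05)
crossover = sum(a != b for a, b in zip(x, y)) / len(x)  # close to 0.05
```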

8 Turbo Coder. The L input bits enter a systematic convolutional encoder directly and, after a length-L interleaver, a second systematic convolutional encoder. The systematic outputs of both encoders are discarded; only the parity bits are kept for transmission.
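The compressed rate follows from this structure: a systematic rate-k/n constituent encoder emits n/k − 1 parity bits per input bit, and only the parity streams are sent. A sketch of the arithmetic (helper name ours), using the rate-4/5 codes quoted later in the talk:

```python
def parity_bits_sent(L, k_over_n=4 / 5, encoders=2):
    """Bits transmitted for L input bits when each systematic rate-k/n
    constituent encoder keeps only its parity output."""
    parity_per_input_bit = 1 / k_over_n - 1   # 1/4 for a rate-4/5 code
    return encoders * parity_per_input_bit * L

n = parity_bits_sent(1000)   # 500 bits, i.e. R_X = 0.5 bit per input bit
```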

9 Turbo Decoder. The incoming bits first pass through channel-probability calculations that supply P_channel to two SISO decoders. The decoders iterate, exchanging extrinsic information: each one's P_extrinsic, passed through the length-L interleaver or deinterleaver, becomes the other's P_a priori. After the iterations, a decision based on P_a posteriori produces the L output bits.

10 Simulation: Binary Sequences. X-Y relationship: P(Xi = Yi) = 1 - p and P(Xi ≠ Yi) = p. System: 16-state, rate-4/5 constituent convolutional codes; R_X = 0.5 bit per input bit with no puncturing. Theoretically, it must be possible to send X without error when H(X|Y) ≤ 0.5.
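The condition H(X|Y) ≤ 0.5 pins down the largest crossover probability for which lossless decoding is possible at R_X = 0.5. Since the binary entropy h2 is increasing on [0, 0.5], the threshold can be found by bisection (a sketch; names ours):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def crossover_for_rate(r, tol=1e-9):
    """Largest p <= 0.5 with h2(p) <= r, by bisection."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h2(mid) <= r:
            lo = mid
        else:
            hi = mid
    return lo

p_star = crossover_for_rate(0.5)   # about 0.110
```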

11 Results: Compression of Binary Sequences. [Plot: decoding performance at R_X = 0.5; the system operates about 0.15 bit from the Slepian-Wolf bound.]

12 Results for Different Rates. The parity bits were punctured to achieve lower rates. With R_Y = H(Y) = 1 bit, where H(X|Y) is the largest conditional entropy at which X was recovered with negligible error:

R_X     H(X|Y)   R_X - H(X|Y)   R_X / H(X|Y)   (R_X + R_Y) / H(X,Y)   (R_X + R_Y) / 2
0.125   0.089    0.036          1.404          1.033                  0.563
0.250   0.185    0.065          1.351          1.055                  0.625
0.500   0.346    0.154          1.445          1.114                  0.750
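The derived columns of the table follow from R_X and the measured H(X|Y) once R_Y = H(Y) = 1 bit is fixed, which makes the entries easy to check (a sketch; the column interpretation is our reading of the table):

```python
rows = [  # (R_X, measured H(X|Y))
    (0.125, 0.089),
    (0.250, 0.185),
    (0.500, 0.346),
]

derived = []
for r_x, h_xy in rows:
    gap = r_x - h_xy                    # distance to the Slepian-Wolf bound
    ratio = r_x / h_xy                  # rate expansion over H(X|Y)
    joint = (r_x + 1.0) / (h_xy + 1.0)  # (R_X + R_Y) / H(X,Y), with R_Y = H(Y) = 1
    avg = (r_x + 1.0) / 2               # average rate per source bit
    derived.append((gap, ratio, joint, avg))
```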

13 Extension to Continuous-Valued Sequences. X and Y are sequences of i.i.d. continuous-valued random variables X1 X2 … XL and Y1 Y2 … YL. Xi is independent of Yj for i ≠ j but dependent on Yi, and the dependency is described by the pdf f(x|y). Y is known as side information at the decoder. Each of the L input values is quantized to one of 2^M levels, the resulting L symbols are converted to ML bits, and these bits pass through the turbo coder to the decoder.
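The front end described above (quantize each sample to one of 2^M levels, then serialize the index bits for the turbo coder) can be sketched as follows; the thresholds in the example are illustrative, not the Lloyd-Max values:

```python
from bisect import bisect_right

def quantize_to_bits(samples, thresholds):
    """Map each sample to a cell index via sorted decision thresholds
    (len(thresholds) == 2**M - 1), then emit the index as M bits,
    most significant bit first."""
    m = (len(thresholds) + 1).bit_length() - 1   # 2**m quantizer levels
    bits = []
    for s in samples:
        idx = bisect_right(thresholds, s)
        bits.extend((idx >> k) & 1 for k in reversed(range(m)))
    return bits

# 4 levels (M = 2), thresholds at -1, 0, 1: indices 0..3 -> 00 01 10 11
bits = quantize_to_bits([-2.0, -0.5, 0.5, 2.0], [-1.0, 0.0, 1.0])
```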

14 Simulation: Gaussian Sequences. X-Y relationship: X is a sequence of i.i.d. Gaussian random variables, and Yi = Xi + Zi, where Z is also a sequence of i.i.d. Gaussian random variables, independent of X; f(x|y) is then a Gaussian probability density function. System: 4-level Lloyd-Max scalar quantizer; 16-state, rate-4/5 constituent convolutional codes; no puncturing, so the rate is 1 bit per source sample.
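The correlation channel of this simulation is straightforward to set up; var(Z) follows from the CSNR defined on the next slide. A sketch (function name and seed are ours):

```python
import math
import random

def gaussian_pair(L, csnr_db, seed=1):
    """X ~ N(0, 1) i.i.d.; Y = X + Z with Z ~ N(0, var_z) independent
    of X, where CSNR = var(X) / var(Z), expressed in dB."""
    rng = random.Random(seed)
    sigma_z = math.sqrt(10 ** (-csnr_db / 10))
    x = [rng.gauss(0.0, 1.0) for _ in range(L)]
    y = [xi + rng.gauss(0.0, sigma_z) for xi in x]
    return x, y

x, y = gaussian_pair(20000, 10.0)                         # 10 dB CSNR
var_z = sum((b - a) ** 2 for a, b in zip(x, y)) / len(x)  # close to 0.1
```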

15 Results: Compression of Gaussian Sequences. CSNR is the ratio of the variances of X and Z. [Plot: performance at R_X = 1 bit per sample; the gap to the theoretical bound is about 2.8 dB.]

16 Joint Source-Channel Coding. Assume that the parity bits pass through a memoryless channel with capacity C. The channel statistics can be included in the decoder's calculation of P_channel. From the Slepian-Wolf theorem and the definition of channel capacity, the rate is bounded by R_X ≥ H(X|Y) / C.
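For a BSC parity channel with crossover q, folding the channel statistics into P_channel amounts to computing the parity-bit LLRs from q, while the side information contributes LLRs through the crossover p; the rate bound combines the binary entropy with the BSC capacity. A sketch with our own helper names (the talk does not give this code):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def parity_llr(received_bit, q):
    """LLR of a parity bit observed through a BSC with crossover q;
    this is the channel term fed to the SISO decoders."""
    return (1 - 2 * received_bit) * math.log((1 - q) / q)

def side_info_llr(y_bit, p):
    """LLR of X_i given side information Y_i with P(X_i != Y_i) = p."""
    return (1 - 2 * y_bit) * math.log((1 - p) / p)

def min_rate(p, q):
    """Parity rate needed in principle: R_X >= H(X|Y) / C,
    with C = 1 - h2(q) for the BSC."""
    return h2(p) / (1 - h2(q))

r = min_rate(0.11, 0.03)   # about 0.62 bit per source bit
```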

17 Results: Joint Source-Channel Coding. [Plot: R_X = 0.5 with the parity bits sent over a BSC with crossover probability q = 0.03; annotated gaps to the theoretical limits of 0.12 bit and 0.15 bit.]

18 Conclusion. Turbo codes can be used for compression of binary sequences and perform close to the Slepian-Wolf bound for lossless distributed source coding. The system also applies to compression of distributed continuous-valued sequences, where it performs better than previous techniques, and it extends easily to joint source-channel coding.

