
1 Multiple Description Coding and Distributed Source Coding: Unexplored Connections in Information Theory and Coding Theory. S. Sandeep Pradhan, Department of EECS, University of Michigan, Ann Arbor (joint work with R. Puri and K. Ramchandran of the University of California, Berkeley).

2 Transmission of sources over packet networks. Diagram: X → encoder → packets 1, 2, …, n → packet erasure network → decoder → X̂. Best-effort networks, such as the Internet running the User Datagram Protocol (UDP), are modeled as packet erasure channels. Multimedia over the Internet is growing fast.

3 Multiple description source coding. Diagram: the MD encoder maps X into description 1 (rate R1) and description 2 (rate R2); side decoder 1 achieves distortion D1, side decoder 2 achieves distortion D2, and the central decoder, which receives both descriptions, achieves distortion D0. Goal: find the set of all achievable tuples (R1, R2, D1, D2, D0).

4 Prior work. Information theory (incomplete list): Cover-El Gamal '80: achievable rate region for 2-channel MD. Ozarow '81: converse for Gaussian sources. Berger, Ahlswede, Zhang ('80s-'90s). Venkataramani et al. '01: extension of the Cover-El Gamal region to n channels. Finite-blocklength codes (incomplete list): Vaishampayan '93: MD scalar and vector quantizers. Wang-Orchard-Reibman '97: MD transform codes. Goyal-Kovacevic '98: frames for MD. Puri-Ramchandran '01: FEC for MD.

5 Main idea in random codes for 2-channel MD (Cover-El Gamal). Fix a joint p.m.f. p(x, x1, x2). Generate codebook 1 from the marginal p(x1) and codebook 2 from p(x2). Find a pair of codewords that is jointly typical with the source word with respect to p(x, x1, x2). Possible if: R1 ≥ I(X;X1), R2 ≥ I(X;X2), and R1 + R2 ≥ I(X;X1,X2) + I(X1;X2).

6 Possible ideas for n-channel MD? Extend the Cover-El Gamal random codes from 2 to n channels (Venkataramani et al.). Use maximum distance separable (MDS) erasure codes (Albanese et al. '95).

7 Erasure codes. An (n, k, d) erasure code adds (n − k) parity symbols to k source symbols. MDS codes achieve d = n − k + 1, so any k received channel symbols recover the k source symbols. Diagram: source → encoder → n packets → channel → a received subset → decoder → source. Plot: distortion vs. number of received packets shows a "cliff" effect: distortion stays high until k packets arrive, then drops all at once.
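A minimal sketch of the "any k of n" property and the cliff effect, using a toy (3,2) single-parity erasure code (the simplest MDS example; the function names are illustrative):

```python
# Sketch: a (3,2) MDS erasure code via XOR parity (toy example).
# Any 2 of the 3 packets suffice to recover both source packets; with
# fewer than 2, nothing is recoverable -- the "cliff" effect.

def encode(p1: bytes, p2: bytes) -> list:
    parity = bytes(a ^ b for a, b in zip(p1, p2))
    return [p1, p2, parity]

def decode(packets: dict) -> tuple:
    """packets maps index (0, 1, 2) -> payload for the packets that arrived."""
    if len(packets) < 2:
        raise ValueError("below the cliff: k=2 packets are required")
    if 0 in packets and 1 in packets:
        return packets[0], packets[1]
    known = 0 if 0 in packets else 1          # the surviving systematic packet
    recovered = bytes(a ^ b for a, b in zip(packets[known], packets[2]))
    return (packets[0], recovered) if known == 0 else (recovered, packets[1])

p1, p2, parity = encode(b"ABCD", b"WXYZ")
assert decode({0: p1, 2: parity}) == (b"ABCD", b"WXYZ")   # packet 1 erased
assert decode({1: p2, 2: parity}) == (b"ABCD", b"WXYZ")   # packet 0 erased
```

With one packet nothing decodes; with any two, both source packets are recovered exactly, which is precisely the cliff in the distortion curve.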

8 Fix: use many MDS codes. Example for 3 channels: a successively refinable source-encoded bit stream is split into layers; the first layer is protected by a (3,1) MDS erasure code, the second by a (3,2) code, and the third by a (3,3) code, and the pieces are distributed across descriptions 1, 2, and 3. Distortion then decreases in steps as 1, 2, and 3 packets are received (Albanese et al. '95, Puri-Ramchandran '99).
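A sketch of this packetization for 3 descriptions, with assumed layer sizes; the (3,2) stage reuses the same toy XOR parity as above:

```python
# Sketch of the layered-MDS packetization for 3 descriptions (layer splits
# are assumed for illustration). Layer 1 (most important) uses a (3,1)
# repetition code, layer 2 a (3,2) parity code, layer 3 is sent as (3,3).

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def packetize(layer1: bytes, layer2: bytes, layer3: bytes) -> list:
    h = len(layer2) // 2                  # layer 2 split into two halves
    t = len(layer3) // 3                  # layer 3 split into three thirds
    l2 = [layer2[:h], layer2[h:], xor(layer2[:h], layer2[h:])]   # (3,2)
    l3 = [layer3[:t], layer3[t:2 * t], layer3[2 * t:]]           # (3,3)
    return [layer1 + l2[i] + l3[i] for i in range(3)]            # (3,1)

pkts = packetize(b"BASE", b"MIDDLE", b"TOPLAYER!")
# Any 1 packet -> layer 1; any 2 -> layers 1-2; all 3 -> everything.
```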

9 What is new in our work? Symmetric problem with more than 2 descriptions. We explore a fundamental connection between MD coding and distributed source coding. New rate region for MD: random binning inspired by distributed source coding. Constructions for MD: an extension of our earlier work (DISCUS) on the construction of coset codes for distributed source coding. Outline of our strategy: start from an MDS erasure code viewed from a different perspective; connect this code to a distributed source coding problem; construct random codes based on this connection, the (n,k) source-channel erasure codes; obtain a new rate region for MD as a concatenation of these (n,k) source-channel erasure codes.

10 Idea #1: a new look at (n,1,n) MDS codes. An (n,1,n) "bit" code sends identical packets 1, 2, …, n (repetition). Reception of any one packet enables reconstruction, but reception of more than one packet does not give better quality: the parity bits are wasted.

11 Idea #1 (contd.): an (n,1,n) source-channel erasure code instead puts an independently quantized version of X on every packet. Reception of any one packet enables reconstruction, and reception of more packets enables better reconstruction (estimation gains due to multiple looks!).
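The estimation gain can be illustrated with subtractive dithered quantizers, whose quantization errors are roughly independent across packets; all parameter values below are assumed for illustration:

```python
# Sketch: estimation gain from multiple independently quantized looks at X.
# Each packet carries X quantized with an independent subtractive dither,
# so the errors are independent and averaging the received looks helps.
import numpy as np

rng = np.random.default_rng(0)
L, n, delta = 100_000, 4, 1.0            # samples, packets, step size
x = rng.standard_normal(L)

descriptions = []
for _ in range(n):
    dither = rng.uniform(-delta / 2, delta / 2, size=L)
    y = delta * np.round((x + dither) / delta) - dither   # dithered quantizer
    descriptions.append(y)

for m in range(1, n + 1):
    est = np.mean(descriptions[:m], axis=0)   # average the received looks
    print(f"{m} packet(s): MSE = {np.mean((x - est) ** 2):.4f}")
# The MSE falls roughly like (delta^2 / 12) / m as m grows.
```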

12 Extensions to (n,k) source-channel codes. Can we generalize this to (n,k) source-channel codes? Yes, with a random binning (coset code) approach, using the Slepian-Wolf and Wyner-Ziv theorems. Going from an (n,1) code to an (n,k) code is a conceptual leap that uses binning.

13 Idea #2: consider a (3,2,2) MDS code. There is inherent uncertainty at the encoder about which packets are received by the decoder. This calls for a coding strategy in which the decoder has access to some information that the encoder does not: distributed source coding.

14 Background: distributed source coding (Slepian-Wolf '73, Wyner-Ziv '76, Berger '77). X and Y are correlated sources, compressed by separate encoders and reconstructed by a joint decoder; the correlation is exploited without direct communication between the encoders. The optimal rate region was given by Slepian and Wolf in 1973.

15 Distributed source coding (contd.). The Slepian-Wolf rate region: RX ≥ H(X|Y), RY ≥ H(Y|X), RX + RY ≥ H(X,Y). It is achieved with random partitions (bins) of the typical sets.
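The constructive counterpart of random binning is syndrome (coset) coding. A sketch with the (7,4) Hamming code, assuming the correlation noise e = X ⊕ Y has Hamming weight at most 1:

```python
# Sketch: Slepian-Wolf coding of X with side information Y at the decoder,
# implemented as syndrome (coset) coding with the (7,4) Hamming code.
# Assumes the correlation noise e = X xor Y has Hamming weight <= 1.
import numpy as np

H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])           # parity-check matrix

def syndrome(v):
    return tuple(H @ v % 2)

# Precompute: syndrome -> minimum-weight error pattern (weight 0 or 1).
table = {syndrome(np.zeros(7, int)): np.zeros(7, int)}
for i in range(7):
    e = np.zeros(7, int); e[i] = 1
    table[syndrome(e)] = e

x = np.array([1, 0, 1, 1, 0, 0, 1])             # source word
y = np.array([1, 0, 1, 0, 0, 0, 1])             # side info: differs in 1 bit

s = syndrome(x)                                 # encoder sends 3 bits, not 7
e_hat = table[tuple((np.array(s) + H @ y) % 2)] # decoder: syndrome of x+y
x_hat = (y + e_hat) % 2
assert np.array_equal(x_hat, x)                 # X recovered from bin + Y
```

The encoder sends only the 3-bit bin (syndrome) index instead of the 7 source bits, matching the intuition that roughly H(X|Y) bits suffice.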

16 Idea #2 (contd.): are there any telltale signs of symmetric overcomplete partitioning in a (3,2,2) MDS code? The message set {00, 01, 10, 11} is partitioned three ways, each partition labeled by one output bit (0/1): by the first bit ({00, 01} vs. {10, 11}), by the second bit ({00, 10} vs. {01, 11}), and by their XOR ({00, 11} vs. {01, 10}).
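A quick check of this reading, treating each output bit of the (3,2,2) code as the bin index of a two-way partition:

```python
# Sketch: the three partitions induced by a (3,2,2) binary MDS code.
# Message (b1, b2) emits bits b1, b2, b1^b2; each bit is a bin index of a
# two-way partition of {00, 01, 10, 11}. Check: any 2 of the 3 bin indices
# identify the message uniquely (overcomplete by one partition).
from itertools import combinations, product

def bins(b1, b2):
    return (b1, b2, b1 ^ b2)      # three bin indices for message (b1, b2)

for i, j in combinations(range(3), 2):        # every pair of partitions
    seen = {}
    for b1, b2 in product((0, 1), repeat=2):
        key = (bins(b1, b2)[i], bins(b1, b2)[j])
        assert key not in seen                # no two messages share a pair
        seen[key] = (b1, b2)
print("any 2 of the 3 bin indices determine the message")
```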

17 Idea #2 (contd.): instead of a single codebook, build 3 different codebooks (quantizers) and then partition them (overcompletely). Plot: distortion vs. number of received packets (0, 1, …, k, …, n).

18 Problem formulation. X enters encoders 1 through n; encoder i produces packet i, which is sent over a packet erasure channel to a single decoder that outputs X̂. This is an (n,k) source-channel erasure code: the decoder starts reconstruction once m ≥ k packets arrive; every packet is transmitted at the same rate; the distortion is only a function of the number of received packets; the formulation is symmetric, with n > 2.

19 Problem formulation: notation. Source X ~ q(x) with a given alphabet and blocklength L; bounded distortion measure d. Encoder i: a map from the length-L source block to an index in {1, …, 2^(LR)}. Decoder: for each subset S of received packets, a map from the corresponding indices to a reconstruction X̂. Distortion with h packets = the expected distortion E[d(X, X̂)] when |S| = h.

20 Problem statement (contd.). What is the best distortion tuple for a rate of R bits/sample/packet? Main result: a distortion tuple is achievable at rate R if there exist a p.m.f. over auxiliary random variables and a set of decoding functions satisfying single-letter rate and distortion conditions.

21 Example: a (3,2) code. The Yi have the same p.d.f. Three codebooks, each of rate I(X;Yi), are constructed randomly. Each is partitioned into 2^(LR) bins, and the number of codewords in a bin is exponential in (L/2)·I(Y1;Y2). Thus 2R = I(X;Y1) + I(X;Y2) − I(Y1;Y2).
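A numeric sketch of this rate expression for an assumed Gaussian test channel Yi = X + Zi (the noise variance below is illustrative):

```python
# Sketch: evaluating 2R = I(X;Y1) + I(X;Y2) - I(Y1;Y2) for an assumed
# Gaussian test channel Y_i = X + Z_i, with X ~ N(0,1) and independent
# quantization noises Z_i ~ N(0, N). Parameter values are illustrative.
from math import log2

N = 0.25                                   # assumed noise variance
I_XY = 0.5 * log2((1 + N) / N)             # I(X; Y_i), same for i = 1, 2
rho = 1 / (1 + N)                          # correlation coefficient of Y1, Y2
I_Y1Y2 = -0.5 * log2(1 - rho ** 2)         # I(Y1; Y2)

R = (2 * I_XY - I_Y1Y2) / 2                # rate per packet
print(f"I(X;Yi) = {I_XY:.3f}, I(Y1;Y2) = {I_Y1Y2:.3f}, R = {R:.3f} bits")
```

The binning discount I(Y1;Y2) is exactly what lets each packet ride below the full marginal rate I(X;Yi).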

22 Example: a Gaussian source with a (3,2,2) code at 1 bit/sample/packet. Plot: distortion vs. number of received packets.

23 Idea #3: n-channel symmetric MD as a concatenation of (n,1), (n,2), …, (n,n) source-channel erasure codes. For n = 3, packet i carries Y1i from the (3,1) code, Y2i from the (3,2) code, and a part of Y3 from the (3,3) code. Base layer: reception of any one packet decodes the (3,1) code. Middle layer: for the (3,2) code, the side information includes some one part of the middle layer (source-channel erasure codes!) and also some two parts of the base layer. Final layer: the (3,3) code refines everything. Every part of the bit stream contributes to the source reconstruction.

24 Key concepts. Multiple quantizers which can introduce correlated quantization noise: MD lattice VQ (Vaishampayan, Sloane, Diggavi '01). Computationally efficient multiple binning schemes: symmetric distributed source coding using coset codes (Pradhan-Ramchandran '00; Schonberg, Pradhan, Ramchandran '03). Note: these differ from single binning schemes (Zamir-Shamai '98, Pradhan-Ramchandran '99).

25 A (3,2) Source Channel Lattice Code. Y1, Y2, Y3 are correlated quantized versions of the source X, with d(Yi, Yj) ≤ 2.

26 A (3,2) Source Channel Lattice Code (contd.): a code of distance 5 overcomes correlation noise of magnitude 2.


28 A (3,2) Source Channel Lattice Code (contd.): partitioning through cosets is the constructive counterpart of "random bins".

29 A (3,2) Source Channel Lattice Code: asymmetric case. Suppose 2 observations Y1 and Y2, with Y2 available at the decoder. A code that combats the correlation noise ensures decoding.
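A one-dimensional sketch of this asymmetric case, using the integer lattice code 5Z as the distance-5 code and assuming integer-valued quantized observations with |Y1 − Y2| ≤ 2:

```python
# Sketch: asymmetric (coset) decoding with the 1-D distance-5 code 5Z.
# The encoder sends only the coset index of Y1 modulo 5 (log2(5) bits
# instead of the full value); the decoder, holding Y2 with |Y1 - Y2| <= 2,
# picks the member of that coset nearest to Y2. Since 5 > 2 * 2, decoding
# is always correct.

def encode_coset(y1: int) -> int:
    return y1 % 5                              # bin index = coset of 5Z

def decode_coset(coset: int, y2: int) -> int:
    base = (y2 - coset) // 5                   # a nearby coset member
    candidates = [(base + k) * 5 + coset for k in (-1, 0, 1)]
    return min(candidates, key=lambda c: abs(c - y2))

for y1 in range(-20, 21):                      # exhaustive check
    for noise in (-2, -1, 0, 1, 2):
        assert decode_coset(encode_coset(y1), y1 + noise) == y1
```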

30-32 A (3,2) Source Channel Lattice Code: symmetric case. Suppose 2 observations Y1 and Y2. Split the generator vectors of the code: encoder 1 gets the rows and encoder 2 gets the columns.

33 A (3,2) Source Channel Lattice Code: now Y1, Y2, Y3 are independently quantized versions of the source X, with d(Yi, Yj) ≤ 2.

34-35 A (3,2) Source Channel Lattice Code: find 3 generator vectors such that any two of them generate the code (any two are linearly independent). Encoder 1 gets the rows, encoder 2 the columns, and encoder 3 the diagonal.

36 Constructions for general n and k. Choose a code (generator matrix G) that combats the correlation noise. Split the rows of G into k submatrices (k generator sets S1, …, Sk); e.g., G1 = [5 0] and G2 = [0 5]. We need a way to generate n generator sets out of these k such that any k of them are equivalent to G. Choose the generator matrix M (dimension k × n) of an (n,k) MDS block code, which has the property that any k of its columns are linearly independent; e.g., for n = 3 and k = 2, M = [1 0 1; 0 1 1].

37 Constructions for general n and k (contd.). Using the weights from the n columns of M, one column at a time, linearly combine the k generator sets (S1, …, Sk) to produce n encoding matrices; e.g., G1 = [5 0], G2 = [0 5], G3 = [5 5]. Efficient algorithms for encoding and decoding exist within the coset code framework (Forney '91).
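A sketch verifying the splitting property for this example: any two of G1, G2, G3 should regenerate the whole code 5Z², which can be checked with determinants (G3 = G1 + G2 follows from the third column of the M assumed on the previous slide):

```python
# Sketch: checking the (3,2) splitting of the 2-D code 5Z^2. Any two of
# G1, G2, G3 must generate all of 5Z^2: each pair spans a sublattice of
# 5Z^2 (all entries are multiples of 5), and |det| = 25 = det(5Z^2)
# means the sublattice has index 1, i.e., it is the whole code.
from itertools import combinations

G = {1: (5, 0), 2: (0, 5), 3: (5, 5)}     # encoding vectors, G3 = G1 + G2

for i, j in combinations(G, 2):
    (a, b), (c, d) = G[i], G[j]
    assert abs(a * d - b * c) == 25, (i, j)
print("any 2 of the 3 generator sets regenerate the distance-5 code")
```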

38 Conclusions. A new rate region for the n-channel MD problem. A new connection between the MD problem and the distributed source coding problem. A new application of multiple binning schemes. Constructions based on coset codes. A nice synergy between quantization and MDS erasure codes.

