1 S. Sandeep Pradhan, University of Michigan, Ann Arbor; joint work with K. Ramchandran, Univ. of California, Berkeley. A comprehensive view of duality in multiuser source coding and channel coding

2 Acknowledgements: Jim Chou (Univ. of California), Phillip Chou (Microsoft Research), David Tse (Univ. of California), Pramod Viswanath (Univ. of Illinois), Michael Gastpar (Univ. of California), Prakash Ishwar (Univ. of California), Martin Vetterli (EPFL)

3 Outline
– Motivation, related work and background
– Duality between source and channel coding: role of the source distortion measure & channel cost measure
– Extension to the case of side information
– MIMO source coding and channel coding with one-sided collaboration
– Future work: extensions to multiuser joint source-channel coding
– Conclusions

4 Motivation
– Expanding applications of MIMO source and channel coding
– Explore a unifying thread through these diverse problems
– We consider source coding with side information (SCSI) and channel coding with side information (CCSI) as functional duals
– We consider four problems:
  1. Distributed source coding
  2. Broadcast channel coding
  3. Multiple description source coding
  4. Multiple access channel coding

5 It all starts with Shannon: “There is a curious and provocative duality between the properties of a source with a distortion measure and those of a channel. This duality is enhanced if we consider channels in which there is a ‘cost’ associated with the different input letters, and it is desired to find the capacity subject to the constraint that the expected cost not exceed a certain quantity…”

6 Related work (incomplete list)
Duality between source coding and channel coding:
– Shannon (1959)
– Csiszar and Korner (textbook, 1981)
– Cover & Thomas (textbook, 1991): covering vs. packing
– Eyuboglu and Forney (1993): quantization vs. modulation, boundary/granular gains vs. shaping/coding gains
– Laroia, Farvardin & Tretter (1994): SVQ versus shell mapping
Duality between source coding with side information (SCSI) and channel coding with side information (CCSI):
– Chou, Pradhan & Ramchandran (1999)
– Barron, Wornell and Chen (2000)
– Su, Eggers & Girod (2000)
– Cover and Chiang (2001)

7 Notation: source coding
[Diagram: X → Encoder → Decoder → X̂]
Source alphabet with distribution p(x); reconstruction alphabet; distortion measure d(x, x̂); distortion constraint D.
Encoder: maps source sequences to indices; Decoder: maps indices to reconstruction sequences.
Rate-distortion function R(D) = minimum rate of representing X with distortion D:
R(D) = min_{p(x̂|x): E[d(X, X̂)] ≤ D} I(X; X̂)
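The rate-distortion function above can be computed numerically for discrete sources with the classical Blahut-Arimoto algorithm. A minimal sketch (not from the slides; the function name and the binary Hamming example are illustrative), tracing one point of the R(D) curve at a fixed Lagrange slope s:

```python
import numpy as np

def rate_distortion(p_x, d, s, n_iter=500):
    """Blahut-Arimoto: one point of R(D) at Lagrange slope s.

    p_x : source distribution over |X| symbols
    d   : |X| x |Xhat| distortion matrix
    s   : slope (> 0); larger s trades toward smaller distortion
    Returns (R in bits, D) on the rate-distortion curve.
    """
    nx, nxh = d.shape
    q = np.full(nxh, 1.0 / nxh)            # output marginal q(xhat)
    A = np.exp(-s * d)
    for _ in range(n_iter):
        cond = q * A                        # optimal test channel for current q
        cond /= cond.sum(axis=1, keepdims=True)
        q = p_x @ cond                      # re-estimate output marginal
    D = np.sum(p_x[:, None] * cond * d)
    R = np.sum(p_x[:, None] * cond * np.log2(cond / q))
    return R, D

# Binary uniform source, Hamming distortion: R(D) = 1 - h(D)
p_x = np.array([0.5, 0.5])
d = np.array([[0.0, 1.0], [1.0, 0.0]])
R, D = rate_distortion(p_x, d, s=2.0)
h = lambda p: -p * np.log2(p) - (1 - p) * np.log2(1 - p)
print(R, D, 1 - h(D))   # R matches the closed form 1 - h(D)
```

The slope s parametrizes the tangent to the R(D) curve; sweeping s traces the whole curve.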

8 Channel coding
[Diagram: m → Encoder → Channel → Decoder → m̂]
Input and output alphabets; conditional distribution p(y|x); cost measure w(x); cost constraint W.
Encoder: maps message indices to channel input sequences; Decoder: maps channel output sequences to indices.
Capacity-cost function C(W) = maximum rate of reliable communication with cost W:
C(W) = max_{p(x): E[w(X)] ≤ W} I(X; Y)
Note: the source encoder and the channel decoder are mappings with the same domain and range; similarly, the channel encoder and the source decoder share the same domain and range.
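The capacity-cost function admits the same numerical treatment: Blahut-Arimoto with a Lagrangian term for the input cost. A minimal sketch (function name and the BSC example are my own; λ = 0 recovers unconstrained capacity as a sanity check):

```python
import numpy as np

def capacity_cost(P_yx, w, lam, n_iter=1000):
    """Blahut-Arimoto with an input-cost Lagrangian.

    P_yx : |X| x |Y| channel matrix p(y|x)
    w    : per-letter input cost w(x)
    lam  : Lagrange multiplier (lam = 0 ignores the cost)
    Returns (C in bits, achieved expected cost W) for this lam.
    """
    nx = P_yx.shape[0]
    p = np.full(nx, 1.0 / nx)
    w = np.asarray(w, dtype=float)
    for _ in range(n_iter):
        q = p @ P_yx                                   # output marginal
        Dkl = np.sum(P_yx * np.log(P_yx / q), axis=1)  # KL(p(y|x) || q), nats
        p = p * np.exp(Dkl - lam * w)
        p /= p.sum()
    q = p @ P_yx
    C = np.sum(p * np.sum(P_yx * np.log2(P_yx / q), axis=1))
    return C, float(p @ w)

eps = 0.1
bsc = np.array([[1 - eps, eps], [eps, 1 - eps]])
C, W = capacity_cost(bsc, w=[0.0, 1.0], lam=0.0)
h = lambda x: -x * np.log2(x) - (1 - x) * np.log2(1 - x)
print(C, W)   # C ~ 1 - h(0.1) ~ 0.531 bits, W = 0.5 (uniform input)
```

Sweeping λ > 0 traces out the C(W) curve, mirroring how the slope s traces R(D) in the source-coding case.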

9 Gastpar, Rimoldi & Vetterli ’00: To code or not to code?
[Diagram: S → Encoder f(·) → Channel p(y|x) → Decoder g(·) → Ŝ]
For a given pair p(s) and p(y|x), there exists a distortion measure and a cost measure such that uncoded mappings at the encoder and decoder are optimal in terms of end-to-end achievable performance.
Bottom line: any source can be “matched” optimally to any channel if you are allowed to pick the distortion and cost measures for the source and channel.
This is the inspiration for the cost-function/distortion-measure analysis that follows.

10 Role of distortion measures (Fact 1)
[Diagram: X → Quantizer → X̂]
Given a source p(x), let p*(x̂|x) be some arbitrary quantizer. Then there exists a distortion measure d(x, x̂) such that p*(x̂|x) achieves the rate-distortion bound at distortion D = E[d(X, X̂)].
Bottom line: any given quantizer is the optimal quantizer for any source, provided you are allowed to pick the distortion measure.
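Fact 1 can be checked numerically with one standard construction (assumed here; the slide does not spell it out): take d(x, x̂) = −log(p*(x̂|x)/p*(x̂)). The given quantizer then satisfies the Blahut-Arimoto fixed-point conditions at slope 1, which for this convex problem certify optimality:

```python
import numpy as np

# An arbitrary discrete source and an arbitrary quantizer p*(xhat|x)
p_x = np.array([0.2, 0.5, 0.3])
Q = np.array([[0.7, 0.1, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.2, 0.7]])
q_star = p_x @ Q                       # induced reconstruction marginal

# Matched distortion measure (slope s = 1 construction):
#   d(x, xhat) = -log( p*(xhat|x) / p*(xhat) )
d = -np.log(Q / q_star)

# Fixed-point check: the Blahut-Arimoto update at q = q_star returns Q
# itself, so p* meets the optimality conditions for this distortion.
cond = q_star * np.exp(-d)
cond /= cond.sum(axis=1, keepdims=True)
gap_channel = np.max(np.abs(cond - Q))
gap_marginal = np.max(np.abs(p_x @ cond - q_star))
print(gap_channel, gap_marginal)       # both ~0
```

Note this particular d can take negative values; shifting it by a per-letter constant d0(x) leaves the optimizing quantizer unchanged.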

11 Role of cost measures (Fact 2)
[Diagram: X → Channel → Y]
Given a channel p(y|x), let p*(x) be some arbitrary input distribution. Then there exists a cost measure w(x) such that p*(x) achieves the capacity-cost bound at cost W = E[w(X)].
Bottom line: any given input distribution is the optimal input for any channel, provided you are allowed to pick the cost measure.
Now we are ready to characterize duality.

12 Duality between classical source and channel coding
Theorem 1a: For a given source coding problem with source p(x), distortion measure d(x, x̂) and distortion constraint D, let the optimal quantizer be p*(x̂|x), inducing (via Bayes’ rule) the marginal p*(x̂) and the reverse conditional p*(x|x̂).
[Diagram: optimal quantizer as a test channel]

13 Then there exists a unique dual channel coding problem with channel input alphabet X̂, output alphabet 𝒳, cost measure w(x̂) and cost constraint W, such that:
(i) R(D) = C(W);
(ii) the channel is p*(x|x̂), i.e. the optimal quantizer’s test channel with the ORDER REVERSED, and its optimal input distribution is p*(x̂), where w(x̂) and W are matched to d and D.

14 Interpretation of functional duality
For a given source coding problem, we can associate a specific channel coding problem such that:
– both problems induce the same optimal joint distribution;
– the optimal encoder for one is functionally identical to the optimal decoder for the other, in the limit of large block length;
– an appropriate channel cost measure is associated.
Source coding: the distortion measure is as important as the source distribution.
Channel coding: the cost measure is as important as the channel conditional distribution.

15 Source coding with side information
[Diagram: X → Encoder → Decoder → X̂, with side information S available at the decoder]
The encoder needs to compress the source X; the decoder has access to correlated side information S.
Studied by Slepian-Wolf ’73, Wyner-Ziv ’76, Berger ’77.
Applications: sensor networks, digital upgrade, diversity coding for packet networks.

16 Channel coding with side information
[Diagram: m → Encoder → Channel → Decoder → m̂, with side information S available at the encoder]
The encoder has access to some information S related to the statistical nature of the channel, and wishes to communicate over this cost-constrained channel.
Studied by Gelfand-Pinsker ’81, Costa ’83, Heegard-El Gamal ’85.
Applications: watermarking, data hiding, precoding for known interference, multiantenna broadcast channels.

17 Duality (loose sense)
SCSI: side information at the decoder only; the source code is “partitioned” into a bank of channel codes.
CCSI: side information at the encoder only; the channel code is “partitioned” into a bank of source codes.

18 Source coding with side information at the decoder (SCSI) (Wyner-Ziv ’76)
[Diagram: X → Encoder → Decoder → X̂, side information S at the decoder, auxiliary random variable U]
Conditional source p(x|s), side information p(s), context-dependent distortion measure d(x, x̂, s).
Rate-distortion function:
R*(D) = min I(X; U) − I(S; U)   such that E[d(X, X̂, S)] ≤ D
Intuition (natural Markov chains): U – X – S (the side information S is not present at the encoder) and X – (U, S) – X̂ (the source X is not present at the decoder).
Note: this completely determines the optimal joint distribution p(s, x, u, x̂).

19 SCSI, Gaussian example (reconstruction of X − S):
Conditional source: X = S + V, p(v) ~ N(0, N)
Side information: p(s) ~ N(0, Q)
Distortion measure: mean squared error on the reconstruction of (x − s)
Test channel: U = X + q (additive quantization noise q)
Decoder: MMSE estimator of X − S from (U, S)
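A quick Monte Carlo check of this example (the parameter values are my own choices): the decoder's MMSE estimate of X − S from (U, S) hits the predicted distortion D = Nσ²/(N + σ²), and the corresponding rate 1/2·log2(N/D) is the same as if the encoder also knew S, i.e. the Wyner-Ziv no-rate-loss property:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
Qv, N, se2 = 4.0, 1.0, 0.25         # var(S), var(V), test-channel noise var

S = rng.normal(0, np.sqrt(Qv), n)
V = rng.normal(0, np.sqrt(N),  n)   # X = S + V; the target is X - S = V
e = rng.normal(0, np.sqrt(se2), n)  # forward test-channel noise q
U = S + V + e                       # U = X + q

# Decoder sees (U, S): MMSE estimate of V from U - S = V + e
a = N / (N + se2)
Vhat = a * (U - S)
mse = np.mean((Vhat - V) ** 2)

D = N * se2 / (N + se2)             # predicted distortion (= 0.2 here)
R = 0.5 * np.log2(N / D)            # Wyner-Ziv rate, no rate loss
print(mse, D, R)
```

Note the empirical MSE does not depend on the side-information variance Q, since the decoder subtracts S exactly before estimating.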

20 Channel coding with side information at the encoder (CCSI) (Gelfand-Pinsker ’81)
[Diagram: m → Encoder → Channel → Decoder → m̂, side information S at the encoder, auxiliary random variable U]
Conditional channel p(y|x, s), side information p(s), cost measure w(x, s).
Capacity-cost function:
C*(W) = max I(U; Y) − I(U; S)   such that E[w(X, S)] ≤ W
Intuition (natural Markov chains): U – (X, S) – Y (the channel does not care about U), and the decoder works from Y alone (it does not have access to S).
This completely determines the optimal joint distribution.

21 CCSI, Gaussian example (known interference) (Costa ’83):
Conditional channel: Y = X + S + Z, p(z) ~ N(0, N)
Side information: interference p(s) ~ N(0, Q), known at the encoder
Cost measure: power constraint on X
Encoder: U = X + αS with the MMSE scaling α (the MMSE precoder)
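Costa's result can be verified in closed form for jointly Gaussian variables (a sketch with my own parameter values): with α = P/(P + N), the Gelfand-Pinsker rate I(U; Y) − I(U; S) equals the interference-free AWGN capacity for every interference power Q:

```python
import numpy as np

def gp_rate(P, N, Q, alpha):
    """I(U;Y) - I(U;S) for jointly Gaussian U = X + alpha*S, Y = X + S + Z,
    with X ~ N(0,P) independent of S ~ N(0,Q), and Z ~ N(0,N)."""
    vU = P + alpha**2 * Q
    vY = P + Q + N
    cUY = P + alpha * Q
    I_UY = 0.5 * np.log2(vU * vY / (vU * vY - cUY**2))
    I_US = 0.5 * np.log2(vU / (vU - alpha**2 * Q))
    return I_UY - I_US

P, N = 1.0, 1.0
alpha = P / (P + N)                        # Costa's MMSE scaling
awgn = 0.5 * np.log2(1 + P / N)            # interference-free capacity
rates = [gp_rate(P, N, Q, alpha) for Q in (0.1, 1.0, 100.0)]
print(rates, awgn)                         # equal for every Q
```

Picking any α other than P/(P + N) leaves a gap to the AWGN capacity, which is why the MMSE precoder on the slide is the right choice.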

22 SCSI vs. CCSI
[Side-by-side block diagrams: the Gaussian SCSI test channel (encoder, U = X + q, decoder combining U and S) and the Gaussian CCSI channel (encoder forming U from S, channel adding S and Z, decoder)]

23 Theorem 2a: Given a SCSI problem (conditional source, side information, distortion measure, distortion constraint D), find the optimal encoding distribution and reconstruction function that minimize the Wyner-Ziv rate; using Bayes’ rule this induces an optimal joint distribution and a test channel. If the natural CCSI constraint is satisfied:
[Diagram: induced test channel, with encoder X → U and decoder producing X̂]

24 => there exists a dual CCSI problem in which the channel is the induced test channel (order reversed), the side information is S, and the cost measure and cost constraint W are matched to d and D, such that:
(i) the rate-distortion bound equals the capacity-cost bound;
(ii) the induced distributions achieve capacity-cost optimality;
(iii) the two problems share the same optimal joint distribution.

25 Markov chains and duality
[Diagram: the SCSI encoder/decoder pair and the CCSI encoder/decoder pair induce the same joint distribution p(s, x, u, x̂) — DUALITY]

26 Duality implication: generalization of the Wyner-Ziv no-rate-loss case
CCSI (Cohen-Lapidoth 2000, Erez-Shamai-Zamir 2000): extension of Costa’s result to arbitrary S with no rate loss.
New result: Wyner-Ziv’s no-rate-loss result can be extended to arbitrary source and side information, as long as X = S + V with V Gaussian, for the MSE distortion measure.

27 Functional duality in MIMO source and channel coding with one-sided collaboration
– For ease of illustration, we consider a 2-input, 2-output system
– We consider only the sum rate, and a single distortion/cost measure
– We consider functional duality in the distributional sense
– Future & ongoing work: duality in the coding sense

28 MIMO source coding with one-sided collaboration:
[Diagram: Encoder-1, Encoder-2 → Test Channel → Decoder-1, Decoder-2]
Either the encoders or the decoders (but not both) collaborate.
MIMO channel coding with one-sided collaboration:
[Diagram: Encoder-1, Encoder-2 → Channel → Decoder-1, Decoder-2]
Either the encoders or the decoders (but not both) collaborate.

29 Distributed source coding
[Diagram: Encoder-1, Encoder-2 → Test Channel → collaborating decoders]
Two correlated sources with a given joint distribution and a joint distortion measure.
Encoders DO NOT collaborate; decoders DO collaborate.
Problem: for a given joint distortion D, find the minimum sum rate R.
Achievable rate region: Berger ’77.

30 Distributed source coding: achievable sum-rate region:
R ≥ I(X1, X2; U1, U2)   such that E[d] ≤ D,
with Markov chains U1 – X1 – (X2, U2) and U2 – X2 – (X1, U1).
1. The two sources cannot see each other.
2. The decoder cannot see the sources, only (U1, U2).

31 Broadcast channel coding
[Diagram: collaborating encoders → Channel → Decoder-1, Decoder-2]
Broadcast channel with a given conditional distribution p(y1, y2 | x) and a joint cost measure.
Encoders DO collaborate; decoders DO NOT collaborate.
Problem: for a given joint cost W, find the maximum sum rate R.
Achievable rate region: Marton ’79.

32 Broadcast channel coding: achievable sum-rate region:
R ≤ I(U1; Y1) + I(U2; Y2) − I(U1; U2)   such that E[w] ≤ W
1. The channel cares only about its input X.
2. The encoder does not have access to the channel outputs.

33 Duality (loose sense) between distributed source coding and broadcast channel coding
Distributed source coding: collaboration at the decoder only; uses Wyner-Ziv coding, i.e. the source code is “partitioned” into a bank of channel codes.
Broadcast channel coding: collaboration at the encoder only; uses Gelfand-Pinsker coding, i.e. the channel code is “partitioned” into a bank of source codes.

34 Theorem 3a: distributed source coding ↔ broadcast channel coding (DUALITY)
[Diagram relating the joint distributions of the two problems]

35 Example: 2-in-2-out Gaussian linear channel (Caire, Shamai, Yu, Cioffi, Viswanath, Tse)
[Diagram: y = Hx + z]
Marton’s sum rate is shown to be tight. By Sato’s bound, the capacity of the broadcast channel depends only on the noise marginals. For the optimal input distribution, if we keep the noise variances fixed and vary the noise correlation, at one point the cooperative bound is met with equality (the worst-case noise). At this point we have duality!
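The Sato bound step can be illustrated numerically (the matrix H, the input covariance, and the correlation grid are my own choices): since the broadcast sum capacity depends only on the noise marginals, it is upper-bounded by the cooperative two-receiver capacity under the worst noise correlation, which a simple sweep locates:

```python
import numpy as np

H  = np.array([[1.0, 0.5],
               [0.5, 1.0]])
Sx = np.eye(2)                     # a fixed input covariance (unit power)

def coop_capacity(rho):
    """Cooperative (fully collaborating receivers) capacity in bits for
    unit-variance receiver noise with correlation rho."""
    Sz = np.array([[1.0, rho], [rho, 1.0]])
    M = np.eye(2) + np.linalg.solve(Sz, H @ Sx @ H.T)
    return 0.5 * np.log2(np.linalg.det(M))

# Sato bound: minimize over the noise correlation (marginals unchanged)
rhos = np.linspace(-0.99, 0.99, 199)
caps = [coop_capacity(r) for r in rhos]
i = int(np.argmin(caps))
print(rhos[i], caps[i])            # worst-case correlation and its bound
```

A finer search (or a convex solver) would pin down the exact worst-case noise at which the bound is tight and the duality on the slide appears.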

36 Multiple access channel coding with independent message sets
[Diagram: Encoder-1, Encoder-2 → Channel → collaborating decoders]
Multiple access channel with a given conditional distribution p(y | x1, x2) and a joint cost measure.
Encoders DO NOT collaborate; decoders DO collaborate.
Problem: for a given joint cost W, find the maximum sum rate R.
Capacity-cost function (Ahlswede ’71):
R ≤ I(X1, X2; Y)   such that E[w] ≤ W and X1, X2 are independent
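For the scalar Gaussian MAC (my own parameter values), the sum capacity and the successive-cancellation corner point that achieves it can be checked in a few lines:

```python
import numpy as np

P1, P2, N = 1.0, 2.0, 1.0

# Sum capacity of the scalar Gaussian MAC with independent inputs:
#   R1 + R2 <= 1/2 log2(1 + (P1 + P2)/N)
C_sum = 0.5 * np.log2(1 + (P1 + P2) / N)

# Successive cancellation corner point: decode user 2 first, treating
# user 1 as noise; subtract it; then decode user 1 interference-free.
R2 = 0.5 * np.log2(1 + P2 / (N + P1))
R1 = 0.5 * np.log2(1 + P1 / N)
print(R1 + R2, C_sum)              # the corner point meets the sum capacity
```

The telescoping identity (1 + P1/N)(1 + P2/(N + P1)) = 1 + (P1 + P2)/N is exactly the successive-cancellation strategy the duality on the next slides pairs with successive refinement.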

37 Multiple description source coding problem
[Diagram: one encoder feeding side decoders 1 and 2 and central decoder 0]
Another version with essentially the same coding techniques, which is “amenable” to duality:
[Diagram: the same network redrawn]

38 “Multiple description source coding with no-excess sum rate”
[Diagram: collaborating encoders → Test Channel → Decoder-1, Decoder-2]
Two correlated sources with a given joint distribution and a joint distortion measure.
Encoders DO collaborate; decoders DO NOT collaborate.
Problem: for a given joint distortion D, find the minimum sum rate R.
Rate-distortion region (Ahlswede ’85):
R ≥ I(X1, X2; X̂1, X̂2)   such that E[d] ≤ D and X̂1, X̂2 are independent

39 Duality (loose sense) between multiple description coding and multiple access channel coding
MD coding with no excess sum rate: collaboration at the encoder only; uses a successive refinement strategy.
MAC with independent message sets: collaboration at the decoder only; uses a successive cancellation strategy.

40 Theorem 4a: For multiple description coding with no excess sum rate:
Given the source alphabets, the reconstruction alphabets, the joint source distribution and the distortion measure, find the optimal conditional distribution of the reconstructions given the sources; this induces an optimal joint distribution.
Then there exists a dual multiple access channel with: channel distribution given by the induced reverse conditional; input alphabets equal to the reconstruction alphabets; output alphabets equal to the source alphabets; and a matched joint cost measure.

41 Such that:
1) the sum-rate-distortion bound equals the sum capacity-cost bound;
2) the induced input distributions achieve optimality for this MA channel coding problem;
3) the joint cost measure is matched to the distortion measure.
Similarly, for a given MA channel coding problem with independent message sets, there is a dual MD source coding problem with no excess sum rate.

42 Example: Given a Gaussian MA channel y = Hx + z, the sum-capacity optimization yields the optimal independent input distributions and an induced decoder structure.
[Block diagrams: the channel H with additive noise, and the channel followed by a linear transform A at the decoder]

43 Dual MD coding problem:
[Block diagram: encoder followed by the test channel, with H, A and additive noise mirroring the MA channel example]

44 What is addressed in this work:
– duality in empirical per-letter distributions
– extension of the Wyner-Ziv no-rate-loss result to more arbitrary cases
– the underlying connection between four multiuser communication problems
What is left to be addressed:
– duality in optimal source codes and channel codes
– rate loss in dual problems
– joint source-channel coding in dual problems

45 Conclusions
– Distributional relationship between MIMO source & channel coding
– Functional characterization: swappable encoder and decoder codebooks
– Highlighted the importance of source distortion and channel cost measures
– Cross-leveraging of advances in the applications of these fields

