Linear Codes for Distributed Source Coding: Reconstruction of a Function of the Sources - D. Krithivasan and S. Sandeep Pradhan - University of Michigan, Ann Arbor


1 Linear Codes for Distributed Source Coding: Reconstruction of a Function of the Sources - D. Krithivasan and S. Sandeep Pradhan - University of Michigan, Ann Arbor

2 Presentation Overview: Problem Formulation, Motivation, Nested Linear Codes, Main Result, Applications and Examples, Conclusions

3 Problem Formulation: Distributed Source Coding. Typical application: sensor networks. Example: lossless reconstruction of all the sources requires a sum rate equal to the joint entropy H(X, Y) (the Slepian-Wolf bound).

4 Problem Formulation: We ask: what if the decoder is interested only in a function of the sources? In general, the fidelity criterion is defined on a function of the sources rather than on the sources themselves. Example: the average of the sensor measurements. Obvious strategy: reconstruct the sources and then compute the function. Are rate gains possible if we directly encode the function in a distributed setting?

5 Motivation: A Binary Example
Korner and Marton: reconstruction of the mod-2 sum Z = X ⊕ Y of two correlated binary sources.
Centralized encoder:
–Compute Z^n = X^n ⊕ Y^n.
–Compress Z^n with a good source encoder at rate H(Z).
Suppose the encoder map satisfies linearity, i.e., the encoding of X^n ⊕ Y^n equals the XOR of the encodings of X^n and Y^n: the centralized scheme then becomes a distributed scheme.
Are there good source codes with this property?
–Linear codes. (A numerical comparison of the resulting sum rates is sketched below.)
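The gain can be made concrete with a small calculation. The following is a minimal sketch, not taken from the slides; the doubly symmetric binary source (X uniform, Y = X ⊕ N with N ~ Bernoulli(p)) is an assumed example, used to compare the Korner-Marton sum rate 2 H(X ⊕ Y) with the Slepian-Wolf sum rate H(X, Y) = 1 + h(p).

```python
# Hedged sketch: Korner-Marton vs. Slepian-Wolf sum rate for an assumed
# doubly symmetric binary source (X ~ Bernoulli(1/2), Y = X xor N, N ~ Bernoulli(p)).
import math

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.01, 0.05, 0.11, 0.25):
    km_sum_rate = 2 * h(p)   # both encoders send at rate H(Z) = h(p)
    sw_sum_rate = 1 + h(p)   # H(X, Y) = H(X) + H(Y|X) = 1 + h(p)
    print(f"p={p:.2f}  Korner-Marton sum rate={km_sum_rate:.3f}  Slepian-Wolf sum rate={sw_sum_rate:.3f}")
```

For any p other than 1/2, the sum rate 2 h(p) falls below 1 + h(p), which is the rate gain the slides refer to.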

6 The Korner-Marton Coding Scheme
Choose a binary matrix A such that:
–The decoder can recover z^n from A z^n with high probability.
–Entropy achieving: A has about n H(Z) rows, so the transmission rate is close to H(Z).
Encoders transmit A x^n and A y^n.
Decoder: computes A x^n ⊕ A y^n = A (x^n ⊕ y^n) = A z^n and recovers z^n with high probability.
Rate pair (H(Z), H(Z)) is achievable. Its sum can be lower than the Slepian-Wolf bound H(X, Y).
Scheme works for addition in any finite field. (The linearity identity the decoder relies on is sketched below.)
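Only as an illustration of the linearity the decoder exploits, not a construction of a good code, the sketch below uses a randomly drawn binary matrix A as a stand-in for the entropy-achieving matrix and checks that the XOR of the two transmitted syndromes equals the syndrome of z; the block length, syndrome size, and correlation model are assumptions.

```python
# Hedged sketch: over GF(2), A x xor A y = A (x xor y), so the decoder obtains
# the syndrome of z = x xor y from the two separately computed syndromes.
# A random A is only a placeholder for the parity-check matrix of a good code.
import numpy as np

rng = np.random.default_rng(0)
n, k = 12, 6                              # block length, syndrome length (assumed)
A = rng.integers(0, 2, size=(k, n))

x = rng.integers(0, 2, size=n)            # encoder 1's source block
noise = (rng.random(n) < 0.1).astype(int)
y = (x + noise) % 2                       # correlated source block at encoder 2
z = (x + y) % 2                           # the mod-2 sum the decoder wants

s_x = (A @ x) % 2                         # transmitted by encoder 1
s_y = (A @ y) % 2                         # transmitted by encoder 2
assert np.array_equal((s_x + s_y) % 2, (A @ z) % 2)
print("xor of the two syndromes equals the syndrome of z")
# Recovering z itself from its syndrome additionally requires A to be the
# parity-check matrix of a good channel code for additive noise distributed as Z.
```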

7 Properties of the Linear Code
Matrix A puts different typical z^n sequences in different bins.
Consider the kernel of A, i.e., the code of sequences with zero syndrome – a coset code.
It is a good channel code for the additive-noise channel whose noise has the distribution of Z.
Both encoders use identical codebooks:
–Binning is completely "correlated".
–Independent binning is more prevalent in information theory.

8 Slepian-Wolf Coding
Function to be reconstructed: the pair of sources itself.
Treat the binary sources as Z4 sources; the function is then equivalent to addition in Z4.
Encode the vector-valued function one digit at a time.
[Original slide: a table giving the first digit and the second digit of the embedded Z4 symbol.]
(A sketch of one possible embedding appears below.)
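The table from the original slide is not reproduced above. As a hedged illustration, the sketch below shows one natural embedding of the pair (x, y) into Z4, namely (x, y) -> 2x + y; this specific mapping is an assumption, not read off the slide. Its base-2 digits are exactly the two source bits, so reconstructing the Z4-valued function digit by digit recovers both sources.

```python
# Hedged sketch of one possible Z4 embedding (assumed): map the source pair
# (x, y) to the Z4 symbol 2*x + y.  Its first (least significant) digit is y
# and its second digit is x, so encoding the symbol one digit at a time is a
# way of realizing Slepian-Wolf coding within this framework.
def embed(x, y):
    return (2 * x + y) % 4

def digits(z4):
    return z4 % 2, (z4 // 2) % 2          # (first digit, second digit)

for x in (0, 1):
    for y in (0, 1):
        z4 = embed(x, y)
        d0, d1 = digits(z4)
        assert (d0, d1) == (y, x)
        print(f"(x={x}, y={y}) -> Z4 symbol {z4}, digits ({d0}, {d1})")
```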

9 Slepian-Wolf Coding contd.
Use the Korner-Marton coding scheme on each digit plane.
The sequential strategy achieves the Slepian-Wolf bound.
General lossless strategy:
–"Embed" the function in a digit plane field (DPF).
–DPF: a direct sum of Galois fields of prime order.
–Encode the digits sequentially using the Korner-Marton strategy.

10 Lossy Coding
Quantize each source.
Compute the best estimate of the desired function with respect to the distortion measure, given the quantized versions.
Use lossless coding to encode this estimate.
What we need: nested linear codes.

11 Nested Linear Codes
Codes used in the Korner-Marton and Slepian-Wolf schemes are good channel codes:
–Cosets bin the entire space.
–Suitable for lossless coding.
Lossy coding: need to quantize first.
–Decrease the coset density.

12 Nested Linear Codes
Codes used in the Korner-Marton and Slepian-Wolf schemes are good channel codes:
–Cosets bin the entire space.
–Suitable for lossless coding.
Lossy coding: need to quantize first.
–Decrease the coset density – nested linear codes.
–Fine code: quantizes the source.
–Coarse code: bins only the fine code.

13 Nested Linear Codes
A pair of linear codes is nested if the coarse code is a subcode of the fine code.
We need:
–Fine code: a "good" source code – for a typical source sequence, a jointly typical codeword can be found in it.
–Coarse code: a "good" channel code – within a given coset, the jointly typical codeword is unique with high probability.
(A toy example of a nested pair is sketched below.)
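The nesting requirement can be seen on a toy example. The generator matrices below are arbitrary small choices, not taken from the slides; the sketch only checks that the coarse code sits inside the fine code and counts the cosets (bins) of the coarse code within the fine code.

```python
# Toy sketch of nested linear codes over GF(2) with assumed generator matrices.
import itertools
import numpy as np

G_fine = np.array([[1, 0, 0, 1, 1, 0],
                   [0, 1, 0, 1, 0, 1],
                   [0, 0, 1, 0, 1, 1]])   # fine code: 2^3 = 8 codewords
G_coarse = G_fine[:1, :]                  # coarse code generated by one row: 2 codewords

def span(G):
    """All GF(2) linear combinations of the rows of G."""
    k = G.shape[0]
    return {tuple((np.array(m) @ G) % 2) for m in itertools.product((0, 1), repeat=k)}

C_fine, C_coarse = span(G_fine), span(G_coarse)
assert C_coarse <= C_fine                 # nesting: every coarse codeword is a fine codeword

# Cosets of the coarse code partition the fine code; these cosets act as the bins.
bins = {frozenset(tuple((np.array(c) + np.array(d)) % 2) for d in C_coarse) for c in C_fine}
print(f"|fine| = {len(C_fine)}, |coarse| = {len(C_coarse)}, number of bins = {len(bins)}")
```

In the actual construction the fine code must additionally be a good source code and the coarse code a good channel code, properties a toy example of this size cannot exhibit.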

14 Good Linear Source Codes
A "good" linear source code is defined with respect to a triple (source, quantizer variable, joint distribution).
The code alphabet size is assumed to be a prime power.
Such a code exists for large block length provided its rate is large enough.
It is not a good source code in the Shannon sense:
–It contains a subset that is a good Shannon source code.
Linearity incurs a rate loss (in bits/sample) compared with the Shannon-optimal rate.

15 Good Linear Channel Codes
A "good" linear channel code is defined with respect to a triple (channel input, observation, joint distribution).
The code alphabet size is assumed to be a prime power.
Such a code exists for large block length provided its rate is small enough.
It is not a good channel code in the Shannon sense:
–Every coset contains a subset which is a good channel code.
Linearity incurs a rate loss (in bits/sample) compared with the Shannon-optimal rate.

16 Main Result
Fix a test channel (auxiliary quantizer variables) consistent with the sources and with the function to be reconstructed.
Embed the reconstruction function in a digit plane field; its digits are what need to be encoded.
Fix an order of encoding of the digit planes.
Idea: encode one digit at a time. At the b-th stage, use the previously reconstructed digits as side information.

17 Coding Strategy
For each digit plane: good linear source codes serve as the fine codes at the encoders, and a good linear channel code serves as the coarse code used for binning.

18 Cardinalities of the Linear Code
The cardinalities of the nested codes determine the rate of each encoder, which is compared with the rate of conventional coding. (See the sketch below.)
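As a hedged aside, the following is the standard bookkeeping for nested-code binning rather than the slide's own expressions: the encoder sends the index of the coset of the coarse code that contains its chosen fine codeword, so its rate follows from the two cardinalities.

```python
# Minimal sketch (assumed toy parameters): encoder rate for nested-code binning
# is (1/n) * log2(|fine| / |coarse|) bits per sample.
import math

n = 6                        # block length (assumed)
k_fine, k_coarse = 3, 1      # GF(2) dimensions of the fine and coarse codes (assumed)

size_fine, size_coarse = 2 ** k_fine, 2 ** k_coarse
rate = math.log2(size_fine / size_coarse) / n
print(f"|fine| = {size_fine}, |coarse| = {size_coarse}, rate = {rate:.3f} bits/sample")
```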

19 Coding Theorem
The scheme yields an achievable rate region for reconstructing a function of the sources, together with a corollary used in the examples that follow.

20 Nested Linear Codes Achieve the Rate-Distortion Bound
Choose one of the sources to be a constant; it follows that the point-to-point rate-distortion bound is achievable for any distortion level.
Can also recover:
–Berger-Tung inner bound.
–Wyner-Ziv rate region.
–Wyner's source coding with side information.
–Slepian-Wolf and Korner-Marton rate regions.

21 Lossy Coding of the Mod-2 Sum
Fix test channels that add independent binary random variables to the sources.
Reconstruct the mod-2 sum of the two quantized versions.
Using the corollary to the rate region, an explicit rate pair can be achieved.
More rate points can be achieved by:
–Choosing more general test channels.
–Embedding the function in a larger field.
(A sketch of the distortion such test channels yield is shown below.)
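Purely as an illustration under assumptions the slide does not spell out (Hamming distortion; test channels that flip each source independently with probabilities q1 and q2; reconstruction equal to the mod-2 sum of the quantized sources), the sketch below computes the expected distortion such a choice yields.

```python
# Hedged sketch: with U = X xor Q1 and V = Y xor Q2 (Q1, Q2 independent
# Bernoulli flips), the estimate Zhat = U xor V satisfies Zhat xor (X xor Y)
# = Q1 xor Q2, so the expected Hamming distortion is P(Q1 xor Q2 = 1).
def binary_convolution(a, b):
    """P(Q1 xor Q2 = 1) for independent Bernoulli(a) and Bernoulli(b)."""
    return a * (1 - b) + b * (1 - a)

for q1, q2 in [(0.05, 0.05), (0.10, 0.05), (0.10, 0.10)]:
    print(f"q1={q1:.2f}, q2={q2:.2f}  ->  expected distortion {binary_convolution(q1, q2):.4f}")
```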

22 Conclusions
Presented a unified approach to distributed source coding.
Involves the use of nested linear codes.
Coding: quantization followed by "correlated" binning.
Recovers the known rate regions for many problems.
Presents new rate regions for other problems.

