Multiple Description Coding and Distributed Source Coding: Unexplored Connections in Information Theory and Coding Theory
S. Sandeep Pradhan, Department of EECS, University of Michigan, Ann Arbor
(joint work with R. Puri and K. Ramchandran, University of California, Berkeley)

Transmission of sources over packet networks
[Figure: an encoder sends the source X over n packet routes through a packet erasure network to a decoder, which outputs the reconstruction X̂.]
Best-effort networks are modeled as packet erasure channels; example: the Internet with the User Datagram Protocol (UDP). Multimedia over the Internet is growing fast.

Multiple Descriptions Source Coding
[Figure: the MD encoder maps X into Description 1 (rate R1) and Description 2 (rate R2); Side Decoder 1 achieves distortion D1, Side Decoder 2 achieves distortion D2, and the Central Decoder, which receives both descriptions, achieves distortion D0.]
Goal: find the set of all achievable tuples (R1, R2, D1, D2, D0).

Prior Work
Information theory (incomplete list):
– Cover-El Gamal '80: achievable rate region for 2-channel MD.
– Ozarow '81: converse for Gaussian sources.
– Berger, Ahlswede, Zhang, '80-'90.
– Venkataramani et al. '01: extension of the Cover-El Gamal region to n channels.
Finite-block-length codes (incomplete list):
– Vaishampayan '93: MD scalar and vector quantizers.
– Wang-Orchard-Reibman '97: MD transform codes.
– Goyal-Kovacevic '98: frames for MD.
– Puri-Ramchandran '01: FEC for MD.

Main idea in random codes for 2-channel MD (Cover-El Gamal)
Fix a joint p.m.f. p(x, x1, x2). Generate two independent random codebooks with marginals p(x1) and p(x2). Encoding: find a pair of codewords that is jointly typical with the source word with respect to p(x, x1, x2). Possible if R1 + R2 ≥ I(X; X1, X2) + I(X1; X2).
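
For reference (our addition; this states the region as it is usually written for the Cover-El Gamal scheme, with g0, g1, g2 denoting the reconstruction functions):

```latex
% Cover-El Gamal achievable region for 2-channel MD
% (X_1, X_2 auxiliary reconstructions; g_0, g_1, g_2 decoding functions)
\begin{aligned}
R_1 &\ge I(X; X_1), \qquad R_2 \ge I(X; X_2), \\
R_1 + R_2 &\ge I(X; X_1, X_2) + I(X_1; X_2), \\
\mathbb{E}\, d\bigl(X, g_i(X_i)\bigr) &\le D_i \ (i = 1, 2), \qquad
\mathbb{E}\, d\bigl(X, g_0(X_1, X_2)\bigr) \le D_0 .
\end{aligned}
```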

Possible ideas for n-channel MD?
– Extend the Cover-El Gamal random codes from 2 to n channels (Venkataramani et al.).
– Use maximum distance separable (MDS) erasure codes (Albanese et al. '95).

Erasure codes
Erasure codes (n, k, d): add n − k parity symbols to k source symbols. MDS codes achieve d = n − k + 1, so any k received channel symbols recover all k source symbols.
[Figure: source → encoder → n packets → channel delivers a subset → decoder. The distortion-vs-received-packets curve stays high below k packets and drops sharply at k: the "cliff" effect.]
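
As a concrete illustration (not on the original slide): the single-parity code is a minimal MDS example. The (3, 2, 2) byte-level parity code below recovers both source symbols from any two of the three transmitted symbols.

```python
# Minimal sketch of a (3, 2) MDS erasure code: one XOR parity symbol.
# d = n - k + 1 = 2, so any k = 2 of the n = 3 symbols recover the source.

def encode(s1: int, s2: int) -> list[int]:
    """Encode two source bytes into three channel symbols."""
    return [s1, s2, s1 ^ s2]  # systematic symbols + parity

def decode(received: dict[int, int]) -> tuple[int, int]:
    """Recover (s1, s2) from any two symbols, given as {position: value}."""
    assert len(received) >= 2, "need at least k = 2 symbols"
    if 0 in received and 1 in received:
        return received[0], received[1]
    if 0 in received:                              # have s1 and parity
        return received[0], received[0] ^ received[2]
    return received[1] ^ received[2], received[1]  # have s2 and parity

packets = encode(0xAB, 0xCD)
assert decode({0: packets[0], 2: packets[2]}) == (0xAB, 0xCD)
assert decode({1: packets[1], 2: packets[2]}) == (0xAB, 0xCD)
```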

Fix: use many MDS codes (Albanese et al. '95, Puri-Ramchandran '99)
Example for 3 channels: a successively refinable source-encoded bitstream is split into three layers, protected by (3,1), (3,2), and (3,3) MDS erasure codes respectively, so distortion decreases in steps with each additional received packet.
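
A schematic sketch of this layered packetization (the layer payloads below are hypothetical, chosen only to show the structure): the first layer is repeated on all packets, the second gets one parity packet, the third gets no redundancy.

```python
# Sketch of layered (3,1)/(3,2)/(3,3) MDS protection of a successively
# refinable bitstream. Layer byte budgets are hypothetical.

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def packetize(layer1: bytes, layer2: bytes, layer3: bytes) -> list[bytes]:
    """Any 1 packet recovers layer1, any 2 recover layer2, all 3 recover layer3."""
    # (3,1) repetition: layer1 copied onto every packet.
    part1 = [layer1, layer1, layer1]
    # (3,2) single parity: split layer2 in half, third packet carries the XOR.
    h = len(layer2) // 2
    a, b = layer2[:h], layer2[h:]
    part2 = [a, b, xor_bytes(a, b)]
    # (3,3) no redundancy: split layer3 across the three packets.
    t = len(layer3) // 3
    part3 = [layer3[:t], layer3[t:2 * t], layer3[2 * t:]]
    return [p1 + p2 + p3 for p1, p2, p3 in zip(part1, part2, part3)]

pkts = packetize(b"base", b"middle", b"fine_layer")
print([len(p) for p in pkts])
```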

What is new in our work?
– Symmetric problem, number of descriptions > 2.
– Explore a fundamental connection between MD coding and distributed source coding.
– New rate region for MD: random binning inspired by distributed source coding.
– Constructions for MD: extension of our earlier work (DISCUS) on coset codes for distributed source coding.
Outline of our strategy:
– Look at an MDS erasure code from a different perspective.
– Connect this code to a distributed source coding problem.
– Construct random codes based on this connection: (n,k) source-channel erasure codes.
– New rate region for MD: a concatenation of these (n,k) source-channel erasure codes.

Idea #1: A new look at (n,1,n) MDS codes
– An (n, 1, n) "bit" code: all packets are identical (repetition).
– Reception of any one packet enables reconstruction.
– Reception of more than one packet does not give better quality: the parity bits are wasted.

Idea #1 (contd.): (n,1,n) source-channel erasure code
– Put an independently quantized version of X on every packet.
– Reception of any one packet enables reconstruction.
– Reception of more packets enables better reconstruction (estimation gains due to multiple looks!).
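
A quick numerical illustration of the "multiple looks" gain (ours, not from the talk): model each description as the source plus independent quantization noise; averaging m looks drives the error variance down roughly as 1/m.

```python
# Sketch: estimation gain from combining independently "quantized" looks.
# Each description is modeled as Y_i = X + N_i with independent noise N_i,
# a common idealization of independently dithered quantizers.
import random

random.seed(0)
SIGMA_N = 0.5          # per-description noise std (hypothetical)
TRIALS = 20000

for m in (1, 2, 3):    # number of received packets/looks
    mse = 0.0
    for _ in range(TRIALS):
        x = random.gauss(0.0, 1.0)
        looks = [x + random.gauss(0.0, SIGMA_N) for _ in range(m)]
        x_hat = sum(looks) / m          # simple averaging estimator
        mse += (x - x_hat) ** 2
    print(f"m={m} packets: MSE ≈ {mse / TRIALS:.4f}")  # ~ SIGMA_N**2 / m
```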

Extensions to (n,k) source-channel codes
Can we generalize this to (n,k) source-channel codes? Yes: a random binning (coset code) approach, using the Slepian-Wolf and Wyner-Ziv theorems. Going from an (n,1) code to an (n,k) code is a conceptual leap that uses binning.

Idea #2: Consider a (3,2,2) MDS code
There is inherent uncertainty at the encoder about which packets will be received by the decoder. This calls for a coding strategy where the decoder has access to some information that the encoder does not: distributed source coding.

Background: Distributed source coding (Slepian-Wolf '73, Wyner-Ziv '76, Berger '77)
[Figure: correlated sources X and Y are compressed by separate encoders with no direct communication; a joint decoder reconstructs (X, Y).]
Exploiting correlation without direct communication. Optimal rate region: Slepian-Wolf 1973.

Distributed source coding (contd.)
Rate region (Slepian-Wolf): RX ≥ H(X|Y), RY ≥ H(Y|X), RX + RY ≥ H(X, Y). Achieved by random partitions (bins) of the typical sets.
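
A classical constructive instance of such binning is the syndrome approach later used in DISCUS. The concrete (7,4) Hamming instantiation below is our illustration, not taken from the slides: X is a 7-bit word, the decoder's side information Y differs from X in at most one position, and the encoder sends only the 3-bit syndrome of X instead of all 7 bits.

```python
# Sketch: syndrome-based binning with a (7,4) Hamming code.
# Encoder sends the 3-bit syndrome of X (the index of X's bin/coset);
# decoder recovers X from Y when X and Y differ in at most 1 bit,
# because the code's minimum distance is 3.
import numpy as np

# Parity-check matrix of the (7,4) Hamming code (column j encodes j+1 in binary).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def syndrome(v: np.ndarray) -> np.ndarray:
    return H @ v % 2

def decode(y: np.ndarray, s: np.ndarray) -> np.ndarray:
    """Find the word with syndrome s closest to the side information y."""
    e_syn = (syndrome(y) + s) % 2       # syndrome of the error pattern x ^ y
    if not e_syn.any():
        return y.copy()
    # For this H, the nonzero syndrome read as binary gives the error position.
    pos = int("".join(map(str, e_syn[::-1])), 2) - 1
    x_hat = y.copy()
    x_hat[pos] ^= 1
    return x_hat

x = np.array([1, 0, 1, 1, 0, 0, 1])
y = x.copy(); y[4] ^= 1                  # side info: one bit flipped
assert np.array_equal(decode(y, syndrome(x)), x)  # 3 bits sent instead of 7
```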

Idea #2 (contd.): Are there any telltale signs of symmetric overcomplete partitioning in (3,2,2) MDS codes?
[Figure: the three packets of the (3,2,2) code, each carrying one 0/1 symbol.]

Idea #2 (contd.): Instead of a single codebook, build 3 different codebooks (quantizers) and then partition each of them (overcompletely).
[Figure: distortion vs. number of received packets (0, 1, …, k, …, n).]

Problem Formulation
[Figure: X is fed to n separate encoders; their n packets cross a packet erasure channel to a single decoder, which outputs X̂. This is the (n,k) source-channel erasure code.]
– The decoder starts reconstruction once m ≥ k packets are received.
– The transmission rate of every packet is the same.
– Distortion is a function only of the number of received packets.
– Symmetric formulation, n > 2.

Problem Formulation: Notation
Source X ~ q(x) with alphabet 𝒳, blocklength L; bounded distortion measure d(·,·). Encoders: f_i : 𝒳^L → {1, 2, …, 2^(LR)}, one per packet. Decoders: one function g_S for every subset S of received packets. Distortion with h packets: D_h = E[(1/L) Σ_{l=1}^{L} d(X_l, X̂_l)], the same for every S with |S| = h by symmetry.

Problem Statement (contd.)
What is the best distortion tuple (D_k, D_{k+1}, …, D_n) for a rate of R bits/sample/packet?
Main result: such a tuple is achievable if there exist a p.m.f. p(y_1, …, y_n | x) and a set of reconstruction functions satisfying suitable rate and distortion conditions; the rate condition is illustrated by the (3,2) code on the next slide.

Example: (3,2) Code
(3,2) code: the Y_i have the same p.d.f.
– 3 codebooks, each of rate I(X; Y_i), are constructed randomly.
– Each is partitioned into 2^(LR) bins, and the number of codewords in a bin is exponential in (L/2)·I(Y_1; Y_2).
– Thus 2R = I(X; Y_1) + I(X; Y_2) − I(Y_1; Y_2).
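
The bin-size choice can be checked with a standard counting argument (our sketch, not on the slide): the decoder receives two bin indices and must find the unique jointly typical pair of codewords inside them.

```latex
% Codewords per bin: 2^{L(I(X;Y_i)-R)}. Joint decoding over the product of
% two received bins succeeds (w.h.p.) if the expected number of spurious
% jointly typical pairs vanishes:
\begin{aligned}
2^{L\left(I(X;Y_1)-R\right)} \cdot 2^{L\left(I(X;Y_2)-R\right)}
  &\le 2^{L\, I(Y_1;Y_2)} \\
\Longrightarrow\quad I(X;Y_1) + I(X;Y_2) - 2R &\le I(Y_1;Y_2) \\
\Longrightarrow\quad 2R &\ge I(X;Y_1) + I(X;Y_2) - I(Y_1;Y_2).
\end{aligned}
```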

Example of a Gaussian source: (3,2,2) code at 1 bit/sample/packet.
[Figure: the resulting distortion as a function of the number of received packets.]
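
A numerical sketch of what such a Gaussian (3,2) design looks like, under the common modeling assumption (ours, for illustration) that Y_i = X + N_i with X ~ N(0,1) and i.i.d. Gaussian N_i: we solve 2R = I(X;Y_1) + I(X;Y_2) − I(Y_1;Y_2) for the noise variance at R = 1 bit/sample/packet, then report the MMSE distortions with two and three received packets.

```python
# Sketch: Gaussian source with a (3,2) source-channel erasure code at
# R = 1 bit/sample/packet. Model (illustrative): X ~ N(0,1), Y_i = X + N_i,
# N_i ~ N(0, s2) i.i.d. across descriptions.
import math

def rate_per_packet(s2: float) -> float:
    """R from 2R = I(X;Y1) + I(X;Y2) - I(Y1;Y2), in bits/sample/packet."""
    i_xy = 0.5 * math.log2(1.0 + 1.0 / s2)        # I(X;Y_i)
    rho = 1.0 / (1.0 + s2)                        # correlation of Y1, Y2
    i_yy = -0.5 * math.log2(1.0 - rho * rho)      # I(Y1;Y2)
    return i_xy - 0.5 * i_yy

# Bisect for the noise variance that meets R = 1 (rate decreases in s2).
lo, hi = 1e-9, 100.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if rate_per_packet(mid) > 1.0 else (lo, mid)
s2 = 0.5 * (lo + hi)

# With k = 2 the decoder needs two packets; a third packet is an extra look.
d2 = s2 / (2.0 + s2)    # Var(X | Y1, Y2): any two received packets
d3 = s2 / (3.0 + s2)    # Var(X | Y1, Y2, Y3): all three packets
print(f"s2 = {s2:.4f}, D(2 packets) = {d2:.4f}, D(3 packets) = {d3:.4f}")
```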

Idea #3: n-channel symmetric MD as a concatenation of (n,1), (n,2), …, (n,n) source-channel erasure codes
[Figure, for n = 3: packet j carries Y_1j from the (3,1) code, Y_2j from the (3,2) code, and a share of Y_3 from the (3,3) code.]
– Base layer: reception of any one packet allows decoding of the (3,1) code.
– Middle layer: the (3,2) code; the decoder's side information includes some one part of the middle layer (these are source-channel erasure codes!) and some two parts of the base layer.
– Final layer: the (3,3) code refines everything.
Every part of the bitstream contributes to the source reconstruction.
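
A schematic of the packet layout for n = 3 (our illustration; the payload names and sizes are hypothetical):

```python
# Sketch: assembling 3 MD packets from three layers of source-channel codes.
# Y1[j]: j-th description of the (3,1) base-layer code (decodable alone).
# Y2[j]: j-th binned description of the (3,2) middle-layer code.
# Y3[j]: j-th third of the (3,3) final-layer description.

def assemble_packets(Y1: list[bytes], Y2: list[bytes], Y3: list[bytes]) -> list[bytes]:
    """Packet j = base description j || middle description j || final share j."""
    return [Y1[j] + Y2[j] + Y3[j] for j in range(3)]

pkts = assemble_packets([b"b0", b"b1", b"b2"],
                        [b"m0", b"m1", b"m2"],
                        [b"f0", b"f1", b"f2"])
# Any 1 packet -> decode base layer; any 2 -> also middle; all 3 -> everything.
print(pkts)
```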

Key Concepts:
– Multiple quantizers which can introduce correlated quantization noise: MD lattice VQ (Vaishampayan, Sloane, Diggavi '01).
– Computationally efficient multiple binning schemes: symmetric distributed source coding using coset codes (Pradhan-Ramchandran '00; Schonberg, Pradhan, Ramchandran '03).
Note: different from single binning schemes (Zamir-Shamai '98, Pradhan-Ramchandran '99).

A (3,2) Source-Channel Lattice Code
Y1, Y2, Y3 are correlated quantized versions of the source X, with d(Y_i, Y_j) ≤ 2.

A (3,2) Source-Channel Lattice Code
A code of distance 5 overcomes correlation noise of magnitude 2.

A (3,2) Source-Channel Lattice Code
Partitioning through cosets: the constructive counterpart of "random bins".

A (3,2) Source-Channel Lattice Code
Suppose there are 2 observations Y1 and Y2. Asymmetric case: Y2 is available at the decoder. A code that combats the correlation noise ensures decoding.
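
A scalar sketch of this asymmetric coset decoding (our illustration with integer quantizer indices; the code 5ℤ has minimum distance 5, matching the distance-5 / noise-2 numbers above):

```python
# Sketch: asymmetric coset decoding with the scalar code 5Z (distance 5).
# Y1, Y2 are integer quantizer indices with |Y1 - Y2| <= 2 (correlation noise).
# Encoder 1 sends only the coset of Y1 modulo 5; the decoder, knowing Y2,
# picks the unique member of that coset within distance 2 of Y2.

def encode(y1: int) -> int:
    return y1 % 5                       # coset index in {0, ..., 4}

def decode(coset: int, y2: int) -> int:
    """Unique y with y % 5 == coset and |y - y2| <= 2."""
    base = y2 - ((y2 - coset) % 5)      # largest coset member <= y2
    return base if y2 - base <= 2 else base + 5

y1, y2 = 37, 39                         # |37 - 39| = 2
assert decode(encode(y1), y2) == y1
```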

A (3,2) Source-Channel Lattice Code
Suppose there are 2 observations Y1 and Y2. Symmetric case: split the generator vectors of the code; encoder 1 gets the rows, encoder 2 gets the columns.

A (3,2) Source-Channel Lattice Code
Y1, Y2, Y3 are independently quantized versions of the source X, with d(Y_i, Y_j) ≤ 2.

A (3,2) Source-Channel Lattice Code
Find 3 generator vectors such that any two of them generate the code; encoder 1 gets the rows, encoder 2 gets the columns, encoder 3 gets the diagonal.

Constructions for general n and k
– Choose a code (generator matrix G) that combats the correlation noise, and split the rows of G into k submatrices (k generator sets S1, …, Sk); e.g., G1 = [5 0] and G2 = [0 5].
– Need a way to generate n generator sets out of the k such that any k of them are equivalent to G.
– Choose a generator matrix M (dimension k × n) of an (n,k) MDS block code; it has the property that any k of its columns are linearly independent. E.g., for (n, k) = (3, 2), M = [1 0 1; 0 1 1].

Constructions for general n and k (contd.)
– Using the weights from the n columns of M, one column at a time, linearly combine the k generator sets (S1, …, Sk) to come up with n encoding matrices; e.g., G1 = [5 0], G2 = [0 5], G3 = [5 5].
– Efficient algorithms for encoding and decoding using the coset code framework (Forney 1991).
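
A small sketch of this matrix construction (the MDS generator M below is a hypothetical choice consistent with the G3 = [5 5] example, since the slide's matrix did not survive transcription):

```python
# Sketch: build n encoding matrices from k generator sets using the columns
# of an MDS generator matrix M (any k columns of M are linearly independent).
import numpy as np

G1 = np.array([[5, 0]])          # generator set S1 (rows of G)
G2 = np.array([[0, 5]])          # generator set S2 (columns of G)
S = [G1, G2]                     # k = 2 generator sets

# Hypothetical generator of a (3,2) MDS code: any 2 columns are independent.
M = np.array([[1, 0, 1],
              [0, 1, 1]])

# Encoding matrix for packet j: sum_i M[i, j] * S_i.
encoders = [sum(int(M[i, j]) * S[i] for i in range(len(S)))
            for j in range(M.shape[1])]
for j, Gj in enumerate(encoders, start=1):
    print(f"G{j} =", Gj.ravel())   # G1 = [5 0], G2 = [0 5], G3 = [5 5]
```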

Conclusions
– New rate region for the n-channel MD problem.
– A new connection between the MD problem and the distributed source coding problem.
– A new application of multiple binning schemes.
– Constructions based on coset codes.
– A nice synergy between quantization and MDS erasure codes.