Jayanth Nayak, Ertem Tuncel, Member, IEEE, and Deniz Gündüz, Member, IEEE.

Presentation transcript:

Jayanth Nayak, Ertem Tuncel, Member, IEEE, and Deniz Gündüz, Member, IEEE

DPC: dirty paper coding; CSI: channel state information; CL: common layer; RL: refinement layer; LDS: layered description scheme.

Outline: Introduction; Background and notation; A basic WZBC scheme; A layered WZBC scheme; Source coding rates for LDS; Performance analysis for the quadratic Gaussian problem; Performance analysis for the binary Hamming problem; Conclusions and future work.

We study the communication scenario in which one of the sensors is required to transmit its measurements to the other nodes over a broadcast channel. The receiver nodes are themselves equipped with side information unavailable to the sender.

Lossless transmission (in the Shannon sense) is possible with $\kappa$ channel uses per source symbol if and only if there exists a channel input distribution $p(w)$ such that $H(X \mid Y_k) \le \kappa\, I(W; Z_k)$ for every receiver $k$. (1)
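
As a quick sanity check of (1) (our illustration, not from the paper), suppose the source is Ber(1/2), receiver $k$ observes the source through a BSC($p_k$) as side information, and the broadcast channel branch to receiver $k$ is a BSC($q_k$); then (1) reduces to $h(p_k) \le \kappa(1 - h(q_k))$. A minimal Python sketch with made-up parameter values:

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical setup: Ber(1/2) source, side info via BSC(p_k),
# broadcast channel branch to receiver k is a BSC(q_k).
# Condition (1) becomes H(X|Y_k) = h(p_k) <= kappa * (1 - h(q_k)).
kappa = 2.0                              # channel uses per source symbol (assumed)
receivers = [(0.1, 0.05), (0.2, 0.11)]   # (p_k, q_k), made-up values

for k, (p_k, q_k) in enumerate(receivers, 1):
    lhs = h(p_k)                  # H(X | Y_k)
    rhs = kappa * (1 - h(q_k))    # kappa * I(W; Z_k) at capacity
    print(f"receiver {k}: {lhs:.3f} <= {rhs:.3f} ? {lhs <= rhs}")
```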

The optimal coding scheme is not separable in the classical sense, but consists of separate components that perform source and channel coding in a broader sense. This results in the separation of source and channel variables as in (1).

If the broadcast channel is such that the same input distribution achieves capacity for all individual channels, then (1) implies that one can utilize all channels at full capacity. Binary symmetric channels and Gaussian channels are widely known examples of this phenomenon.
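
This can be checked numerically for binary symmetric channels (our own brute-force sketch, not from the paper): sweeping the input distribution shows that the uniform input maximizes the mutual information for every crossover probability simultaneously.

```python
import math

def h(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_info(a, q):
    """I(input; output) of a BSC(q) when the input is Ber(a)."""
    # Output is Ber(a*(1-q) + (1-a)*q); I = H(output) - H(output|input).
    out = a * (1 - q) + (1 - a) * q
    return h(out) - h(q)

for q in (0.05, 0.11, 0.2):  # different crossover probabilities (made up)
    best_a = max((i / 1000 for i in range(1001)),
                 key=lambda a: bsc_mutual_info(a, q))
    print(f"BSC({q}): maximizing input Ber({best_a:.3f}), "
          f"capacity {bsc_mutual_info(best_a, q):.3f} bits")
```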

The optimal coding scheme does not explicitly involve binning.

A binary symmetric channel (BSC) is a common communications channel model used in coding theory and information theory. In this model, a transmitter wishes to send a bit (a zero or a one), and the receiver receives a bit. It is assumed that the bit is usually transmitted correctly, but that it will be "flipped" with a small probability (the "crossover probability"). This channel is used frequently in information theory because it is one of the simplest channels to analyze.
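
Since the slide describes the model operationally, here is a minimal simulation sketch (illustrative, not from the paper):

```python
import random

def bsc(bits, crossover_p, rng=random):
    """Transmit a bit sequence over a binary symmetric channel:
    each bit is flipped independently with the crossover probability."""
    return [b ^ (rng.random() < crossover_p) for b in bits]

random.seed(0)
sent = [random.randint(0, 1) for _ in range(10_000)]
received = bsc(sent, crossover_p=0.1)
empirical = sum(s != r for s, r in zip(sent, received)) / len(sent)
print(f"empirical flip rate: {empirical:.3f}")  # close to 0.1
```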

In this paper, we consider the general lossy coding problem in which the reconstruction of the source at the receivers need not be perfect. We shall refer to this problem setup as Wyner–Ziv coding over broadcast channels (WZBC).

We present a coding scheme for this scenario and analyze its performance in the quadratic Gaussian and binary Hamming cases.

In telecommunications, dirty paper coding (DPC) is a technique for efficient transmission of digital data through a channel that is subject to some interference known to the transmitter. The technique consists of precoding the data so as to cancel the effect of the interference.
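
Full dirty paper coding relies on random binning or lattice constructions; the following toy scalar modulo precoder (a Tomlinson–Harashima-style sketch of ours, not the paper's scheme) only illustrates the core idea that interference known at the transmitter can be pre-subtracted without boosting transmit power:

```python
import random

DELTA = 8.0  # modulo interval width (arbitrary choice for this toy example)

def mod_centered(v, delta=DELTA):
    """Reduce v into the centered interval [-delta/2, delta/2)."""
    return (v + delta / 2) % delta - delta / 2

def precode(message, interference):
    """Transmitter knows the interference and pre-subtracts it (mod Delta),
    so the channel input stays bounded regardless of interference strength."""
    return mod_centered(message - interference)

random.seed(1)
m = 1.5                              # message point in [-DELTA/2, DELTA/2)
s = random.uniform(-100, 100)        # strong interference known at the tx
x = precode(m, s)                    # channel input, |x| <= DELTA/2
y = x + s + random.gauss(0, 0.05)    # channel adds interference and noise
m_hat = mod_centered(y)              # receiver needs no knowledge of s
print(f"sent {m}, recovered {m_hat:.3f}")
```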

This paper addresses lossy transmission of a common source over a broadcast channel when there is correlated side information at the receivers.

Outline: Introduction; Background and notation; A basic WZBC scheme; A layered WZBC scheme; Source coding rates for LDS; Performance analysis for the quadratic Gaussian problem; Performance analysis for the binary Hamming problem; Conclusions and future work.

Let $(X, Y_1, \ldots, Y_K)$ be random variables denoting a source with independent and identically distributed (i.i.d.) realizations. Source $X$ is to be transmitted over a memoryless broadcast channel defined by $P_{Z_1 \cdots Z_K \mid W}$. Decoder $k$ has access to side information $Y_k$ in addition to the channel output $Z_k$.

Distortion at receiver $k$ is measured by a single-letter distortion measure $d_k(x, \hat{x}_k)$.

Definition 1: An $(n, m)$ code consists of an encoder $f : \mathcal{X}^n \to \mathcal{W}^m$ and decoders $g_k : \mathcal{Z}_k^m \times \mathcal{Y}_k^n \to \hat{\mathcal{X}}_k^n$ at each receiver $k = 1, \ldots, K$. The rate of the code is $\kappa = m/n$ channel uses per source symbol.

A distortion tuple $(D_1, \ldots, D_K)$ is said to be achievable at a rational rate $\kappa$ if for every $\varepsilon > 0$ there exists $n_0$ such that for all integers $n \ge n_0$ with $m = \kappa n$ an integer, there exists an $(n, m)$ code satisfying $\mathbb{E}\big[d_k(X^n, \hat{X}_k^n)\big] \le D_k + \varepsilon$ for all $k$, where $d_k$ is extended to blocks by averaging.

In this paper, we present some general WZBC techniques and derive the corresponding achievable distortion regions. We study the performance of these techniques for two cases: the quadratic Gaussian problem and the binary Hamming problem.

Consider first the point-to-point case $K = 1$. Since there is only one receiver, we shall drop the subscripts that relate to the receiver. The Wyner–Ziv rate-distortion function is characterized as $R^{WZ}(D) = \min_U \big[ I(X; U) - I(Y; U) \big]$, where $U$ is an auxiliary random variable satisfying the Markov chain $U \to X \to Y$ and allowing a reconstruction $\hat{X}(U, Y)$ with $\mathbb{E}[d(X, \hat{X})] \le D$, and the capacity of the channel is well known to be $C = \max_{p(w)} I(W; Z)$.

It is then straightforward to conclude that combining separate source and channel codes yields the distortion $D = D^{WZ}(\kappa C)$, where $D^{WZ}(\cdot)$ denotes the distortion-rate counterpart of $R^{WZ}(\cdot)$. (3) On the other hand, a converse result in [15] shows that even by using joint source-channel codes, one cannot improve the distortion performance beyond (3). We are further interested in the evaluation of $D^{WZ}(\kappa C)$, as well as in the test channels achieving it, for the quadratic Gaussian and binary Hamming cases.

It was shown in [22] that the optimal backward test channel is given by $X = U + V$, where $U$ and $V$ are independent Gaussians. For the rate we have $R^{WZ}(D) = \frac{1}{2} \log \frac{\sigma^2_{X|Y}}{D}$, where $\sigma^2_{X|Y}$ is the conditional variance of $X$ given $Y$. The optimal reconstruction is a linear estimate of $X$ from $(U, Y)$,

which yields the distortion $D^{WZ}(R) = \sigma^2_{X|Y}\, 2^{-2R}$, and therefore $D^{WZ}(\kappa C) = \sigma^2_{X|Y}\, 2^{-2\kappa C}$.
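
A worked numeric example of ours, assuming $Y = X + N$ with independent Gaussian $N$ and an AWGN channel so that $C = \frac{1}{2}\log_2(1 + \mathrm{SNR})$; all parameter values are made up:

```python
import math

# Assumed parameters (illustrative only)
sigma_x2 = 1.0     # source variance
sigma_n2 = 0.25    # side-information noise variance, Y = X + N
snr = 10.0         # channel signal-to-noise ratio
kappa = 1.0        # channel uses per source symbol

sigma_x_given_y2 = sigma_x2 * sigma_n2 / (sigma_x2 + sigma_n2)
C = 0.5 * math.log2(1 + snr)                  # AWGN capacity, bits/use
D = sigma_x_given_y2 * 2 ** (-2 * kappa * C)  # D^WZ(kappa * C)
print(f"sigma_X|Y^2 = {sigma_x_given_y2:.3f}, C = {C:.3f}, D = {D:.4f}")
```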

It was implicitly shown in [23] that the optimal auxiliary random variable is given by $U = E(X \oplus Q)$, where $X$, $Q$, and the erasure are all independent, the side information is $Y = X \oplus N$ with $N \sim \mathrm{Ber}(p)$, $Q \sim \mathrm{Ber}(\beta)$ with $\beta \in [0, 1/2]$, and $E$ is an erasure operator, i.e., $E(V) = V$ with probability $1 - \theta$ and $E(V) = \mathrm{e}$ (an erasure) with probability $\theta$. This choice results in $R^{WZ}(D) = (1 - \theta)\,[h(\beta \star p) - h(\beta)]$ with $D = (1 - \theta)\beta + \theta p$,

where $\star$ denotes the binary convolution, i.e., $\alpha \star \beta = \alpha(1 - \beta) + (1 - \alpha)\beta$, and $h(\cdot)$ denotes the binary entropy function, i.e., $h(\alpha) = -\alpha \log \alpha - (1 - \alpha) \log(1 - \alpha)$.
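
These definitions translate directly into code; the following sketch of ours evaluates the rate and distortion of the erasure-based test channel above for made-up parameters:

```python
import math

def h(a):
    """Binary entropy function in bits."""
    if a in (0.0, 1.0):
        return 0.0
    return -a * math.log2(a) - (1 - a) * math.log2(1 - a)

def conv(a, b):
    """Binary convolution: probability that Ber(a) XOR Ber(b) equals 1."""
    return a * (1 - b) + (1 - a) * b

def wz_rate(beta, theta, p):
    """Rate of the erasure-based test channel: (1-theta)(h(beta*p) - h(beta))."""
    return (1 - theta) * (h(conv(beta, p)) - h(beta))

def wz_distortion(beta, theta, p):
    """Hamming distortion: beta when U is not erased, p when it is."""
    return (1 - theta) * beta + theta * p

p = 0.25                 # side-information crossover probability (made up)
beta, theta = 0.1, 0.3   # test-channel parameters (made up)
print(f"R = {wz_rate(beta, theta, p):.3f} bits, "
      f"D = {wz_distortion(beta, theta, p):.3f}")
```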

At each terminal, no WZBC scheme can achieve a distortion less than the minimum distortion achievable by ignoring the other terminals. Thus $D_k \ge D_k^{WZ}(\kappa C_k)$ for every receiver $k$, where $C_k$ is the capacity of the channel to receiver $k$.
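
For instance, in the quadratic Gaussian setting of the previous slides, the per-receiver bound can be evaluated as follows (our illustration, with assumed side-information and channel parameters):

```python
import math

def wz_gaussian_distortion(sigma_x2, sigma_n2, kappa, C):
    """Single-receiver Wyner-Ziv distortion D^WZ(kappa * C)."""
    sigma_cond2 = sigma_x2 * sigma_n2 / (sigma_x2 + sigma_n2)
    return sigma_cond2 * 2 ** (-2 * kappa * C)

kappa = 1.0
# (side-info noise variance, channel capacity in bits/use), made-up values
receivers = [(0.25, 1.73), (1.0, 0.5)]
for k, (sn2, C_k) in enumerate(receivers, 1):
    bound = wz_gaussian_distortion(1.0, sn2, kappa, C_k)
    print(f"receiver {k}: D_{k} >= {bound:.4f}")
```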

There is considerable simplification in the quadratic Gaussian and binary Hamming cases, since the channel and the side information are degraded in both cases: we can assume that one of the two Markov chains $W \to Z_1 \to Z_2$ or $W \to Z_2 \to Z_1$ holds (for arbitrary channel input $W$) for the channel, and that $X \to Y_1 \to Y_2$ or $X \to Y_2 \to Y_1$ holds for the source.

Outline: Introduction; Background and notation; A basic WZBC scheme; A layered WZBC scheme; Source coding rates for LDS; Performance analysis for the quadratic Gaussian problem; Performance analysis for the binary Hamming problem; Conclusions and future work.


Since the side information and channel characteristics at the two receiving terminals can be very different, we might be able to improve the performance by layered coding, i.e., by not only transmitting a common layer (CL) to both receivers but also transmitting a refinement layer (RL) to one of the two receivers. Since there are two receivers, we focus on coding with only two layers: intuitively, additional layers targeted at the same receiver can only degrade the performance.

Subscript c denotes CL quantities and subscript r denotes RL quantities. In this scheme, illustrated in Fig. 2, the CL is coded using CDS (the basic WZBC scheme) with DPC, with the RL codeword acting as CSI.

Outline: Introduction; Background and notation; A basic WZBC scheme; A layered WZBC scheme; Source coding rates for LDS; Performance analysis for the quadratic Gaussian problem; Performance analysis for the binary Hamming problem; Conclusions and future work.