Information theory: Multi-user information theory
A.J. Han Vinck
Essen, 2004

Content:
Some examples of channels
Additive coding for broadcasting
Superposition coding for multi-access
Coding for the two-way channel
Coding for the switching channel
Some more

Goal of the lectures:
Introduction of some classical models: two-way, two-access, broadcast
Problems connected with them: calculation and formulation of capacity
Development of coding strategies

Time Sharing (TDMA): users 1, 2, 3 take turns on a common channel (message slots alternate with idle slots).
Time sharing is easy to organize, but inefficient if not many users are active; the efficiency depends on the channel.

Two-way channel: X1 and X2 communicate with each other by observing Y1 and Y2.
R1 = I(X1;Y2|X2)
R2 = I(X2;Y1|X1)
Maximize (R1, R2) over any input distribution P(X1,X2), where
I(X1;Y2|X2) := H(X1|X2) - H(X1|X2,Y2)

Note: I(X1;Y2|X2) := H(X1|X2) - H(X1|X2,Y2)
H(X1|X2) = the minimum average number of bits needed to specify X1 given X2
H(X1|X2,Y2) = the minimum average number of bits needed to specify X1 given X2 and the observation Y2
The difference is what we learned from the transmission over the channel: the reduction in the average specification length of X1 given X2.

Example: the AND channel, Y = X1 AND X2, observed by both terminals:

          X2 = 0   X2 = 1
X1 = 0       0        0
X1 = 1       0        1

When X1 = 0, terminal 1 does not learn X2; when X1 = 1, it knows X2. The same holds for terminal 2.
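As a quick numerical check (a minimal sketch of my own, not from the slides; the helper H and the joint-pmf layout are illustrative), with independent uniform inputs the conditional mutual information I(X1;Y|X2) of this channel evaluates to 0.5 bit:

```python
import itertools
from math import log2

# Joint pmf of (X1, X2, Y) for the AND channel with independent,
# uniform inputs; both terminals observe Y = X1 AND X2.
p = {}
for x1, x2 in itertools.product([0, 1], repeat=2):
    p[(x1, x2, x1 & x2)] = 0.25

def H(joint, idx):
    """Entropy (in bits) of the marginal over the coordinates listed in idx."""
    marg = {}
    for outcome, prob in joint.items():
        key = tuple(outcome[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + prob
    return -sum(q * log2(q) for q in marg.values() if q > 0)

# I(X1;Y|X2) = H(X1,X2) + H(X2,Y) - H(X2) - H(X1,X2,Y); coordinates: X1=0, X2=1, Y=2
i_cond = H(p, (0, 1)) + H(p, (1, 2)) - H(p, (1,)) - H(p, (0, 1, 2))
print(i_cond)   # 0.5 bit: what terminal 2 learns about X1 per transmission
```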

A coding example (two-way AND channel):
Both users transmit their information bit; if y = 0, both transmit the inverse in a second transmission. After that, the inputs are known at both sides.
Rate per user: 1/(2·3/4 + 1·1/4) = 4/7 ≈ 0.57, so the sum rate 8/7 > 1 (time sharing achieves at most 1).
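A small simulation of this protocol may help; it is my own sketch (the function names and structure are not from the slides). It walks through all four bit pairs, checks that both users recover each other's bit, and confirms the average cost of 7/4 channel uses per pair:

```python
from fractions import Fraction

def user_decode(own, y1, y2):
    """Recover the other user's bit from one's own bit and the observed outputs."""
    if y1 == 1:
        return 1          # AND = 1 only if both bits are 1
    if own == 1:
        return 0          # own bit 1 and AND = 0 forces the other bit to 0
    return 1 - y2         # own bit 0: in slot 2 this user sends 1, so y2 = 1 - other

avg_slots = Fraction(0)
for b1 in (0, 1):
    for b2 in (0, 1):
        y1 = b1 & b2                                        # slot 1: both send their bit
        slots = 1 if y1 == 1 else 2
        y2 = (1 - b1) & (1 - b2) if slots == 2 else None    # slot 2: both send the inverse
        assert user_decode(b1, y1, y2) == b2
        assert user_decode(b2, y1, y2) == b1
        avg_slots += Fraction(slots, 4)                     # four equally likely bit pairs

print(avg_slots)                   # 7/4 channel uses per bit pair
print(Fraction(1) / avg_slots)     # 4/7 bit per channel use for each user
```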

Another coding example (each user sends one of 3 messages):
Rate per user: log2(3)/(2·3/9 + 3·6/9) ≈ 0.59, so again the sum rate exceeds 1.

Dependent inputs X1 and X2:
P(X1=0, X2=0) = 0, P(X1=0, X2=1) = P(X1=1, X2=0) = p, P(X1=1, X2=1) = 1-2p.
Then P(X1=1) = P(X1=1, X2=0) + P(X1=1, X2=1) = 1-p.
R2 = R1 = I(X2;Y|X1) = I(X1;Y|X2) = H(Y|X1) = (1-p) h(p/(1-p)).
The maximum over p gives the (Shannon) outer bound, about 0.69 bit per user; see the sketch below.
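Numerically (my own sketch; the function names are illustrative), maximizing (1-p)·h(p/(1-p)) over p gives roughly 0.694 bit per user:

```python
from math import log2

def h(x):
    """Binary entropy function in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

def outer_rate(p):
    # R1 = R2 = H(Y|X1) = (1 - p) * h(p / (1 - p)) for the dependent-input distribution
    return (1 - p) * h(p / (1 - p))

best_p, best_r = max(((p / 10000, outer_rate(p / 10000)) for p in range(1, 5000)),
                     key=lambda t: t[1])
print(best_p, best_r)   # roughly p = 0.28, rate = 0.694 bit per user (Shannon outer bound)
```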

Note:
P(Y=0|X1=1) = P(Y=0, X1=1)/P(X1=1) = p/(1-p)
P(Y=1|X1=1) = P(Y=1, X1=1)/P(X1=1) = (1-2p)/(1-p)

A lower bound: let X1 and X2 transmit independently with
P(X1 = 1) = 1 - P(X1 = 0) = a and P(X2 = 1) = 1 - P(X2 = 0) = a.
Then R1 = I(X1;Y|X2) = H(Y|X2) - H(Y|X1,X2) = a·h(a) = R2.
The maximum, about 0.62, is > 4/7.
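The same kind of numerical sketch (again my own code, not from the slides) for the inner bound a·h(a):

```python
from math import log2

def h(x):
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

# Inner bound with independent inputs P(X1=1) = P(X2=1) = a: R1 = R2 = a * h(a)
best_a, best_r = max(((a / 10000, (a / 10000) * h(a / 10000)) for a in range(1, 10000)),
                     key=lambda t: t[1])
print(best_a, best_r, best_r > 4 / 7)   # roughly a = 0.70, rate = 0.617, which exceeds 4/7
```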

The upper (outer) bound: the inner bound (for independent transmission) is strictly below the Shannon outer bound (X1 and X2 dependent). For R1 = R2, the inner rate is smaller than the outer rate, and the exact capacity is unknown!

[Figure: the (R1, R2) rate region for the two-way AND channel, showing the inner and outer bounds between 0 and 1 on both axes.]

Broadcast: Z transmits information to X and the same information to Y.
R1 ≤ I(X;Z)
R2 ≤ I(Y;Z)
R1 + R2 ≤ I(Z;(X,Y)) = I(Z;X)

Broadcast: Z transmits information to X and different information to Y.
R1 ≤ I(X;Z)
R2 ≤ I(Y;Z)
R1 + R2 ≤ I(Z;(X,Y)) = I(Z;X) + I(Z;Y|X)

Example: the Blackwell broadcast channel, a deterministic channel with ternary input Z and binary outputs X and Y.
R1 ≤ I(X;Z) = H(X) - H(X|Z)
R2 ≤ I(Y;Z) = H(Y) - H(Y|Z)
R1 + R2 ≤ I(Z;(X,Y)) ≤ log2(3)

Example (over two transmissions): I(Y;Z) = 1 and I(X;Z) = log2(3), so
R_sum = (1 + log2 3)/2 ≈ 1.29 bit/transmission.

Two-access channel: X1 and X2 want to communicate with Y at the same time!
Obvious bound on the sum rate: R1 + R2 ≤ H(Y) - H(Y|X1,X2) ≤ H(Y)

Two-access models: the switching channel and the two-adder channel.
Switching channel: one input acts as a switch; when it blocks, the output is Δ, otherwise the output equals the other input.
Two-adder channel: y = x1 + x2 ∈ {0, 1, 2}.

Two-adder (1): capacity region.
R1 ≤ 1 (from X1 to Y), R2 ≤ 1 (from X2 to Y), R1 + R2 ≤ H(Y) (at most 1.5 bit with independent inputs).
[Figure: the pentagon-shaped rate region, with the time-sharing line between the corner points.]
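A quick sanity check of the sum-rate bound (my own sketch, not from the slides): with independent, uniform inputs the output distribution of Y = X1 + X2 is (1/4, 1/2, 1/4), so H(Y) = 1.5 bit.

```python
from math import log2
from itertools import product

# Output distribution of Y = X1 + X2 with independent, uniform binary inputs
py = {}
for x1, x2 in product([0, 1], repeat=2):
    py[x1 + x2] = py.get(x1 + x2, 0.0) + 0.25

H_Y = -sum(p * log2(p) for p in py.values())
print(py)    # {0: 0.25, 1: 0.5, 2: 0.25}
print(H_Y)   # 1.5 -> sum-rate bound R1 + R2 <= 1.5
```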

Two-adder (2): a coding strategy (ε-error).
User 1: transmit uncoded at rate R1 = 1 bit, i.e. P(X1 = 0) = 1/2.
User 2: then sees an erasure channel (the output y = 1 is ambiguous), with max I(X2;Y) = H(X2) - H(X2|Y) = 1/2.
Hence the rate pair (R1, R2) = (1, 1/2) is achievable.
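The erasure-channel view can be checked numerically (my own sketch, assuming X1 is i.i.d. uniform and treated as noise while X2 is decoded):

```python
from math import log2
from itertools import product

# Channel seen by user 2 when X1 is i.i.d. uniform noise: P(y | x2) for Y = X1 + X2
p_y_given_x2 = {0: {}, 1: {}}
for x1, x2 in product([0, 1], repeat=2):
    y = x1 + x2
    p_y_given_x2[x2][y] = p_y_given_x2[x2].get(y, 0.0) + 0.5

print(p_y_given_x2)   # {0: {0: 0.5, 1: 0.5}, 1: {1: 0.5, 2: 0.5}}
# y = 0 reveals X2 = 0, y = 2 reveals X2 = 1, y = 1 is an erasure (probability 1/2).

# I(X2;Y) with uniform X2: H(Y) - H(Y|X2) = 1.5 - 1.0 = 0.5 bit
py = {}
for x2 in (0, 1):
    for y, p in p_y_given_x2[x2].items():
        py[y] = py.get(y, 0.0) + 0.5 * p
H_Y = -sum(p * log2(p) for p in py.values())
print(H_Y - 1.0)      # 0.5: the capacity of a binary erasure channel with erasure prob 1/2
```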

Two-adder (3): a simple asymmetric strategy (zero-error), with codes of block length 3.
Efficiency = (1/3) log2(14) ≈ 1.27 bits/transmission.

Two-adder with feedback (1). Question: can we enlarge the capacity region with feedback? Yes (Wolf)! Is this a surprise?
Capacity region characterization: by Cover-Leung, modified by Willems.

Two-adder with feedback (2): first send N independent transmissions from X1 and X2; the remaining uncertainty at the receiver is then resolved in a second phase, in which (thanks to feedback) the two senders jointly select the channel inputs.
Total efficiency: about 1.52 bits/transmission!
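The 1.52 figure is consistent with the following back-of-the-envelope count; this reconstruction is mine and is not spelled out on the slide. After N uncoded pairs, about N/2 positions are ambiguous (output 1), both senders know the missing bits thanks to feedback, and they resolve them cooperatively at log2(3) bits per use of the adder channel:

```python
from math import log2

N = 1.0                       # normalized length of the first (uncoded) phase
info_bits = 2 * N             # each phase-1 use carries one bit from each user
residual = N / 2              # expected uncertainty left at the receiver (the Y = 1 positions)
phase2 = residual / log2(3)   # cooperating senders reach log2(3) bits per adder-channel use

print(info_bits / (N + phase2))   # about 1.52 bits per transmission
```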

Switching channel (1): X1 ∈ {0,1}, X2 ∈ {0,1}, Y ∈ {Δ, 0, 1} (tri-state logic).
Y = X1 when X2 = 1, and Y = Δ when X2 = 0.
With P(X2 = 1) = a: P(Δ, pass info) = (1-a, a), and
R_sum(max) = a + h(a) ≤ log2(3).
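A numerical check (my own sketch, not from the slides): a + h(a) peaks at a = 2/3, where it equals log2(3) ≈ 1.585.

```python
from math import log2

def h(x):
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

best_a, best = max(((a / 3000, a / 3000 + h(a / 3000)) for a in range(1, 3000)),
                   key=lambda t: t[1])
print(best_a, best, log2(3))   # a = 2/3 gives a + h(a) = 1.585 = log2(3)
```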

A simple coding example (2), for n = 2: each user uses a short block code.
Sum rate: R_sum ≈ 1.3 bits/transmission.

Switching channel with general coding (3). Strategy:
Code for the switch input (X2): every codeword contains fewer than d_min zeros.
Code for the information input (X1): a linear (n, k) code with minimum distance d_min and k ≤ n - d_min + 1.
The receiver sees the information codeword with Δ-erasures at the blocked positions, e.g. (Δ 1 ... 0 Δ), and corrects the at most d_min - 1 erasures.
PERFORMANCE: the sum rate approaches C = log2(3) as n grows!
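A rough numerical sketch of this rate count (my own; it idealizes the information user's code as meeting k = n - d_min + 1, which binary codes only approach):

```python
from math import comb, log2

def sum_rate(n, dmin):
    """Sum rate of the erasure-correcting switching-channel scheme.
    Idealization: a linear code with k = n - dmin + 1 and minimum distance dmin
    for the information user; the switch user sends only words with fewer than
    dmin zeros, so every erasure pattern is correctable."""
    r_info = (n - dmin + 1) / n
    n_switch_words = sum(comb(n, z) for z in range(dmin))  # words with < dmin zeros
    r_switch = log2(n_switch_words) / n
    return r_info + r_switch

for n in (9, 30, 90, 300, 900):
    print(n, round(sum_rate(n, n // 3), 3))  # climbs toward log2(3) ~ 1.585
```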

Extensions (2): the T-user adder channel. Binary inputs X1, ..., XT ∈ {0,1}; output Y = X1 + ... + XT ∈ {0, 1, ..., T}.

T-user adder (1), T = 3, block length 2:
User 1: {00, 11}
User 2: {10, 01}
User 3: {00, 10}
All 8 possible output pairs are distinct, so the receiver can decode uniquely.
Efficiency: 3 · (1/2) = 1.5 bits/transmission.
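Unique decodability of this code set is easy to verify by enumeration (my own sketch, not from the slides):

```python
from itertools import product

# Codebooks from the slide: length-2 binary codewords per user
books = [
    [(0, 0), (1, 1)],   # user 1
    [(1, 0), (0, 1)],   # user 2
    [(0, 0), (1, 0)],   # user 3
]

sums = {}
for choice in product(*books):
    y = tuple(sum(c[i] for c in choice) for i in range(2))   # component-wise adder output
    sums.setdefault(y, []).append(choice)

assert all(len(v) == 1 for v in sums.values()), "not uniquely decodable"
print(len(sums), "distinct outputs for", 2 * 2 * 2, "message triples")
# 8 distinct outputs -> efficiency 3 * (1/2) = 1.5 bits per channel use
```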

T-user adder (2), T = 3, block length 3:
User 1: {…}, User 2: {…}, User 3: {…}, outputs: {…}
Efficiency = (2 + log2 6)/3 ≈ 1.53 bits/transmission.
Record: van Tilborg (1991).