The Capacity of Interference Channels with Partial Transmitter Cooperation
Ivana Marić (Stanford), Roy D. Yates (WINLAB, Rutgers), Gerhard Kramer (Bell Labs)


1 Title slide. [Figure: Sender 1 and Sender 2 transmitting to Decoder 1 and Decoder 2]

2 Gaussian Channel in Standard Form

[Figure: interference channel p(y1,y2|x1,x2); encoders 1, 2 map W1, W2 to X1, X2; decoders 1, 2 produce Ŵ1, Ŵ2 from Y1, Y2; direct gains 1, cross gains √h21, √h12]

Y1 = X1 + √h21 X2 + Z1, Y2 = √h12 X1 + X2 + Z2, where Z1 ~ N(0,1), Z2 ~ N(0,1).
How to cope with interference is not well understood. The interference channel capacity is a long-standing open problem.
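The standard form above can be sketched numerically. The gain labeling (√h12 from sender 1 into y2, √h21 from sender 2 into y1) is an assumption inferred from the figure, and the noise samples are passed in explicitly:

```python
import math

def interference_channel(x1, x2, h12, h21, z1=0.0, z2=0.0):
    """One use of the Gaussian interference channel in standard form.

    Direct gains are 1; sqrt(h12) is the (assumed) cross gain from sender 1
    into y2 and sqrt(h21) the cross gain from sender 2 into y1.
    z1, z2 stand in for the unit-variance Gaussian noise samples.
    """
    y1 = x1 + math.sqrt(h21) * x2 + z1
    y2 = math.sqrt(h12) * x1 + x2 + z2
    return y1, y2
```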

3 Strong Interference

Capacity region known if there is strong interference: the capacity region equals that of a compound MAC in which both messages are decoded at both receivers [Ahlswede, 1974].
For the Gaussian channel in standard form, this means h12 ≥ 1 and h21 ≥ 1 [Sato, Han & Kobayashi, 1981].
In general: I(X1;Y1|X2) ≤ I(X1;Y2|X2) and I(X2;Y2|X1) ≤ I(X2;Y1|X1) for all input product probability distributions [Costa & El Gamal, 1987].
Focus of this work: strong interference.
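For the Gaussian standard form, the strong interference test reduces to two scalar comparisons; a minimal helper (the function name is mine):

```python
def is_strong_interference(h12, h21):
    """Strong interference in the Gaussian standard form holds iff both
    cross gains are at least the direct gains: h12 >= 1 and h21 >= 1
    [Sato, Han & Kobayashi, 1981]."""
    return h12 >= 1.0 and h21 >= 1.0
```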

4 Compound MAC

[Figure: Source 1 with W1 and Source 2 with W2; both Decoder 1 and Decoder 2 decode (W1, W2)]

5 Cooperation in the Interference Channel

[Figure: interference channel p(y1,y2|x1,x2) with encoders 1, 2 and decoders 1, 2]

Not captured in the interference channel model:
- Broadcasting: sources “overhear” each other’s transmission
- Cooperation
Our goal: consider cooperation and derive capacity results.
Why work on more involved problems than the already unsolved one? What are insightful and tractable channel models? How do we model cooperation?

6 Transmitter Cooperation for Gaussian Channels

Full cooperation: MIMO broadcast channel; DPC is optimal [Weingarten, Steinberg & Shamai, 2004], [Caire & Shamai, 2001], [Viswanath, Jindal & Goldsmith, 2002].
Several cooperation strategies proposed [Host-Madsen, 2003], [Jindal, Mitra & Goldsmith, 2003], [Ng & Goldsmith, 2004]:
- Gain from DPC when the sources are close together
- When apart, relaying outperforms DPC
Assumptions: dedicated orthogonal cooperation channels; total power constraint.
[Figure: sources S1, S2 and receivers R1, R2]

7 Transmitter Cooperation in the Discrete Memoryless Channel

MAC with partially cooperating encoders: the capacity region C_MAC(C12, C21) was determined by [Willems, 1983].
Communication through a conference; a realistic model.
[Figure: encoders 1, 2 linked by conference channels of capacities C12, C21; a single decoder outputs (Ŵ1, Ŵ2)]

8 Cooperation Through a Conference

Each transmitter sends K conference symbols; V1k depends on the previously received conference symbols.
Over the N channel uses, sender 1 may convey at most NC12 bits through the conference, and sender 2 at most NC21 bits.
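The conference constraint can be checked mechanically. This sketch assumes the usual accounting, in which the K conference alphabets of each sender must fit into N·C bits in total (the helper name is mine):

```python
import math

def conference_ok(sizes_1to2, sizes_2to1, N, C12, C21):
    """Check Willems' conferencing constraint: over the K rounds, the bits
    sender 1 conveys (sum of log2 of its conference alphabet sizes) may not
    exceed N*C12, and symmetrically for sender 2 (hedged sketch)."""
    bits_12 = sum(math.log2(m) for m in sizes_1to2)
    bits_21 = sum(math.log2(m) for m in sizes_2to1)
    return bits_12 <= N * C12 and bits_21 <= N * C21
```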

9 Partial Transmitter Cooperation

[Figure: encoders 1, 2 with inputs (W1, W0) and (W2, W0); decoders 1, 2 with outputs Y1, Y2]

In the conference, partial information about the messages W1, W2 is exchanged between the encoders.
After the conference: a common message at the encoders.
This encompasses scenarios in which the encoders have partial knowledge about each other’s messages.

10 Transmitter Cooperation

Limited cooperation allows the transmitters to exchange partial information about each other’s messages.
After cooperation:
- Common message known to both encoders
- Private message at each encoder
In strong interference, the capacity region is tied to the capacity region of the compound MAC with common information: two MACs with common and private messages.
[Figure: encoders 1, 2 with common message W0 and private messages W1, W2; decoders 1, 2 each decode (Ŵ1, Ŵ2, Ŵ0) from channel p(y1,y2|x1,x2)]

11 Compound MAC with Common Information

[Figure: Source 1 with (W1, W0) and Source 2 with (W2, W0); both decoders decode (W1, W2, W0)]

12 The Compound MAC with Common Information

Encoding, decoding, and the error probability are defined in the usual way.
(R0, R1, R2) is achievable if, for any ε > 0, there is an (M0, M1, M2, N, Pe) code such that Pe ≤ ε.
The capacity region is the closure of the set of all achievable (R0, R1, R2).
Easy to determine given the result by [Slepian & Wolf, 1973].

13 The MAC with Common Information

[Figure: encoders 1, 2 with common message W0 and private messages W1, W2; a single decoder outputs (Ŵ1, Ŵ2, Ŵ0) from channel p(y|x1,x2)]

Capacity region [Slepian & Wolf, 1973]: the union over p(u)p(x1|u)p(x2|u)p(y|x1,x2) of rate triples (R0, R1, R2) satisfying the MAC bounds, with the common message carried by the auxiliary U.
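The slide's rate expressions were in the figure. As an illustration, the Slepian–Wolf region can be evaluated for the Gaussian MAC under the standard superposition power split; the split parameters a1, a2 and the function names are mine:

```python
import math

def C(x):
    """Gaussian capacity function, in bits per channel use."""
    return 0.5 * math.log2(1.0 + x)

def mac_common_bounds(P1, P2, a1, a2):
    """Bounds of the Gaussian MAC with common information
    [Slepian & Wolf, 1973] when sender t spends a_t*P_t on the common
    codeword U and the rest on its private codeword (hedged sketch).

    Returns upper bounds on (R1, R2, R1+R2, R0+R1+R2)."""
    r1 = C((1 - a1) * P1)                    # R1 <= I(X1;Y|X2,U)
    r2 = C((1 - a2) * P2)                    # R2 <= I(X2;Y|X1,U)
    r12 = C((1 - a1) * P1 + (1 - a2) * P2)   # R1+R2 <= I(X1,X2;Y|U)
    rall = C(P1 + P2 + 2 * math.sqrt(a1 * a2 * P1 * P2))  # R0+R1+R2 <= I(X1,X2;Y)
    return r1, r2, r12, rall
```

With a1 = a2 = 0 the inputs are independent and the region collapses to the ordinary MAC region; with a1 = a2 = 1 all power goes to the common message.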

15 The Compound MAC with Common Information

[Figure: encoders 1, 2 with common message W0 and private messages W1, W2; decoders 1, 2 each decode (Ŵ1, Ŵ2, Ŵ0) from channel p(y1,y2|x1,x2)]

Capacity region [MYK, ISIT 2005]: union over p(u)p(x1|u)p(x2|u)p(y1,y2|x1,x2).
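Spelled out, the region on this slide has the following shape, reconstructed from the structure of the surrounding slides (each bound is the single-MAC bound, tightened by the worse of the two receivers); it should be checked against the ISIT 2005 paper:

```latex
\mathcal{C} = \bigcup_{p(u)\,p(x_1|u)\,p(x_2|u)}
\left\{ (R_0,R_1,R_2):
\begin{aligned}
R_1 &\le \min_{t=1,2} I(X_1;Y_t|X_2,U) \\
R_2 &\le \min_{t=1,2} I(X_2;Y_t|X_1,U) \\
R_1+R_2 &\le \min_{t=1,2} I(X_1,X_2;Y_t|U) \\
R_0+R_1+R_2 &\le \min_{t=1,2} I(X_1,X_2;Y_t)
\end{aligned}
\right\}
```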

16 The Compound MAC with Common Information

Capacity region: union over p(u)p(x1|u)p(x2|u)p(y1,y2|x1,x2).
For each p, the set of achievable (R0, R1, R2) is the intersection of the rate regions R_MACt achieved in the two MACs with common information.

17 Converse

The error probability in MAC t is at most the error probability in the compound MAC.
→ Necessary condition for reliable decoding: the rates are confined to R_MAC1(p) and R_MAC2(p) for every p.

18 Achievability

From the Slepian–Wolf result, choosing the rates (R0, R1, R2) in the intersection guarantees that the error probabilities Pe1 and Pe2 can be made arbitrarily small → the overall probability of error Pe will be arbitrarily small.

19 Implications

We can use this result to determine the capacity region of several channels with partial transmitter cooperation:
1. The Strong Interference Channel with Common Information
2. The Strong Interference Channel with Unidirectional Cooperation
3. The Compound MAC with Conferencing Encoders

20 After the Conference

[Figure: encoders 1, 2 with common message W0 and private messages W1, W2; decoder 1 outputs (Ŵ1, Ŵ0), decoder 2 outputs (Ŵ2, Ŵ0)]

Compound MAC with common information, but with relaxed decoding constraints: each receiver decodes only one private message.

21 Theorem

For the interference channel with common information satisfying the strong interference conditions for all product input distributions, the capacity region C is the union over p(u)p(x1|u)p(x2|u)p(y1,y2|x1,x2).
Capacity region = capacity region of the compound MAC with common information.

22 Proof

Achievability: follows directly from the achievability for the compound MAC with common information; the decoding constraint is relaxed.
Converse: uses Fano’s inequality. In the interference channel with no cooperation, the outer bounds rely on the independence of X1 and X2. Due to cooperation, the codewords are not independent. The theorem conditions are obtained from the converse.

23 Relationship to the Strong Interference Conditions

One set of conditions is stated for all input distributions, the other for all product input distributions; the same class of interference channels satisfies both sets of strong interference conditions.

24 Interference Channel with Unidirectional Cooperation

[Figure: interference channel; encoders 1, 2 with W1, W2; decoders 1, 2 output Ŵ1, Ŵ2]

Encoding functions, decoding functions, and the error probability are defined as for the interference channel.
The difference from the interference channel: one encoder knows both messages.

25 Cognitive Radio Settings

Cognitive Radio Channel [Devroye, Mitran, Tarokh, 2005]: an achievable rate region.
Consider a simple two-transmitter, two-receiver network and assume that one transmitter is cognitive:
- It can “overhear” the transmission of the primary user
- It partially obtains the primary user’s message, so it can cooperate
[Figure: primary user and secondary user (cognitive radio) transmitting to Decoder 1 and Decoder 2]
→ Interference channel with unidirectional cooperation.
The assumption that the full message W1 is available at the cognitive user is an over-idealization.

26 Interference Channel with Unidirectional Cooperation

- The Interference Channel with Unidirectional Cooperation [Marić, Yates & Kramer, 2005]: capacity in very strong interference
- The Interference Channel with Degraded Message Sets [Wu, Vishwanath & Arapostathis, 2006]: capacity for weak interference and for the Gaussian channel in weak interference
- Cognitive Radio Channel [Jovičić & Viswanath, 2006]: capacity for the Gaussian channel in weak interference
- The Interference Channel with a Cognitive Transmitter [Marić, Goldsmith, Kramer & Shamai, 2006]: new outer bounds and an achievable region

27 Theorem

For the interference channel with unidirectional cooperation satisfying the strong interference conditions for all joint input distributions p(x1,x2), the capacity region C is the union over p(x1,x2)p(y1,y2|x1,x2).
Capacity region = capacity region of the compound MAC with common information.

28 Achievability: Compound MAC with Common Information

[Figure: encoder 1 with W1, encoder 2 with (W1, W2); decoders 1, 2 each decode (Ŵ1, Ŵ2)]

Encode as for a compound MAC with common information. Due to the unidirectional cooperation, W1 is a common message and encoder 1 has no private message:
→ set R′0 = R1 and R′1 = 0, and choose U = X1.

29 Converse

Uses Fano’s inequality.
In the interference channel with no cooperation, the outer bounds rely on the independence of X1 and X2. Due to cooperation, the codewords are not independent.
The theorem conditions are obtained from the converse.

30 Converse (Continued)

Lemma: If the per-letter conditions are satisfied for all p(x1,x2), then the bound on N(R1 + R2) follows.
Proof: similar to the proof by [Costa & El Gamal, 1987], with two changes: X1, X2 are not independent, and we condition on W1.
We would next like the corresponding bound for the other receiver, but because the situation is asymmetric, this seems difficult.

31 Converse (Continued)

Recall the achievable rates. By assumption, the conditions hold for all p(x1,x2). The rates, taken as a union over all p(x1,x2), are thus achievable and also form an outer bound.

32 Gaussian Channel

[Figure: senders S1, S2 with inputs x1, x2; direct gains 1, cross gains √h21, √h12; outputs y1, y2]

Channel outputs: Y1 = X1 + √h21 X2 + Z1, Y2 = √h12 X1 + X2 + Z2.
Noise: Z1 ~ N(0,1), Z2 ~ N(0,1). Power constraints: E[Xt²] ≤ Pt, t = 1, 2.

33 Capacity-Achieving Scheme in Strong Interference

Encoder 1: codebook x1(w1).
Encoder 2: dedicates a portion αP2 of its power to help, and uses superposition coding.
The decoders reduce interference because they can decode each other’s messages.
[Figure: encoders 1, 2 and decoders 1, 2 with cross gains √h21, √h12]
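The superposition step on this slide can be sketched as a simple power split; the fraction β and the helper name are mine:

```python
import math

def superpose(u, v, P, beta):
    """Form one superposition codeword symbol: a fraction beta of the
    power P is spent on the cooperative (common) part u and the rest on
    the private part v; u and v are unit-power codeword symbols
    (hedged sketch of the scheme described above)."""
    return math.sqrt(beta * P) * u + math.sqrt((1.0 - beta) * P) * v
```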

34 Gaussian Channel: Strong Interference Conditions

The strong interference conditions are compared for the interference channel with no cooperation, with common information, and with unidirectional cooperation; unidirectional cooperation gives the most demanding conditions.
For P1 = P2, sufficient conditions are known. For P1 = 0 the conditions are never satisfied: the channel reduces to a degraded broadcast channel from sender 2.
[Figure: Gaussian interference channel with gains √h21, √h12]

35 Gaussian Channel with Unidirectional Cooperation

Strong interference scenario: h12 ≥ 1 and h21 ≥ 1.
Sufficient conditions are known; the weak and strong interference regimes are solved.
Our current work: the regime in between.
[Figure: weak vs. strong interference regions in the (h12, h21) plane; nodes s1, s2, d1, d2]

36 Gaussian Channel with Conferencing

[Figure: senders S1, S2 with conference links C12, C21; cross gains √h12, √h21; noises Z1, Z2; outputs y1, y2]

Channel outputs: Y1 = X1 + √h21 X2 + Z1, Y2 = √h12 X1 + X2 + Z2.
Noise: Z1 ~ N(0,1), Z2 ~ N(0,1). Power constraints: E[Xt²] ≤ Pt.
Note: additional resources are needed for the conference.

37 Gaussian Channel with Conferencing

Symmetric channel: h12 = h21 = h, P1 = P2 = P, c = c12 = c21, Zt ~ N(0,1), t = 1, 2.
The parameter a sets the transmit correlation; a = 0 means X1, X2 independent.
Max sum-rate: achieved where the two sum-rate bounds are equal.
[Plot: rate region growing as a increases; a = 0 gives the max sum rate when c = 0 (no cooperation)]
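The trade-off in the plot can be reproduced numerically with two sum-rate bounds. The exact expressions were in the slide figure, so the forms below, patterned on Willems' conferencing bounds with a as the common-power fraction, are assumptions:

```python
import math

def C(x):
    return 0.5 * math.log2(1.0 + x)

def sum_rate(P, h, c, a):
    """Sum-rate bound sketch for the symmetric Gaussian compound MAC with
    conferencing: the minimum of a cooperative bound I(X1,X2;Yt), which
    grows with the correlation a, and a conference-limited bound
    I(X1,X2;Yt|U) + c12 + c21, which shrinks with a (assumed forms)."""
    coop = C((1.0 + h + 2.0 * math.sqrt(h) * a) * P)
    limited = C((1.0 + h) * (1.0 - a) * P) + 2.0 * c
    return min(coop, limited)

def best_a(P, h, c, steps=1000):
    """Grid-search the correlation parameter a maximizing the sum rate."""
    return max(range(steps + 1), key=lambda i: sum_rate(P, h, c, i / steps)) / steps
```

Consistent with the plot: with c = 0 the search returns a = 0 (no cooperation), and any c > 0 pushes the optimal a above zero.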

38 Gaussian Channel with Conferencing

Symmetric channel. Sender t has power P, t = 1, 2; Pc is the conference power.
[Plot: rate region growing as a increases; a = 0 gives the max sum rate when c = 0 (no cooperation)]

39 Full Cooperation in the Gaussian Channel

Large c: the rate region is maximized for a = 1; the senders are able to exchange the indices W1 and W2.
→ Full cooperation condition.
→ Less cooperation is needed as the receivers move further away.

40 Interference Channel with Conferencing

[Figure: encoders 1, 2 with common message W0 and private messages W1, W2; decoder 1 outputs (Ŵ1, Ŵ01), decoder 2 outputs (Ŵ2, Ŵ02)]

After the conference, the common message contains information about both original messages: W0 = (W01, W02).
Relax the constraint → each user decodes only one message; each decoder is interested in only a part of the common message.
Strong interference conditions?

41 Interference Channel with Conferencing

[Figure: encoder 1 with (W1, W01, W02), encoder 2 with (W2, W01, W02); decoder 1 outputs (Ŵ1, Ŵ01), decoder 2 outputs (Ŵ2, Ŵ02)]

Encompasses various models: MIMO broadcast, unidirectional cooperation, partial cooperation.
Strong interference conditions?

42 Discussion

Introduced cooperation into the interference channel: capacity results for scenarios with strong interference.
If the interference is strong, the decoders can decode the interference; no partial decoding at the receivers → an easier problem.
Ongoing work:
- Channel models that incorporate node cooperation and capture characteristics of cognitive radio networks
- Capacity results and cooperation gains for more general settings

43 Ongoing Work

Determining the fundamental limits of wireless networks requires understanding the optimal ways of cooperating:
- Hierarchy of cooperation
- Encoding schemes
- Optimum resource allocation
- How to cope with interference

44 Not needed

45 The Gaussian Channel Capacity Region

[Plot: capacity region growing as a increases; c = 0 means no cooperation; corner points A and B marked]

Corner A: S1 acts as a relay, sending at rate c; S2 sends independent information.
Corner B: in addition, S1 sends its own data; the sum-rate for a = 0 is achieved.

46 Interference Channel with Confidential Messages

[Figure: interference channel p(y1,y2|x1,x2); encoders 1, 2 with W1, W2; decoders 1, 2 output Ŵ1, Ŵ2, with each message kept confidential from the unintended receiver]

47 Interference Channel with Confidential Messages

Joint work with Ruoheng Liu, Predrag Spasojević and Roy D. Yates.
Developed inner and outer bounds.
[Figure: transmitters 1, 2 and receivers 1, 2; secrecy is measured by the equivocations H(W2 | Y1, W1) at receiver 1 and H(W1 | Y2, W2) at receiver 2]

48 The Compound MAC with Conferencing Encoders

[Figure: encoders 1, 2 linked by conference channels C12, C21; decoders 1, 2 each decode (Ŵ1, Ŵ2) from channel p(y1,y2|x1,x2)]

Adopt Willems’ cooperation model.
Decoding constraint: both messages are decoded by both receivers.
Encoding, decoding, and the probability of error are defined as usual.

49 Theorem

The compound MAC capacity region C(C12, C21) is the union over p(u)p(x1|u)p(x2|u)p(y1,y2|x1,x2), denoted p.
For each p: the intersection of the rate regions of two MACs with partially cooperating encoders, R_MAC1(p, C12, C21) and R_MAC2(p, C12, C21).

50 Gaussian Channel with Conferencing

The Gaussian compound MAC capacity region: take the union over all power splits and use the maximum entropy theorem.
For C12 = C21 = 0, the optimum choice is a = 0, b = 0.