On Two-user Fading Gaussian Broadcast Channels with Perfect Channel State Information at the Receivers
Daniela Tuninetti
1 On Two-user Fading Gaussian Broadcast Channels with Perfect Channel State Information at the Receivers
DIMACS Workshop on Network Information Theory - March 2003
Daniela Tuninetti, Mobile Communication Laboratory, EPFL, CH-1015 Lausanne, Switzerland (daniela.tuninetti@epfl.ch)
Joint work with Prof. Shlomo Shamai (Technion - Israel)
2 Motivation
Fading Gaussian broadcast channels (BCs) model the down-link of wireless cellular systems with delay constraints much larger than the fading coherence time and no feedback available. The BC problem was formalized by Thomas Cover in 1972. Most of the available results on general BCs date back to the 70s; for the subsequent two decades few relevant works appeared. The BC problem regained attention recently in the unfaded multi-antenna Gaussian case (Caire-Shamai 2000).
3 The broadcast problem (Cover 1972)
A general discrete memoryless two-user broadcast channel without feedback consists of an input alphabet X, two output alphabets Y and Z, and a transition probability P_{YZ|X}. A ((2^{nR_0}, 2^{nR_y}, 2^{nR_z}), n, P_e^{(n)})-code consists of:
- three message sets W_u = [2^{nR_u}] for u in {0, y, z},
- one encoder W_0 x W_y x W_z -> X^n,
- two decoders Y^n -> W_0 x W_y and Z^n -> W_0 x W_z,
and a probability of error, with (w_0, w_y, w_z) uniformly distributed,
  P_e^{(n)} = Pr[ \hat{W}_0 != w_0 or \hat{W}_y != w_y or \hat{W}_z != w_z ].
The capacity region is the convex closure of the rates (R_0, R_y, R_z) in R_+^3 for which there exists a sequence of ((2^{nR_0}, 2^{nR_y}, 2^{nR_z}), n, P_e^{(n)})-codes with P_e^{(n)} -> 0.
4 [Figure: encoder maps (W_0, W_y, W_z) to X^n; channel P_{YZ|X}; decoder Y outputs (\hat{W}_0, \hat{W}_y); decoder Z outputs (\hat{W}_0, \hat{W}_z).]
Whether P_e^{(n)} -> 0 depends only on the (conditional) marginals P_{Y|X} = sum_z P_{YZ|X} and P_{Z|X} = sum_y P_{YZ|X}.
The capacity region of a general BC is NOT known. If channel P_{Y|X} is stronger (classification by Korner-Marton 1975) than channel P_{Z|X} then...
5 [Figure: Markov chain V ~ P_V -> X via P_{X|V}; then P_{Y|X} produces Y ~ P_Y and P_{Z|X} produces Z ~ P_Z.]
The channel is said to be:
- degraded: there exists \tilde{P}_{Z|Y} such that P_{Z|X} = \tilde{P}_{Z|Y} P_{Y|X},
- less noisy: I(V;Z) <= I(V;Y) for all P_{VX},
- more capable: I(X;Z) <= I(X;Y) for all P_X.
Clearly {degraded} subset {less noisy} subset {more capable} subset {general BC}.
(deg.: Bergmans 1973, Gallager 1974), (l.n.: Korner-Marton 1975), (m.c.: A. El Gamal 1979), (general: ???).
6 Capacity region for more capable BCs (A. El Gamal 1979)
If I(X;Z) <= I(X;Y) for all P_X then
C^{(mc)} = union over P_{VX} of { (R_0, R_y, R_z) :
  R_0 + R_z <= I(V;Z),
  R_0 + R_y + R_z <= min{ I(X;Y), I(V;Z) + I(X;Y|V) } }
where V -> X -> (Y,Z) and |V| <= |X| + 2.
NB. The capacity is also known in the following cases: the channel has one deterministic component (Marton 1979, Gelfand-Pinsker 1980); R_y = 0 or R_z = 0, the so-called degraded message set case (Korner-Marton 1977); sum and product of reversely degraded BCs (A. El Gamal 1980).
7 The best inner bound (Marton 1979)
For a general two-user BC, the best inner bound is:
I = union over P_{WUVX} of { (R_0, R_y, R_z) :
  R_0 + R_y <= I(WU;Y),
  R_0 + R_z <= I(WV;Z),
  R_0 + R_y + R_z <= min{ I(W;Y), I(W;Z) } + I(U;Y|W) + I(V;Z|W) - I(U;V|W) }
where (W,U,V) -> X -> (Y,Z). No bounds on the cardinality of W, U, V are known.
8 The best outer bound (Korner-Marton 1979)
For a general two-user BC, the best outer bound is O = O_y intersected with O_z, where (exchange the role of the users to obtain O_z)
O_y = union over P_{VX} of { (R_y, R_z) :
  R_y <= I(X;Y),
  R_z <= I(V;Z),
  R_y + R_z <= I(X;Y|V) + I(V;Z) }
where V -> X -> (Y,Z) and |V| <= |X| + 2.
9 Remarks:
If the channel is degraded, or less noisy, the capacity is
C^{(deg)} = union over P_{VX} of { R_y <= I(X;Y|V), R_0 + R_z <= I(V;Z) }
where V -> X -> (Y,Z) and |V| <= |X| (Gallager 1974).
In general C^{(deg)} is neither an inner bound nor an outer bound! On the contrary, C^{(mc)} is ALWAYS an inner bound: it equals Marton's region I evaluated at X = U and W = V. We shall refer to I_y = I|_{X=U, W=V} as the more-capable-like inner bound.
10 AWGN BCs with RX-CSI only
We consider complex-valued fading Gaussian BCs
  Y = A X + N_y
  Z = B X + N_z
- A and B: ergodic processes known at the receivers only,
- N_u ~ N_c(0, N): Gaussian noise at receiver u in {y, z},
- X: subject to the power constraint E[|X|^2] <= P.
The output of channel y (resp. z) is the pair (Y, A) (resp. (Z, B)). (W,U,V,X) are independent of A and B. In general, this channel is not more capable.
11 Example: |A|^2 in {2, 16} and |B|^2 in {4, 8}, equiprobable.
[Figure: single-user capacity C in bit/dim versus SNR = P/N_0, for Gaussian and binary inputs on channels Y and Z.]
12 Remarks: Capacity for A and B constant (unfaded case) is achieved by jointly Gaussian
  (V, X) ~ N_c( 0, P [ 1-alpha, sqrt(1-alpha) ; sqrt(1-alpha), 1 ] )  for alpha in [0, 1],
i.e., superposition X = U + V with U ~ N_c(0, alpha P) independent of V ~ N_c(0, (1-alpha) P).
The more-capable-like inner bound region (with R_0 = 0) computed for the above jointly Gaussian distribution reduces to
I_y = union over alpha in [0, 1] of { (R_y, R_z) :
  R_z <= C_z(P) - C_z(alpha P),
  R_y + R_z <= C_y(P),
  R_y + R_z <= C_z(P) + C_y(alpha P) - C_z(alpha P) }
where C_y(P) = E[ log(1 + |A|^2 P/N) ] is the single-user capacity, achieved by X ~ N_c(0, P).
13 Certain authors claim (GLOBECOM 2000) that if C_y(P) >= C_z(P) then the degraded region, computed for jointly Gaussian input, is an inner bound for the general fading case! Now, with Gaussian input, stripping at the receiver with the largest single-user capacity is possible if
  C_y(P) - C_z(P) >= C_y(alpha P) - C_z(alpha P)  for all alpha in [0, 1],
in other words, if d(alpha) = C_y(alpha P) - C_z(alpha P), as a function of alpha in [0, 1], achieves its absolute maximum at alpha = 1.
14 Example: |A|^2 in {2, 16} and |B|^2 in {4, 8}, equiprobable.
[Figure: C_y - C_z in bit/dim versus SNR = P/N_0. Note E[|A|^2] = 9 and E[|B|^2] = 6, while E[log_2 |A|^2] = E[log_2 |B|^2] = 5/2, so the gap C_y - C_z vanishes at high SNR.]
15 Example: |A|^2 in {2, 16} and |B|^2 in {4, 8}, equiprobable.
[Figure: rate region R_z versus R_y in bit/dim at P/N = 0.5, comparing the degraded-like region with the constraint R_y + R_z < C_y(P/N).]
16 How far are we from capacity?
The outer bound is of use only if the union over ALL P_{VX} is taken...
IDEA (our converse!): since H(X|V) <= H(X) <= log(pi e Var[X]) <= log(pi e P) = g(P), decompose
  union over { P_{VX} : E[|X|^2] <= P }
  = union over { h : h <= g(P) } of the union over { P_{VX} : E[|X|^2] <= P, H(X|V) = h }.
17 A lower bound to the (conditional) entropy
Similarly to Bergmans' proof (1974), the entropy power inequality gives
  e^{H(Z|V,B=b)} >= e^{H(bX|V,B=b)} + e^{H(N_z)} = |b|^2 e^{H(X|V)} + e^{g(N)},
and then, since log(|b|^2 e^h + pi e N) is convex in log |b|^2, Jensen's inequality over B yields
  H(Z|V,B) >= E_B[ log( |B|^2 e^{H(X|V)} + pi e N ) ] >= log( e^{E_B[log |B|^2]} e^{H(X|V)} + pi e N ),
for every P_V and P_{X|V}, with P_{VXB} = P_{VX} P_B.
18 An upper bound to the (conditional) entropy
Since the Gaussian maximizes entropy for a given variance,
  H(Z|V,B) <= E_{B,V}[ g( sigma^2_{Z|VB}(v,b) ) ] = E_{B,V}[ g( N + |b|^2 sigma^2_{X|V}(v) ) ]
  <= (Jensen) E_B[ g( N + |B|^2 E_V[ sigma^2_{X|V}(V) ] ) ],
for every P_V and P_{X|V}, with P_{VXB} = P_{VX} P_B, where sigma^2_{X|V}(v) is the variance of the random variable X given V = v, i.e.,
  sigma^2_{X|V}(v) = int P_{X|V}(x|v) |x|^2 dx - | int P_{X|V}(x|v) x dx |^2.
The above chain holds with equality iff X|V=v ~ N_c( . , g^{-1}(H(X|V)) ).
19 Everything can be expressed in terms of H(X|V):
  I(Z;V|B) = H(Z|B) - H(Z|V,B) <= E_B[ g( N + |B|^2 P ) ] - log( e^{E_B[log |B|^2]} e^{H(X|V)} + pi e N ),
  I(X;Y|V,A) = H(Y|V,A) - H(Y|X,A) <= E_A[ g( N + |A|^2 g^{-1}(H(X|V)) ) ] - g(N).
Both bounds hold with equality for jointly Gaussian (V, X) with X ~ N_c(0, P) and X|V ~ N_c( . , g^{-1}(h) ). Hence, for fading AWGN BCs with RX-CSI only,
  union over { P_{VX} : E[|X|^2] <= P, H(X|V) = h }
  reduces to the union over { X ~ N_c(0, P), X|V ~ N_c( . , g^{-1}(h) ) },
and, setting h = g(alpha P), the union over h <= g(P) becomes a union over alpha in [0, 1].
20 We get
  O_y = union over alpha in [0, 1] of { R_z <= C_z(P) - C_z(alpha P), R_y + R_z <= C_y(P), R_y + R_z <= C_z(P) + C_y(alpha P) - C_z(alpha P) },
and recall that
  I_y = union over alpha in [0, 1] of { R_z <= C_z(P) - C_z(alpha P), R_y + R_z <= C_y(P), R_y + R_z <= C_z(P) + C_y(alpha P) - C_z(alpha P) };
it follows that inner and outer bound coincide, O_y = I_y.
21 Main results
Theorem 1. The degraded region computed for Gaussian input is the capacity region for fading AWGN BCs with RX-CSI for which the condition for successive cancellation is verified, i.e., d(alpha) = C_y(alpha P) - C_z(alpha P) has its absolute maximum at alpha = 1.
Theorem 2. The more-capable region computed for Gaussian input is the capacity region for fading AWGN BCs with RX-CSI for which d(alpha) = C_y(alpha P) - C_z(alpha P) >= 0 for all alpha in [0, 1].
Theorem 3. If the fading AWGN BC with RX-CSI is more capable, then Gaussian input is optimal.
22 For fading AWGN BCs with RX-CSI, Theorem 2 comprises channels that are NOT more capable, i.e., for which no coding theorem exists!
[Figure: rate region R_z versus R_y in bit/dim for |A|^2 in {2, 16} and |B|^2 in {4, 8} equiprobable, P/N = 0.5.]
We are working on the case where d(alpha) is not non-negative on alpha in [0, 1] and does not achieve its absolute maximum at alpha = 1 (covered by neither Th. 1 nor Th. 2).
23 Conclusions
We consider a general fading AWGN BC with RX-CSI:
1. We first derive an inner bound region which is valid for any fading distribution and has the flavor of the more-capable capacity region.
2. From the inner bound with Gaussian inputs: the best receiver (largest single-user capacity) must decode the two messages jointly, and successive cancellation might incur inherent degradation.
3. Then, by using the entropy power inequality, we show that Gaussian inputs exhaust Korner-Marton's outer bound region.
4. We conclude by stating conditions under which inner and outer bound meet, thus yielding the capacity region.
5. Our capacity result holds for channels that are neither degraded nor more capable.
Secret Key Agreement Using Asymmetry in Channel State Knowledge Ashish Khisti Deutsche Telekom Inc. R&D Lab USA Los Altos, CA, 94040 Email: ashish.khisti@telekom.com Suhas Diggavi LICOS, EFL Lausanne,
More informationA Summary of Multiple Access Channels
A Summary of Multiple Access Channels Wenyi Zhang February 24, 2003 Abstract In this summary we attempt to present a brief overview of the classical results on the information-theoretic aspects of multiple
More informationOuter Bounds on the Secrecy Capacity Region of the 2-user Z Interference Channel With Unidirectional Transmitter Cooperation
Outer Bounds on the Secrecy Capacity Region of the 2-user Z Interference Channel With Unidirectional Transmitter Cooperation Parthajit Mohapatra, Chandra R. Murthy, and Jemin Lee itrust, Centre for Research
More informationThe Effect of Memory Order on the Capacity of Finite-State Markov and Flat-Fading Channels
The Effect of Memory Order on the Capacity of Finite-State Markov and Flat-Fading Channels Parastoo Sadeghi National ICT Australia (NICTA) Sydney NSW 252 Australia Email: parastoo@student.unsw.edu.au Predrag
More informationSum Capacity of Gaussian Vector Broadcast Channels
Sum Capacity of Gaussian Vector Broadcast Channels Wei Yu, Member IEEE and John M. Cioffi, Fellow IEEE Abstract This paper characterizes the sum capacity of a class of potentially non-degraded Gaussian
More informationReliable Computation over Multiple-Access Channels
Reliable Computation over Multiple-Access Channels Bobak Nazer and Michael Gastpar Dept. of Electrical Engineering and Computer Sciences University of California, Berkeley Berkeley, CA, 94720-1770 {bobak,
More informationLecture 6 I. CHANNEL CODING. X n (m) P Y X
6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder
More informationChapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University
Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission
More informationUCSD ECE 255C Handout #12 Prof. Young-Han Kim Tuesday, February 28, Solutions to Take-Home Midterm (Prepared by Pinar Sen)
UCSD ECE 255C Handout #12 Prof. Young-Han Kim Tuesday, February 28, 2017 Solutions to Take-Home Midterm (Prepared by Pinar Sen) 1. (30 points) Erasure broadcast channel. Let p(y 1,y 2 x) be a discrete
More informationExercise 1. = P(y a 1)P(a 1 )
Chapter 7 Channel Capacity Exercise 1 A source produces independent, equally probable symbols from an alphabet {a 1, a 2 } at a rate of one symbol every 3 seconds. These symbols are transmitted over a
More informationarxiv: v2 [cs.it] 28 May 2017
Feedback and Partial Message Side-Information on the Semideterministic Broadcast Channel Annina Bracher and Michèle Wigger arxiv:1508.01880v2 [cs.it] 28 May 2017 May 30, 2017 Abstract The capacity of the
More informationLecture 5 Channel Coding over Continuous Channels
Lecture 5 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 14, 2014 1 / 34 I-Hsiang Wang NIT Lecture 5 From
More informationDegrees of Freedom Region of the Gaussian MIMO Broadcast Channel with Common and Private Messages
Degrees of Freedom Region of the Gaussian MIMO Broadcast hannel with ommon and Private Messages Ersen Ekrem Sennur Ulukus Department of Electrical and omputer Engineering University of Maryland, ollege
More informationSource-Channel Coding Theorems for the Multiple-Access Relay Channel
Source-Channel Coding Theorems for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora, and Deniz Gündüz Abstract We study reliable transmission of arbitrarily correlated sources over multiple-access
More informationarxiv: v1 [cs.it] 4 Jun 2018
State-Dependent Interference Channel with Correlated States 1 Yunhao Sun, 2 Ruchen Duan, 3 Yingbin Liang, 4 Shlomo Shamai (Shitz) 5 Abstract arxiv:180600937v1 [csit] 4 Jun 2018 This paper investigates
More informationThe Method of Types and Its Application to Information Hiding
The Method of Types and Its Application to Information Hiding Pierre Moulin University of Illinois at Urbana-Champaign www.ifp.uiuc.edu/ moulin/talks/eusipco05-slides.pdf EUSIPCO Antalya, September 7,
More informationOn the Capacity of the Interference Channel with a Relay
On the Capacity of the Interference Channel with a Relay Ivana Marić, Ron Dabora and Andrea Goldsmith Stanford University, Stanford, CA {ivanam,ron,andrea}@wsl.stanford.edu Abstract Capacity gains due
More informationEnergy State Amplification in an Energy Harvesting Communication System
Energy State Amplification in an Energy Harvesting Communication System Omur Ozel Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland College Park, MD 20742 omur@umd.edu
More informationOn the Duality of Gaussian Multiple-Access and Broadcast Channels
On the Duality of Gaussian ultiple-access and Broadcast Channels Xiaowei Jin I. INTODUCTION Although T. Cover has been pointed out in [] that one would have expected a duality between the broadcast channel(bc)
More information