Generalized Writing on Dirty Paper
Aaron S. Cohen
MIT, Massachusetts Ave., Cambridge, MA

Amos Lapidoth
ETF E107, ETH-Zentrum, CH-8092 Zürich, Switzerland

Abstract

We consider a generalization of Costa's writing on dirty paper model in which a power-limited communicator encounters two independent sources of additive noise, one of which is known non-causally to the encoder. We seek to characterize when the capacity of this channel would not change if the source of noise known to the encoder were also known to the decoder; we call this property private-public equivalence (PPE). Costa showed that this model has PPE if both sources of noise are IID Gaussian. We show that this model has PPE as long as the unknown noise is Gaussian (but not necessarily IID), for any distribution on the known noise. We also conjecture that, for a general class of coding strategies and for IID noise sequences (with some additional assumptions), if this model has PPE then the unknown noise must be Gaussian. This result relies on the Darmois-Skitovič Theorem, which states that two linear combinations of independent random variables can be independent only if every variable entering both combinations is Gaussian.

1 Introduction

Costa's writing on dirty paper [1] is a communication model in which there are two independent sources of additive white Gaussian noise, one known non-causally to the encoder. Costa showed that a power-constrained encoder can reliably transmit at all rates less than

    (1/2) log(1 + P/N) bits/symbol,

where P is the power constraint on the encoder and N is the variance of the unknown noise. Note the surprising fact that this capacity does not depend on the variance of the known noise (i.e., the achievable rates would not change if this noise were not present, or if this noise were also known at the decoder and could be subtracted off). We generalize Costa's model by considering different distributions on the two noise sources.
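As a quick numerical illustration (not from the paper), Costa's rate can be evaluated directly; note that no parameter of the known noise enters the formula, so the value coincides with the capacity of the same channel with the known noise absent:

```python
import math

def costa_capacity(P, N):
    """Costa's writing-on-dirty-paper capacity: (1/2) log2(1 + P/N) bits/symbol.

    P is the encoder's power and N is the variance of the *unknown* noise;
    the variance of the known noise does not appear at all.
    """
    return 0.5 * math.log2(1.0 + P / N)

print(costa_capacity(3.0, 1.0))   # 1.0 bit/symbol
print(costa_capacity(10.0, 1.0))  # ~1.73 bits/symbol
```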
We show that the known noise does not affect the capacity when the unknown noise has a Gaussian (possibly non-white) distribution, for any distribution on the known noise; a similar result has been shown in [2], see below for discussion. We also conjecture that, for a particular coding strategy known as distortion-compensated quantization index modulation [3] and for independent, identically distributed (IID) noise sequences (with some additional assumptions), this is the most general condition under which the result holds (even allowing for an input constraint other than power).

Other extensions to writing on dirty paper have been given. First, in [4] and [5], it was shown that the known noise does not affect the capacity as long as both noise sources are Gaussian,

(Footnote: Submitted for presentation at ISIT. Kindly direct all correspondence to A. Cohen.)
but not necessarily white. Second, in [2], Erez, Shamai, and Zamir extended this result to the case in which the known noise is any deterministic sequence. This latter result is similar to our first main result; however, we present ours here since the proof is significantly different.

Writing on dirty paper and its extensions have gained interest in recent years due to their similarity to watermarking, or information embedding; see, e.g., [6, 3]. In these problems, an encoder wishes to transmit information by modifying a known data sequence (e.g., audio or video). The encoder does not wish to modify the data too much; hence a constraint similar to the power constraint above is placed on the encoder. Also as above, the decoder must recover the information in the presence of additional noise.

The writing on dirty paper result has also been applied to Gaussian broadcast channels; see, e.g., [7, 8]. For example, a broadcaster to two receivers can use its knowledge of the signal it wishes to send to the first receiver to design the signal it will send to the other receiver. There is no loss of capacity (due to the writing on dirty paper result), while each receiver only has to decode its own message. This contrasts with the usual superposition coding, in which one receiver must decode both messages. Thus, it is of interest to study the conditions under which Costa's result holds.

The remainder of this paper is organized as follows. In Section 2, we give a detailed description of our generalized writing on dirty paper model and state our main results. In Sections 3 and 4, we sketch the proofs of our main results.

2 Model and Main Results

We now describe a model we call generalized writing on dirty paper (GWDP) that we will use throughout the paper. The model is illustrated in Figure 1.

Figure 1: Generalized writing on dirty paper; S and Z are independent noise sequences; S is known non-causally to the encoder.
Channel: The output of the channel is given by

    Y = X + S + Z,    (1)

where X is the input to the channel and S and Z are independent noise sequences with distributions P_S and P_Z, respectively. The noise sequence S will be available to the encoder, while the noise sequence Z will not. We use bold to signify a vector of length n. We also use upper case to signify random variables or vectors, while lower case signifies realizations of random variables or vectors. All of the random variables and vectors we discuss take values in R and R^n, respectively. The distributions P_S and P_Z are specified for each n.
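To make the channel law concrete, here is a toy simulation (all parameters are hypothetical choices, not from the paper). It also shows why the encoder cannot simply pre-subtract S: the known noise is cancelled, but the input power inherits the variance of S:

```python
import random

random.seed(0)
n = 4  # toy blocklength

S = [random.gauss(0.0, 2.0) for _ in range(n)]  # noise known to the encoder
Z = [random.gauss(0.0, 1.0) for _ in range(n)]  # noise unknown to everyone

# Naive encoder: pre-subtract the known noise. This cancels S perfectly...
M = [1.0, -1.0, 1.0, -1.0]                    # intended channel symbols
X = [m - s for m, s in zip(M, S)]             # channel input
Y = [x + s + z for x, s, z in zip(X, S, Z)]   # channel law (1): Y = X + S + Z

# ...so the decoder sees Y = M + Z, but the input power (1/n) sum X_i^2
# grows with the variance of S, violating the constraint in (2).
# Dirty-paper coding achieves the same cancellation without this penalty.
power = sum(x * x for x in X) / n
```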
Encoder: For a given blocklength n, the encoder produces the input sequence X as a function of the entire known noise sequence S and an independent message M, which takes value uniformly in the set {1, ..., 2^{nR}}. Here, R is the rate of the system. The input sequence must satisfy

    (1/n) sum_{i=1}^{n} d(X_i) ≤ D, a.s.,    (2)

where d(·) is a non-negative function and "a.s." stands for almost surely, i.e., with probability one.

Decoder: The decoder observes the output Y and estimates the message with ˆM. We measure the resulting probability of error P_e^(n) = Pr(ˆM ≠ M) by averaging over all sources of randomness.

Capacity: A rate R is achievable if there exists a sequence of rate-R encoders and associated decoders such that the probability of error P_e^(n) tends to zero as the blocklength n tends to infinity. The capacity is the supremum of all achievable rates.

Private-Public Equivalence (PPE): As with watermarking [6], we refer to the situation described above as the public version, since the noise sequence S is only known at the encoder. We also consider a private version in which the sequence S is known to the decoder as well. We say that GWDP has private-public equivalence (PPE) if the capacities of both versions are the same. The capacity of the private version is the same as the capacity of an additive noise channel Y = X + Z, where X must satisfy (2).

Costa's result is that GWDP has PPE if both S and Z are IID Gaussian and d(x) = x^2. We now characterize a more general situation in which this is true.

Theorem 1. If Z is ergodic and Gaussian and d(x) = x^2, then GWDP has PPE, for any ergodic distribution on S.

This theorem is proved in Section 3. A significantly different proof was found independently and concurrently in [2]. We next conjecture that for a general class of coding schemes and IID noise sequences, if GWDP has PPE, then the unknown noise must be Gaussian.
The class of coding schemes that we are interested in is distortion-compensated quantization index modulation (DC-QIM); this name was introduced in [3], but the same coding strategy was used to achieve capacity originally by Costa [1] and also for watermarking with malicious attackers [6]. In DC-QIM, the input sequence X is formed as a linear combination of the known noise sequence S and a codeword U which depends on S and the message M.

Conjecture 1. Let S and Z be IID and let S_i and Z_i satisfy the conditions under which Conjecture 2 is true (including S_i not deterministic). If DC-QIM can be used to achieve PPE for GWDP (with finite capacity for both versions), then Z is IID Gaussian and d(x) ∝ x^2.

The proof of this conjecture is given in Section 4, and is complete except for one sub-conjecture (Conjecture 2); we have not yet established the conditions under which that result is valid. Note that if S were deterministic, then GWDP would trivially have PPE. Further note that by assuming that both versions have finite capacity, there can be no z such that Pr(Z_i = z) > 0. For simplicity, we further assume that each Z_i has a density and that the differential entropy h(Z_i) exists. We hope to complete the proof of this result and to extend it to be independent of the coding strategy.

3 Gaussian Unknown Noise is Sufficient

We assume that the positive part of Conjecture 1 is true and give a quick proof of Theorem 1. The positive part of Conjecture 1 says that if S and Z are IID with Z Gaussian, then GWDP has
PPE. If Z is not IID, but still Gaussian, then we can diagonalize the problem and reduce it to a set of parallel scalar channels whose unknown noise component is IID Gaussian; see, e.g., [9]. If S is not IID, but still ergodic, then we can interleave and make S IID on each sub-channel. Thus, it is sufficient to prove the positive part of Conjecture 1, which we do below in Section 4 (the only part that has an incomplete proof is the converse, i.e., that Z must be Gaussian).

4 Gaussian Unknown Noise is Necessary for DC-QIM

In this section, we first show that if Z is IID Gaussian, S is IID, and d(x) = x^2, then GWDP has PPE. We then argue that if Z and S are IID and GWDP has PPE for DC-QIM, then Z must be Gaussian. We begin by characterizing when GWDP has PPE.

Lemma 1. If GWDP has PPE for P_S = (P_S)^n and P_Z = (P_Z)^n, then there exists a conditional distribution P_{U,X|S} such that

    I(U; X + S + Z) - I(U; S) = max_{P_{X'}: E[d(X')] ≤ D} I(X'; X + Z),    (3)

and E[d(X)] ≤ D, where the mutual informations are evaluated with respect to the joint distribution P_{S,U,X,Z} = P_S P_{U,X|S} P_Z.

Proof. The right-hand side (RHS) of (3) is the capacity of the private version of GWDP; all smaller rates are achievable by subtracting off S at the encoder and decoder. For finite alphabets and no distortion constraints, it was shown in [10] that a rate R is achievable if and only if there exists such a conditional distribution with the left-hand side (LHS) of (3) greater than R. In [11], this result has been extended to include a distortion constraint. The proofs can be further extended so that they do not depend on the finiteness of the alphabets.

The conditional distribution P_{U,X|S} describes the coding strategy that achieves all rates less than the LHS of (3). The marginal distribution P_U is used to create the codewords, and the conditional distribution P_{X|U,S} is used to form the input sequence x from the codeword u and the known noise sequence s.
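As a concrete, hedged sketch of the quantization-index-modulation idea behind DC-QIM (here only the plain-QIM special case, i.e., distortion-compensation factor α = 1; the step size Δ and noise levels are hypothetical choices, not from the paper):

```python
import random

random.seed(1)

DELTA = 1.0  # quantizer step size (hypothetical choice)

def q(v, offset):
    """Quantize v to the nearest point of the step-DELTA lattice shifted by offset."""
    return DELTA * round((v - offset) / DELTA) + offset

def embed(bit, s):
    """Plain-QIM special case of DC-QIM (alpha = 1): the codeword u is the
    point of the bit's coset nearest the known noise s, and the input is
    x = u - s, so |x| <= DELTA/2 no matter how large s is."""
    u = q(s, bit * DELTA / 2.0)
    return u - s

def decode(y):
    """Decide which coset's nearest lattice point is closest to y."""
    return min((0, 1), key=lambda b: abs(y - q(y, b * DELTA / 2.0)))

bits = [random.randint(0, 1) for _ in range(1000)]
S = [random.gauss(0.0, 3.0) for _ in range(1000)]    # known noise (large)
Z = [random.gauss(0.0, 0.05) for _ in range(1000)]   # unknown noise (small)

# Channel: y = x + s + z; the decoder never sees s.
decoded = [decode(embed(b, s) + s + z) for b, s, z in zip(bits, S, Z)]
accuracy = sum(d == b for d, b in zip(decoded, bits)) / len(bits)
# Nearly all bits survive even though the known noise dwarfs the input power.
```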
For DC-QIM, the conditional distribution P_{X|U,S}(·|u, s) places a unit mass at u - αs, for some constant α. This corresponds to the DC-QIM algorithm, in which the input sequence is a linear combination of the codeword and the known noise sequence. We first find general conditions on P_{U,X|S} under which (3) is met with equality, and then later restrict to DC-QIM.

The fact that the LHS of (3) is at most the RHS can be seen through the following steps:

    I(U; X + S + Z) - I(U; S)
      ≤ I(U; S + X + Z, S) - I(U; S)                       (4)
       = I(U; S + X + Z | S)
      ≤ I(X; S + X + Z | S)                                (5)
       = h(S + X + Z | S) - h(S + X + Z | X, S)
       = h(X + Z | S) - h(X + Z | X)                       (6)
      ≤ h(X + Z) - h(X + Z | X)                            (7)
       = I(X; X + Z)
      ≤ max_{P_{X'}: E[d(X')] ≤ D} I(X'; X + Z).           (8)
Here, (5) follows by the data processing inequality, since U -- (X, S) -- (X + S + Z) form a Markov chain; the differential entropies are well defined since we are assuming that Z has a density; (6) follows since h(S + X + Z | X, S) = h(Z | X, S) = h(Z) = h(Z | X) = h(X + Z | X); and (7) follows since conditioning reduces entropy. The conditions for equality in (4), (5), (7) and (8) are, respectively:

A. The random variables U -- (S + X + Z) -- S form a Markov chain.

B. The random variables X -- (S, U) -- (S + X + Z) form a Markov chain.

C. The random variables S and X + Z are independent.

D. The random variable X has the maximizing distribution on the RHS of (8).

The above conditions are necessary in order for GWDP to have PPE for IID noise sequences. Let us assume that Z is Gaussian (with mean zero and variance N) and d(x) = x^2, and describe a conditional distribution P_{U,X|S} such that these four conditions are satisfied. Let X be Gaussian (with mean zero and variance D) and independent of S, and let U = X + αS, where α = D/(D+N). We note that X - α(X + Z) and X + Z are independent, since they are jointly Gaussian and uncorrelated. Condition A is satisfied since

    I(U; S | S + X + Z)
      = h(X + αS | S + X + Z) - h(X + αS | S + X + Z, S)
      = h(X - α(X + Z) | S + X + Z) - h(X - α(X + Z) | S + X + Z, S)
      = h(X - α(X + Z)) - h(X - α(X + Z))
      = 0,

where the final equality follows since X - α(X + Z) is independent of both S and X + Z. Condition B is satisfied since X is a function of S and U. Condition C is satisfied since S is independent of both X and Z. Condition D is satisfied since a Gaussian distribution is capacity achieving on a power-limited AWGN channel. Thus, if Z is IID Gaussian, S is IID, and d(x) = x^2, then GWDP has PPE.

We now argue that in order for conditions A-D to hold with U = X + αS for some α (the constraint from DC-QIM), the unknown noise Z must be Gaussian.
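The construction above can be checked numerically: with Costa's choice α = D/(D+N), which is the MMSE coefficient for estimating X from X + Z, the estimation error X - α(X + Z) is uncorrelated with the observation X + Z, and for jointly Gaussian variables uncorrelated implies independent. A minimal Monte-Carlo sketch (the parameter values are illustrative only):

```python
import random

random.seed(3)

D, N = 4.0, 1.0       # input power and unknown-noise variance (illustrative)
alpha = D / (D + N)   # Costa's distortion-compensation (MMSE) factor

n = 200000
X = [random.gauss(0.0, D ** 0.5) for _ in range(n)]
Z = [random.gauss(0.0, N ** 0.5) for _ in range(n)]

# E = X - alpha*(X+Z) is the MMSE estimation error of X given X+Z.
E = [x - alpha * (x + z) for x, z in zip(X, Z)]
O = [x + z for x, z in zip(X, Z)]

mean_e = sum(E) / n
mean_o = sum(O) / n
cov = sum((e - mean_e) * (o - mean_o) for e, o in zip(E, O)) / n
# Analytically Cov(E, O) = D - alpha*(D+N) = 0, so the sample value is ~0.
```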
In order to do so, we shall use the following three technical claims, the first of which can be found in [12] (thanks to Randy Berry for pointing out this result), and the last of which is the only unproven part of the argument.

Lemma 2 (Darmois-Skitovič Theorem). If X_1, ..., X_n are independent real random variables and if, for constants (a_1, ..., a_n) and (b_1, ..., b_n), sum_{i=1}^{n} a_i X_i is independent of sum_{i=1}^{n} b_i X_i, then for all i with a_i b_i ≠ 0, X_i has a Gaussian distribution.

Lemma 3. For real random variables X_1, X_2, and X_3, if (X_1, X_2) and X_3 are independent and X_1 and X_2 + X_3 are independent, then X_1 and X_2 are independent.

Proof. We shall use the joint characteristic function, which we denote g_X(r) = E[e^{j r·X}]. Recall that X_1 and X_2 are independent if and only if their joint characteristic function factors, i.e., iff g_{X_1,X_2}(r_1, r_2) = g_{X_1}(r_1) g_{X_2}(r_2). By our assumption that X_1 and X_2 + X_3 are independent, g_{X_1,X_2,X_3}(r_1, r, r) = g_{X_1}(r_1) g_{X_2,X_3}(r, r) for any (r_1, r). Furthermore, since X_2 and X_3 are independent, g_{X_2,X_3}(r, r) = g_{X_2}(r) g_{X_3}(r). By our assumption that (X_1, X_2) and X_3 are independent, g_{X_1,X_2,X_3}(r_1, r_2, r_3) = g_{X_1,X_2}(r_1, r_2) g_{X_3}(r_3) for any (r_1, r_2, r_3), and in particular for r_2 = r_3 = r. Comparing these two factorizations of the joint characteristic function of X_1, X_2, and X_3, we see that g_{X_1,X_2}(r_1, r_2) = g_{X_1}(r_1) g_{X_2}(r_2), and thus X_1 and X_2 are independent.
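Lemma 2 can be illustrated numerically: for IID Gaussians, X_1 + X_2 and X_1 - X_2 are fully independent, while for IID uniform variables they are uncorrelated yet dependent, which a nonzero covariance between their squares detects. A small sketch (sample size and thresholds are arbitrary choices):

```python
import random

random.seed(4)
n = 200000

def cov(xs, ys):
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n

def sum_diff_dependence(draw):
    """Estimate Cov((X1+X2)^2, (X1-X2)^2) for IID X1, X2 drawn by `draw`;
    a nonzero value certifies that the sum and difference are dependent."""
    x1 = [draw() for _ in range(n)]
    x2 = [draw() for _ in range(n)]
    s = [a + b for a, b in zip(x1, x2)]
    d = [a - b for a, b in zip(x1, x2)]
    return cov([v * v for v in s], [v * v for v in d])

gauss = sum_diff_dependence(lambda: random.gauss(0.0, 1.0))
unif = sum_diff_dependence(lambda: random.uniform(-1.0, 1.0))
# For Gaussians the sum and difference are independent, so the value is ~0;
# for uniforms on [-1, 1] it converges to 2*(mu4 - 3*mu2^2) = -4/15 ~ -0.267,
# exposing the dependence even though the sum and difference are uncorrelated.
```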
Conjecture 2. Under some natural conditions on the real random variables X_1, X_2, and X_3, if (X_1, X_2) and X_3 are independent and X_1 -- (X_2 + X_3) -- X_2 form a Markov chain, then X_1 and X_2 are independent.

Remark: We shall invoke this result with X_1 = (1 - α)X - αZ, X_2 = X + Z, and X_3 = S. Clearly, if X_3 is deterministic or, more generally, if X_2 can be determined from X_2 + X_3, then the statement is not true. However, if X_3 takes values on the entire real line, then we believe that the statement is true. We are looking for the smallest set of assumptions under which this conjecture is true.

We now show that Gaussian unknown noise is necessary for the scenario under consideration. First, since (X, S) and Z are independent (by definition) and S and X + Z are independent (condition C), Lemma 3 shows that X and S must be independent. Second, condition A is equivalent to the random variables (1 - α)X - αZ -- (X + S + Z) -- (X + Z) forming a Markov chain for some α (to see this, subtract α(X + S + Z) and X + S + Z from the left-most and right-most random variables, respectively, in condition A). Using Conjecture 2, these two facts imply that X + Z and (1 - α)X - αZ must be independent for some α. Note that α = 0 cannot give equality in (3), since I(X; X + S + Z) < I(X; X + Z) for S independent of (X, Z). Thus, (1 - α)X - αZ and X + Z must be independent for some α ≠ 0, and using Lemma 2, we see that Z must be Gaussian in order for GWDP to have PPE. Furthermore (as long as α ≠ 1), the input X must be Gaussian as well, and the only constraint under which a Gaussian input is optimal for an additive Gaussian noise channel is (up to a constant) d(x) = x^2. Thus, assuming Conjecture 2, we have shown that Conjecture 1 is true.

References

[1] M. H. M. Costa, "Writing on dirty paper," IEEE Trans. Inform. Theory, vol. 29, May.
[2] U. Erez, S. Shamai, and R. Zamir, "Capacity and lattice-strategies for cancelling known interference," in Proc. of the Cornell Summer Workshop on Inform. Theory, Aug.
[3] B. Chen and G. W. Wornell, "Quantization index modulation: A class of provably good methods for digital watermarking and information embedding," IEEE Trans. Inform. Theory, vol. 47, May.
[4] B. Chen, Design and Analysis of Digital Watermarking, Information Embedding, and Data Hiding Systems. PhD thesis, MIT, Cambridge, MA.
[5] W. Yu, A. Sutivong, D. Julian, T. M. Cover, and M. Chiang, "Writing on colored paper," in Proc. of ISIT, Washington, DC.
[6] A. S. Cohen and A. Lapidoth, "The Gaussian watermarking game," to appear in IEEE Trans. Inform. Theory.
[7] G. Caire and S. Shamai, "On achievable rates in a multi-access Gaussian broadcast channel," in Proc. of ISIT, Washington, DC, p. 147.
[8] W. Yu and J. M. Cioffi, "Trellis precoding for the broadcast channel," in Proc. of GlobeCom.
[9] W. Hirt and J. L. Massey, "Capacity of the discrete-time Gaussian channel with intersymbol interference," IEEE Trans. Inform. Theory, vol. 34, no. 3.
[10] S. I. Gel'fand and M. S. Pinsker, "Coding for channel with random parameters," Problems of Control and Inform. Theory, vol. 9, no. 1.
[11] R. J. Barron, B. Chen, and G. W. Wornell, "The duality between information embedding and source coding with side information and some applications," in Proc. of ISIT, Washington, DC.
[12] R. M. Dudley, Uniform Central Limit Theorems. Cambridge University Press.
The Capacity Region of the Gaussian Cognitive Radio Channels at High SNR 1 Stefano Rini, Daniela Tuninetti and Natasha Devroye srini2, danielat, devroye @ece.uic.edu University of Illinois at Chicago Abstract
More informationOn the Secrecy Capacity of the Z-Interference Channel
On the Secrecy Capacity of the Z-Interference Channel Ronit Bustin Tel Aviv University Email: ronitbustin@post.tau.ac.il Mojtaba Vaezi Princeton University Email: mvaezi@princeton.edu Rafael F. Schaefer
More informationBinary Dirty MAC with Common State Information
Binary Dirty MAC with Common State Information Anatoly Khina Email: anatolyk@eng.tau.ac.il Tal Philosof Email: talp@eng.tau.ac.il Ram Zamir Email: zamir@eng.tau.ac.il Uri Erez Email: uri@eng.tau.ac.il
More informationCapacity of the Discrete Memoryless Energy Harvesting Channel with Side Information
204 IEEE International Symposium on Information Theory Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information Omur Ozel, Kaya Tutuncuoglu 2, Sennur Ulukus, and Aylin Yener
More informationInformation Theory Meets Game Theory on The Interference Channel
Information Theory Meets Game Theory on The Interference Channel Randall A. Berry Dept. of EECS Northwestern University e-mail: rberry@eecs.northwestern.edu David N. C. Tse Wireless Foundations University
More informationApproximately achieving the feedback interference channel capacity with point-to-point codes
Approximately achieving the feedback interference channel capacity with point-to-point codes Joyson Sebastian*, Can Karakus*, Suhas Diggavi* Abstract Superposition codes with rate-splitting have been used
More informationTHE dirty-paper (DP) channel, first introduced by
On the obustness of Dirty Paper Coding Anatoly Khina and Uri Erez, Member, IEEE Abstract A dirty-paper channel is considered, where the transmitter knows the interference sequence up to a constant multiplicative
More informationQuantization Index Modulation using the E 8 lattice
1 Quantization Index Modulation using the E 8 lattice Qian Zhang and Nigel Boston Dept of Electrical and Computer Engineering University of Wisconsin Madison 1415 Engineering Drive, Madison, WI 53706 Email:
More informationECE Information theory Final (Fall 2008)
ECE 776 - Information theory Final (Fall 2008) Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power N and average power constraint P (i.e., 1/n X n i=1
More informationOn Multiple User Channels with State Information at the Transmitters
On Multiple User Channels with State Information at the Transmitters Styrmir Sigurjónsson and Young-Han Kim* Information Systems Laboratory Stanford University Stanford, CA 94305, USA Email: {styrmir,yhk}@stanford.edu
More informationAppendix B Information theory from first principles
Appendix B Information theory from first principles This appendix discusses the information theory behind the capacity expressions used in the book. Section 8.3.4 is the only part of the book that supposes
More informationOn the Capacity of Free-Space Optical Intensity Channels
On the Capacity of Free-Space Optical Intensity Channels Amos Lapidoth TH Zurich Zurich, Switzerl mail: lapidoth@isi.ee.ethz.ch Stefan M. Moser National Chiao Tung University NCTU Hsinchu, Taiwan mail:
More informationOn Capacity of the Writing onto Fast Fading Dirt Channel
On Capacity of the Writing onto Fast Fading Dirt Channel Stefano Rini and Shlomo Shamai (Shitz) arxiv:606.06039v [cs.it] 7 Jul 07 Abstract The Writing onto Fast Fading Dirt (WFFD) channel is investigated
More informationLecture 5 Channel Coding over Continuous Channels
Lecture 5 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 14, 2014 1 / 34 I-Hsiang Wang NIT Lecture 5 From
More informationAn Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel
Forty-Ninth Annual Allerton Conference Allerton House, UIUC, Illinois, USA September 8-3, An Achievable Rate Region for the 3-User-Pair Deterministic Interference Channel Invited Paper Bernd Bandemer and
More informationFeedback Capacity of the Compound Channel
Feedback Capacity of the Compound Channel The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters. Citation As Published Publisher Shrader,
More informationOn Common Information and the Encoding of Sources that are Not Successively Refinable
On Common Information and the Encoding of Sources that are Not Successively Refinable Kumar Viswanatha, Emrah Akyol, Tejaswi Nanjundaswamy and Kenneth Rose ECE Department, University of California - Santa
More informationCut-Set Bound and Dependence Balance Bound
Cut-Set Bound and Dependence Balance Bound Lei Xiao lxiao@nd.edu 1 Date: 4 October, 2006 Reading: Elements of information theory by Cover and Thomas [1, Section 14.10], and the paper by Hekstra and Willems
More informationMulticoding Schemes for Interference Channels
Multicoding Schemes for Interference Channels 1 Ritesh Kolte, Ayfer Özgür, Haim Permuter Abstract arxiv:1502.04273v1 [cs.it] 15 Feb 2015 The best known inner bound for the 2-user discrete memoryless interference
More informationCapacity bounds for multiple access-cognitive interference channel
Mirmohseni et al. EURASIP Journal on Wireless Communications and Networking, :5 http://jwcn.eurasipjournals.com/content///5 RESEARCH Open Access Capacity bounds for multiple access-cognitive interference
More informationIncremental Coding over MIMO Channels
Model Rateless SISO MIMO Applications Summary Incremental Coding over MIMO Channels Anatoly Khina, Tel Aviv University Joint work with: Yuval Kochman, MIT Uri Erez, Tel Aviv University Gregory W. Wornell,
More informationA Comparison of Two Achievable Rate Regions for the Interference Channel
A Comparison of Two Achievable Rate Regions for the Interference Channel Hon-Fah Chong, Mehul Motani, and Hari Krishna Garg Electrical & Computer Engineering National University of Singapore Email: {g030596,motani,eleghk}@nus.edu.sg
More informationThe Gallager Converse
The Gallager Converse Abbas El Gamal Director, Information Systems Laboratory Department of Electrical Engineering Stanford University Gallager s 75th Birthday 1 Information Theoretic Limits Establishing
More informationTwo Applications of the Gaussian Poincaré Inequality in the Shannon Theory
Two Applications of the Gaussian Poincaré Inequality in the Shannon Theory Vincent Y. F. Tan (Joint work with Silas L. Fong) National University of Singapore (NUS) 2016 International Zurich Seminar on
More informationInformation Embedding meets Distributed Control
Information Embedding meets Distributed Control Pulkit Grover, Aaron B Wagner and Anant Sahai Abstract We consider the problem of information embedding where the encoder modifies a white Gaussian host
More informationMMSE estimation and lattice encoding/decoding for linear Gaussian channels. Todd P. Coleman /22/02
MMSE estimation and lattice encoding/decoding for linear Gaussian channels Todd P. Coleman 6.454 9/22/02 Background: the AWGN Channel Y = X + N where N N ( 0, σ 2 N ), 1 n ni=1 X 2 i P X. Shannon: capacity
More informationELEC546 Review of Information Theory
ELEC546 Review of Information Theory Vincent Lau 1/1/004 1 Review of Information Theory Entropy: Measure of uncertainty of a random variable X. The entropy of X, H(X), is given by: If X is a discrete random
More informationECE Information theory Final
ECE 776 - Information theory Final Q1 (1 point) We would like to compress a Gaussian source with zero mean and variance 1 We consider two strategies In the first, we quantize with a step size so that the
More informationInterference Channels with Source Cooperation
Interference Channels with Source Cooperation arxiv:95.319v1 [cs.it] 19 May 29 Vinod Prabhakaran and Pramod Viswanath Coordinated Science Laboratory University of Illinois, Urbana-Champaign Urbana, IL
More informationA Formula for the Capacity of the General Gel fand-pinsker Channel
A Formula for the Capacity of the General Gel fand-pinsker Channel Vincent Y. F. Tan Institute for Infocomm Research (I 2 R, A*STAR, Email: tanyfv@i2r.a-star.edu.sg ECE Dept., National University of Singapore
More informationNational University of Singapore Department of Electrical & Computer Engineering. Examination for
National University of Singapore Department of Electrical & Computer Engineering Examination for EE5139R Information Theory for Communication Systems (Semester I, 2014/15) November/December 2014 Time Allowed:
More informationSource-Channel Coding Theorems for the Multiple-Access Relay Channel
Source-Channel Coding Theorems for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora, and Deniz Gündüz Abstract We study reliable transmission of arbitrarily correlated sources over multiple-access
More informationChapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University
Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission
More informationAn Outer Bound for the Gaussian. Interference channel with a relay.
An Outer Bound for the Gaussian Interference Channel with a Relay Ivana Marić Stanford University Stanford, CA ivanam@wsl.stanford.edu Ron Dabora Ben-Gurion University Be er-sheva, Israel ron@ee.bgu.ac.il
More informationError Exponent Region for Gaussian Broadcast Channels
Error Exponent Region for Gaussian Broadcast Channels Lihua Weng, S. Sandeep Pradhan, and Achilleas Anastasopoulos Electrical Engineering and Computer Science Dept. University of Michigan, Ann Arbor, MI
More informationLecture 6 I. CHANNEL CODING. X n (m) P Y X
6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder
More informationQuantization Index Modulation: A Class of Provably Good Methods for Digital Watermarking and Information Embedding
To appear in IEEE Trans. Inform. Theory. Quantization Index Modulation: A Class of Provably Good Methods for Digital Watermarking and Information Embedding Brian Chen and Gregory W. Wornell Submitted June
More informationOn Gaussian MIMO Broadcast Channels with Common and Private Messages
On Gaussian MIMO Broadcast Channels with Common and Private Messages Ersen Ekrem Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 20742 ersen@umd.edu
More informationDispersion of the Gilbert-Elliott Channel
Dispersion of the Gilbert-Elliott Channel Yury Polyanskiy Email: ypolyans@princeton.edu H. Vincent Poor Email: poor@princeton.edu Sergio Verdú Email: verdu@princeton.edu Abstract Channel dispersion plays
More informationJoint Compression and Digital Watermarking: Information-Theoretic Study and Algorithms Development
Joint Compression and Digital Watermarking: Information-Theoretic Study and Algorithms Development by Wei Sun A thesis presented to the University of Waterloo in fulfillment of the thesis requirement for
More informationOn the Capacity of the Multiple Antenna Broadcast Channel
DIMACS Series in Discrete Mathematics and Theoretical Computer Science On the Capacity of the Multiple Antenna Broadcast Channel David Tse and Pramod Viswanath Abstract. The capacity region of the multiple
More informationMultiuser Successive Refinement and Multiple Description Coding
Multiuser Successive Refinement and Multiple Description Coding Chao Tian Laboratory for Information and Communication Systems (LICOS) School of Computer and Communication Sciences EPFL Lausanne Switzerland
More informationLecture 6: Gaussian Channels. Copyright G. Caire (Sample Lectures) 157
Lecture 6: Gaussian Channels Copyright G. Caire (Sample Lectures) 157 Differential entropy (1) Definition 18. The (joint) differential entropy of a continuous random vector X n p X n(x) over R is: Z h(x
More informationJoint Source-Channel Coding for the Multiple-Access Relay Channel
Joint Source-Channel Coding for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora Department of Electrical and Computer Engineering Ben-Gurion University, Israel Email: moriny@bgu.ac.il, ron@ee.bgu.ac.il
More informationJoint Write-Once-Memory and Error-Control Codes
1 Joint Write-Once-Memory and Error-Control Codes Xudong Ma Pattern Technology Lab LLC, U.S.A. Email: xma@ieee.org arxiv:1411.4617v1 [cs.it] 17 ov 2014 Abstract Write-Once-Memory (WOM) is a model for many
More information