On Gaussian Interference Channels with Mixed Gaussian and Discrete Inputs
Alex Dytso, Natasha Devroye and Daniela Tuninetti
University of Illinois at Chicago, Chicago, IL, USA, {odytso, devroye}@uic.edu

Abstract—This paper studies the sum-rate of a class of memoryless, real-valued additive white Gaussian noise interference channels (IC) achievable by treating interference as noise (TIN). We develop and analytically characterize the rates achievable by a new strategy that uses superpositions of Gaussian and discrete random variables as channel inputs. Surprisingly, we demonstrate that TIN is sum-generalized degrees of freedom optimal and can achieve to within an additive gap of O(1), or O(log log(snr)), the symmetric sum-capacity of the classical IC. We also demonstrate connections to other channels, such as the IC with partial codebook knowledge and the block asynchronous IC.

I. INTRODUCTION

Consider a memoryless, real-valued additive white Gaussian noise interference channel with input-output relationship

Y_1^n = h_11 X_1^n + h_12 X_2^n + Z_1^n, (1a)
Y_2^n = h_21 X_1^n + h_22 X_2^n + Z_2^n, (1b)

where X_j^n := (X_j1, ..., X_jn) and Y_j^n := (Y_j1, ..., Y_jn) are the length-n vector inputs and outputs, respectively, for user j in [1:2]; the noise vectors Z_j^n are independent and have i.i.d. zero-mean, unit-variance Gaussian components; and the input X_j^n is subject to the per-block power constraint (1/n) sum_{i=1}^n X_ji^2 <= 1, for j in [1:2]. The input X_j^n, j in [1:2], is a function of the independent message W_j, which is uniformly distributed on [1 : 2^{n R_j}], where R_j is the rate and n the block length. Receiver j in [1:2] wishes to recover W_j from the channel output Y_j^n with arbitrarily small probability of error. Achievable rates and the capacity region are defined in the usual way [1]. In this work we shall focus on the sum-capacity, defined as the largest achievable R_1 + R_2. It was shown by [2] that the sum-capacity of any information stable [3] IC is given by

max_{P_{X_1^n X_2^n} = P_{X_1^n} P_{X_2^n}} (1/n) [ I(X_1^n; Y_1^n) + I(X_2^n; Y_2^n) ]. (2)
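As a quick sanity check on the model in (1), the channel is easy to simulate. The sketch below is our own illustration (the function `simulate_block` and all parameter values are ours, not from the paper): it draws one block of the symmetric channel with i.i.d. Gaussian codewords scaled to meet the unit per-block power constraint, and confirms that the received power is approximately S + I + 1.

```python
import math, random

def simulate_block(n, S, I, seed=0):
    """Simulate one block of the symmetric Gaussian IC in (1):
    Y1 = sqrt(S) X1 + sqrt(I) X2 + Z1 with unit-variance noise and
    i.i.d. Gaussian inputs normalized to unit per-block power."""
    rng = random.Random(seed)
    x1 = [rng.gauss(0, 1) for _ in range(n)]
    x2 = [rng.gauss(0, 1) for _ in range(n)]
    # Enforce the per-block power constraint (1/n) * sum(x^2) = 1 exactly.
    for x in (x1, x2):
        p = math.sqrt(sum(v * v for v in x) / n)
        for i in range(n):
            x[i] /= p
    y1 = [math.sqrt(S) * a + math.sqrt(I) * b + rng.gauss(0, 1)
          for a, b in zip(x1, x2)]
    return x1, x2, y1

x1, x2, y1 = simulate_block(20000, S=100.0, I=10.0)
print(round(sum(v * v for v in x1) / len(x1), 6))  # per-block power = 1.0
print(sum(v * v for v in y1) / len(y1))            # approx S + I + 1 = 111
```

Since the inputs are independent of each other and of the noise, the three power terms simply add at the receiver, which is what the final print verifies empirically.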
For the Gaussian noise channel, the maximization in (2) is further restricted to inputs satisfying the power constraint. For the sake of simplicity we shall focus from now on on the symmetric Gaussian IC only, defined by |h_11|^2 = |h_22|^2 = S >= 0 and |h_12|^2 = |h_21|^2 = I >= 0, and denote its sum-capacity in (2) as C(S, I). In general, little is known about the optimizing distribution in (2), and only some special cases have been solved. In [6] it was shown that i.i.d. Gaussian inputs maximize (2) when the interference is sufficiently weak (the so-called noisy interference regime). In contrast, the authors of [7] show that in general Gaussian inputs do not maximize expressions of the form of (2). The difficulty of the problem in (2) arises from the competitive nature of the problem [8]: for example, say X_2 is i.i.d. Gaussian; taking X_1 to be Gaussian increases I(X_1^n; Y_1^n) but simultaneously decreases I(X_2^n; Y_2^n), as Gaussians are known to be the best inputs for Gaussian point-to-point channels but are also the worst type of noise for a Gaussian input. A lower bound to the sum-capacity can be obtained by considering i.i.d. inputs in (2), thus giving

R^L(S, I) = max_{P_{X_1 X_2} = P_{X_1} P_{X_2}} I(X_1; Y_1) + I(X_2; Y_2), (3)

where the maximization in (3) is further restricted to inputs satisfying the power constraint. In [8], [5] the authors demonstrated the existence of input distributions that outperform i.i.d. Gaussian inputs in R^L(S, I) for certain asynchronous ICs. Both works use local perturbations of an i.i.d. Gaussian input: [5, Lemma 3] considers a fourth-order approximation of the mutual information, while [8] uses perturbations in the direction of Hermite polynomials of order larger than three. In both cases the input distribution is assumed to have a density. For the cases reported in [5], [8], the improvement over i.i.d. Gaussian inputs shows in the decimal digits of the achievable rates; it is hence not clear that these inputs can actually provide substantial rate gains compared to Gaussian inputs.
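For reference, evaluating (3) with i.i.d. Gaussian inputs and treating interference as noise gives each user (1/2) log2(1 + S/(1+I)) bits per real channel use. The snippet below is an illustration we add here (the function name is ours): it shows that at alpha = 1 (S = I) this benchmark saturates at 1 bit total as S grows, the 0-gdof behavior that motivates the search for better inputs.

```python
import math

def tin_gaussian_sum_rate(S, I):
    """Sum-rate of (3) with i.i.d. N(0,1) inputs on the symmetric
    real Gaussian IC: each user gets (1/2) * log2(1 + S / (1 + I))."""
    return 2 * 0.5 * math.log2(1 + S / (1 + I))

# With S = I the sum-rate saturates near 1 bit no matter how large S is ...
print(tin_gaussian_sum_rate(1e6, 1e6))
# ... while without interference it grows like log2(1 + S).
print(tin_gaussian_sum_rate(1e6, 0))
```

Comparing the two printed values makes the "interference swamps the signal" effect concrete: the first stays just below 1, the second is near 20 bits.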
Recently, in [9], for the IC with one oblivious receiver, we showed that a properly chosen discrete input has a somewhat different behavior than continuous inputs: it may yield a good I(X_1; Y_1) while keeping I(X_2; Y_2) relatively unchanged, thus substantially improving the rates compared to using Gaussian inputs in the same achievable region expression. In this work we seek to analytically evaluate the lower bound in (3) for a special class of mixed Gaussian and discrete inputs by generalizing the approach of [9]. Our contributions and paper organization are as follows. In Section II we present the main tools used in our analysis: a lower bound on the mutual information attained by a discrete input on a point-to-point Gaussian noise channel, and tools to compute the cardinality and minimum distances of sum-sets. In Section III we present an achievable sum-rate, valid for any memoryless IC, obtained by evaluating R^L(S, I) with an input that consists of the superposition of discrete and Gaussian components, which we term a mixed input from now on. In Section IV we show that, in terms of generalized degrees of freedom (gdof), mixed inputs in R^L(S, I) achieve
the optimal gdof of C(S, I). In Section V we show that at any finite S, R^L(S, I) with mixed inputs lies to within a constant gap of O(1), or of O(log log(S)), from C(S, I). In Section VI we apply our results to the block asynchronous IC and the oblivious IC. Section VII concludes the paper.

We use the following notation convention: if A is a random variable (r.v.) we denote its support by supp(A); |A| is the cardinality of a set A, or the cardinality of supp(A) if A is an r.v.; the symbol d_min(S) := min_{i != j : s_i, s_j in S} |s_i - s_j| denotes the minimum distance among the points in the set S; with some abuse of notation we also use d_min(A) to denote d_min(supp(A)) for an r.v. A; X ~ N(mu, sigma^2) denotes a real-valued Gaussian r.v. X with mean mu and variance sigma^2; log denotes logarithms in base 2 and ln in base e; we let [x]^+ := max(x, 0) and log^+(x) := [log(x)]^+; floor(x) refers to the largest integer less than or equal to x; PAM(N, d_min) denotes the uniform distribution over a zero-mean, real-valued Pulse Amplitude Modulation (PAM) constellation with N points, minimum distance d_min, and average energy E = d_min^2 (N^2 - 1)/12.

In the following we will compare R^L(S, I) to an upper bound on C(S, I), denoted by R^U(S, I), from the classical IC with full codebook knowledge at all nodes and with block-synchronous communication. R^U(S, I) is therefore an upper bound to the capacity of the block asynchronous IC or the IC with partial codebook knowledge. As mentioned earlier, from [6] we have R^L(S, I) = C(S, I) = log(1 + S/(1+I)) in the noisy interference regime. Moreover,

C(S, I) <= R^U(S, I) := min( O_1, O_2, O_3 ), with
O_1 := log(1 + S) (cut-set bound), (4a)
O_2 := log(1 + I + S/(1 + I)), from [10], (4b)
O_3 := (1/2) log(1 + I + S) + (1/2) log(1 + S/(1 + I)), from [11]. (4c)

II. MAIN TOOLS

In this section we present the main tools used to evaluate the lower bound in (3) under mixed inputs. At the core of our proof is the following new lower bound on the rate achieved by a discrete input on a point-to-point Gaussian noise channel.

Theorem 1 ([9, Theorem 1]). Let X_D be a discrete r.v. with N distinct masses and minimum distance d_min, let Z_G ~ N(0, 1), and let S be a non-negative constant.
Then

I(X_D; sqrt(S) X_D + Z_G) >= Id(N, S d_min^2), (5)

where for N in the naturals and x >= 0 we define Id(N, x) := [ log N - (1/2) log(pi e / 2) - log(1 + N e^{-x/8}) ]^+.

The inequality in (5) nicely captures the effect of the cardinality and the minimum distance of the discrete constellation on the achievable mutual information. We are not the first to analytically consider discrete inputs. For the point-to-point Gaussian noise channel: [12] characterized the optimal discrete input distribution at high and low SNR; [13] found tight high-SNR asymptotics of the mutual information for any discrete constellation whose size N is independent of SNR; in [14] the authors considered the point-to-point, power-constrained Gaussian noise channel and derived lower bounds on the achievable rate when the input is constrained to be PAM. However, in our work we need a firm lower bound that holds for all SNRs and all input distributions. The reason is that we want to carefully select N as a function of SNR, a problem that was left open in [14]. Note that we cannot directly use [14], as the sum of two PAMs is not necessarily another PAM.

In multi-user settings we may wish to select one user's input as Gaussian, another as discrete, or both as mixtures of discrete and Gaussian. To handle such scenarios, based on (5), we need bounds on the cardinality and minimum distance of sums of discrete constellations. If X and Y are two sets, denote the sum-set as X + Y := {x + y : x in X, y in Y}. Tight bounds on the cardinality and the minimum distance of X + Y for general X and Y are an open problem in number theory. The following set of sufficient conditions will play an important role in evaluating (3) under mixed inputs.

Proposition 1. Let h_x, h_y in R be two constants, X ~ PAM(|X|, d_min(X)) and Y ~ PAM(|Y|, d_min(Y)). Then

d_min(h_x X + h_y Y) = min( |h_x| d_min(X), |h_y| d_min(Y) ) and |h_x X + h_y Y| = |X| |Y|

if |Y| |h_y| d_min(Y) <= |h_x| d_min(X), (6)
or if |X| |h_x| d_min(X) <= |h_y| d_min(Y). (7)

Proof: The conditions in (6)-(7) are such that one scaled PAM constellation is completely contained between two points of the other scaled PAM constellation.
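The sufficient conditions in Proposition 1 are easy to check numerically. The sketch below is our own brute-force illustration (helper names and the specific constellation sizes are ours; the condition used is the one stated in (6) above): it builds two PAM constellations, forms the scaled sum-set, and verifies that the cardinalities multiply and that the minimum distance is the smaller of the two scaled spacings.

```python
from itertools import product

def pam(n, d):
    """Zero-mean PAM constellation with n points and spacing d."""
    return [d * (2 * k - (n - 1)) / 2 for k in range(n)]

def sumset(hx, X, hy, Y):
    """The set {hx*x + hy*y}, deduplicated up to floating-point noise."""
    return sorted(set(round(hx * x + hy * y, 12) for x, y in product(X, Y)))

def dmin(S):
    return min(b - a for a, b in zip(S, S[1:]))

X, Y = pam(4, 1.0), pam(3, 1.0)
hx, hy = 10.0, 1.0  # chosen so |Y| * |hy| * d_min(Y) <= |hx| * d_min(X), cf. (6)
S = sumset(hx, X, hy, Y)
print(len(S) == len(X) * len(Y))           # cardinalities multiply: True
print(dmin(S) == min(hx * 1.0, hy * 1.0))  # d_min = min scaled spacing: True
```

Geometrically, the scaling hx = 10 spreads the 4-point constellation out so far that each of its points carries an intact copy of the 3-point constellation, which is exactly the containment argument in the proof.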
When Proposition 1 is not applicable we will use the following proposition, inspired by [15]:

Proposition 2. Let X and Y be two PAM(N, d_min) constellations. Then |h_x X + h_y Y| = |X| |Y| almost everywhere, and

d_min(h_x X + h_y Y) >= min(|h_x|, |h_y|) min(gamma, 1) d_min (8)

for all (h_x, h_y) in E, a subset of R^2 whose complement has Lebesgue measure smaller than 2 gamma, for any gamma > 0.

Proof: The proof can be found in Appendix A.

III. ACHIEVABLE SUM-RATE WITH MIXED INPUTS

We now evaluate the lower bound in (3) with inputs

X_i = sqrt(1 - delta) X_iD + sqrt(delta) X_iG, delta in [0, 1], (9a)
X_iD ~ PAM(N, sqrt(12/(N^2 - 1))), X_iG ~ N(0, 1), (9b)

where the X_ij are independent for i in [1:2], j in {D, G}. The input in (9) has two parameters: the number of points N (since we consider a unitary power constraint here, the minimum distance of the discrete part is determined by N) and the power split
TABLE I: Parameters used in the proofs of Theorems 2 and 3 (eps in (0,1) and gamma > 0 are free parameters; d_min(S) refers to the sum-set S of Proposition 3).

regime                          | N                                        | delta     | d_min^2(S)                     | d^L(alpha)                                                     | additive gap
alpha in [0, 1/2]               | not applicable                           | 1         | not applicable                 | 2(1 - alpha)                                                   | 1 bit [10]
alpha in (1/2, 2/3]             | floor( ((1+I)^2/(1+S))^{(1-eps)/2} )     | 1/(1+I)   | as in Proposition 1            | arbitrarily close to 2 alpha (as eps -> 0)                     | eq. (15)
alpha in (2/3, 2), I < S(1+S)   | floor( (1+S)^{alpha/4} )                 | 1/(1+I)   | as in Proposition 2, via min(1, gamma) | 2 max(1 - alpha/2, alpha/2), except on an outage set of measure gamma | eq. (16)
I >= S(1+S) (very strong)       | floor( (1+S)^{(1-eps)/2} )               | 0         | 3 S / N^2, approx 3 S^eps      | 2(1 - eps)                                                     | eq. (14)

delta. Careful choices of these parameters will lead to the desired results in the different regimes. The inputs in (9) are symmetric (i.e., same parameters for both users) since we restrict attention to a symmetric IC. Extensions to non-symmetric ICs are straightforward but more computationally involved.

Proposition 3. For the input in (9), R^L(S, I) may be further lower bounded as

R^L(S, I) >= 2 Id(|S|, d_min^2(S)) + log(1 + S delta / (1 + I delta)) - 2 min( log N, (1/2) log(1 + I (1 - delta) / (1 + I delta)) ),

where the sum-set S is defined as

S := { ( sqrt(S (1-delta)) x_1D + sqrt(I (1-delta)) x_2D ) / sqrt(1 + S delta + I delta) : x_1D in supp(X_1D), x_2D in supp(X_2D) }.

Proof: Due to the symmetry of the problem, I(X_1; Y_1) = I(X_2; Y_2) = R^L(S, I)/2; hence, for Z ~ N(0, 1), we have

I(X_1; Y_1) = h( sqrt(S) X_1 + sqrt(I) X_2 + Z ) - h( sqrt(I) X_2 + Z ).

The first differential entropy is lower bounded by applying Theorem 1 to the received sum-set S of the two discrete components, treating the Gaussian components and the noise (of total power 1 + S delta + I delta) as noise, giving h( sqrt(S) X_1 + sqrt(I) X_2 + Z ) - h(Z) >= Id(|S|, d_min^2(S)) + (1/2) log(1 + S delta + I delta). The second differential entropy is upper bounded as h( sqrt(I) X_2 + Z ) - h(Z) = I( X_2D, X_2G ; sqrt(I (1-delta)) X_2D + sqrt(I delta) X_2G + Z ) <= min( log N, (1/2) log(1 + I (1-delta)/(1 + I delta)) ) + (1/2) log(1 + I delta), where the mutual information upper bound follows since a Gaussian maximizes the differential entropy for a given second-moment constraint and a uniform input maximizes the entropy of a discrete random variable [1]. Combining the two bounds gives the claim.

IV. HIGH SNR PERFORMANCE

In this section we show that the lower bound in Proposition 3 attains the same generalized degrees of freedom (gdof) as the upper bound in (4), where, for I = S^alpha, the gdof is defined as

d^L(alpha) := lim_{S -> +infinity} R^L(S, S^alpha) / ( (1/2) log(1 + S) ). (10)

Similarly, define d^U(alpha) by replacing R^L with R^U in (10). With the upper bound in (4) we have [10]

d^U(alpha) = min( 2, 2 max(1 - alpha, alpha), max(1, alpha) + [1 - alpha]^+ ). (11)

The main result of this section is:

Theorem 2. A mixed input as in Proposition 3 achieves the gdof summarized in Table I.

Proof: The parameters of the input in (9) are chosen as in Table I. When alpha <= 1/2, Gaussian inputs are optimal to within 1 bit [10]; hence we set delta = 1.
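As a numerical aside (our own sketch, assuming the reconstructed form of the W-curve in (11), d^U(alpha) = min{2, 2 max(1-a, a), max(1, a) + [1-a]^+}; it is not code from the paper), the upper-bound gdof can be tabulated directly and reproduces the familiar branches 2(1-a), 2a, 2-a, a, 2:

```python
def gdof_upper(alpha):
    """Sum gdof upper bound d^U(alpha), normalized by (1/2) log(1+S),
    following the three sum-rate bounds in (4)."""
    return min(2.0,
               2.0 * max(1.0 - alpha, alpha),
               max(1.0, alpha) + max(1.0 - alpha, 0.0))

# The five branches of the W-shaped curve:
print(gdof_upper(0.25))  # very weak:       2(1 - a) = 1.5
print(gdof_upper(0.6))   # moderately weak: 2a       = 1.2
print(gdof_upper(0.75))  # weak:            2 - a    = 1.25
print(gdof_upper(1.5))   # strong:          a        = 1.5
print(gdof_upper(3.0))   # very strong:     2
```

Each branch of the min corresponds to one of the bounds O_1, O_2, O_3 in (4); Table I lists which input parameters let the mixed-input lower bound track each branch.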
For alpha in (1/2, 2/3] we choose delta such that the interference from the Gaussian portion of the input, which is treated as noise, is below the level of the noise [10], by setting delta = 1/(1+I). For alpha >= 1 we choose to send only discrete inputs by setting delta = 0. In very strong interference, I >= S(1+S), we set delta = 0 and N = floor( (1+S)^{(1-eps)/2} ) for some eps in (0, 1), where eps < 1 ensures a non-vanishing rate as S increases and eps > 0 ensures that the minimum distance increases as S increases. With this delta and N, the condition in (6) reads N <= sqrt(I/S), which is readily verified in this regime. Therefore |S| = N^2 and d_min^2(S) = 3 S / N^2, approximately 3 S^eps, from Proposition 1. By plugging these values into Proposition 3, an achievable sum-rate is

R^L >= 2 [ Id(N^2, 3 S / N^2) - log N ] >= 2 [ log N - (1/2) log(pi e / 2) - log( 1 + N^2 e^{-3 S / (8 N^2)} ) ]. (12)

Finally, in the high-SNR limit we obtain lim_{S -> infinity} (12) / (0.5 log(1+S)) = 2(1 - eps).

The moderately weak interference regime, alpha in (1/2, 2/3], follows similarly: now the condition in Proposition 1 is N <= sqrt(S/I), and the sum-rate is given by (13) (see next page), because delta = 1/(1+I). For the remaining regime, alpha in (2/3, 2), we set N = floor( (1+S)^{alpha/4} ) and delta = 1/(1+I); therefore, from Proposition 2, |S| = N^2 and d_min(S) is as in Table I almost surely for all channel parameters. Here d_min(S) increases in S (i.e., there is no need for the eps parameter as in the previous regimes), but the result holds for all channel parameters only up to an outage set of measure less than gamma, where gamma > 0 also affects the minimum distance. By plugging these values into Proposition 3,
we can further lower bound the achievable sum-rate as

R^L >= 2 [ log N - (1/2) log(pi e / 2) - log( 1 + N^2 e^{-d_min^2(S)/8} ) ] + log( 1 + S delta / (1 + I delta) ). (13)

Finally, in the high-SNR limit we obtain lim_{S -> infinity} (13) / (0.5 log(1+S)) = alpha + 2 [1 - alpha]^+ = 2 max(1 - alpha/2, alpha/2).

Theorem 2 shows that a mixed input can achieve the optimal (same as the classical IC) gdof: exactly in very weak interference, alpha in [0, 1/2]; arbitrarily close in moderately weak interference, alpha in (1/2, 2/3], and in very strong interference, I >= S(1+S); and up to an outage set (similar to [15]) in alpha in (2/3, 2). This result, especially for the strong and very strong interference regimes, is unexpected. In the classical IC, which we use as an upper bound, we know that i.i.d. Gaussian inputs with joint decoding of the intended and interfering messages are optimal in strong and very strong interference. The lower bound R^L(S, I), however, seems to imply TIN; if this interpretation of the mutual information expression of R^L(S, I) were correct, then we would conclude that 0 gdof are achievable when both users are active, because the strong interference swamps the useful signal at each receiver and, since it is Gaussian, it is the worst kind of noise; indeed, i.i.d. Gaussian inputs give R^L(S, I) = log(1 + S/(1+I)), which corresponds to 0 gdof for I >= S. Our work shows that selecting non-Gaussian inputs with more structure, and exploiting knowledge of this structure, leads to unbounded gdof gains even when interference is treated as noise.

V. FINITE SNR PERFORMANCE

In this section we refine the gdof result of Theorem 2 by proving an additive gap between the sum-capacity upper bound in (4) and the sum-rate achievable with a mixed input in Proposition 3. This improves the result of Section IV and demonstrates that the classical IC gdof are exactly achievable.

Theorem 3. A mixed input as in Proposition 3 achieves the gap summarized in Table I.

Proof: We analyze the different regimes separately.
We only consider the case S >= 1 (otherwise a trivial gap of 1 bit can be achieved by silencing both users) and the case I >= 1 (otherwise a trivial gap of 1 bit can be achieved by using Gaussian inputs and treating interference as noise, i.e., delta = 1 in (9)).

Very strong interference, I >= S(1+S): in addition to setting the parameters as in Table I, we further set eps = log( (8/3) ln(1+S) ) / log(1+S), so that log(1 + N^2 e^{-3 S^eps / 8}) <= 1 bit. The gap is the difference between R^U in (4a) and R^L in (12) and is bounded as

R^U - R^L <= log(1+S) - 2 log N + log(pi e / 2) + 2
          <=(a) eps log(1+S) + log(pi e / 2) + 4
          <=(b) [ log( (8/3) ln(1+S) ) ]^+ + log(pi e / 2) + 4, (14)

where (a) follows since floor(x) >= x/2 for x >= 1, and (b) follows from the definition of eps. The gap in (14) is O(log log(S)).

Moderately weak interference regime, alpha in (1/2, 2/3]: the gap analysis is similar to the very strong interference regime, but we now use the upper bound in (4b). We first notice that the difference between the upper bound in (4b) and the last term in (13) can be upper bounded as log(1 + I + S/(1+I)) - log(1 + S delta/(1 + I delta)) <= log(1+I) + 1; therefore the gap analysis is the same as that leading to (14) if we replace S with S/(1+I) and we add an extra bit. We therefore conclude that in this regime the gap is

R^U - R^L <= [ log( (8/3) ln(1 + S/(1+I)) ) ]^+ + log(pi e / 2) + 5
          <= [ log( (8/3) ln(1+S) ) ]^+ + log(pi e / 2) + 5, (15)

where the last inequality follows since I <= S in this regime.

Very weak interference, alpha in [0, 1/2]: Gaussian inputs and treating interference as noise are optimal to within 1 bit [10].

Remaining regime, alpha in (2/3, 2): the difference between the upper bound in (4c) and the last term in (13) satisfies

(1/2) log(1 + I + S) + (1/2) log(1 + S/(1+I)) - log(1 + S delta/(1 + I delta)) <= (1/2) log(1 + I + S) - (1/2) log(1 + S/(1+I)) + 1;

thus the gap with N = floor( (1+S)^{alpha/4} ) is

R^U - R^L <= log(pi e / 2) + 4 + log( 1 + N^2 e^{-d_min^2(S)/8} ),

where the last term is not bounded for alpha = 2/3 or alpha = 2, because at these points d_min(S) does not increase with S. We remedy this by slightly changing the number of points of the discrete part of the input to N = floor( (1+S)^{alpha/4 - eps'} ), and we pick eps' so as to ensure that log(1 + N^2 e^{-d_min^2(S)/8}) <= 1/2. With this we can show

R^U - R^L <= [ log( (8 / (3 min(1, gamma)^2)) ln(1+S) ) ]^+ + log(pi e / 2) + 5. (16)
We note, however, that in strong interference for S large enough, and similarly in moderately weak interference (with the threshold depending on ln(1/min(1, gamma))), we have log(1 + N^2 e^{-d_min^2(S)/8}) <= 1/2; in that case eps' is not needed and the gap is R^U - R^L <= 5 + log(pi e / 2).

VI. APPLICATIONS OF OUR RESULTS

We have evaluated a very simple, generally applicable lower bound to the capacity of any IC with a mixture of discrete and Gaussian inputs, and shown that, through a careful choice of the discrete input (where the number of points may depend on the SNR), such an input may achieve to within an approximately constant gap of the capacity of a classical IC. This result is of interest in several channels, as outlined next.

In [5] the authors studied the block asynchronous IC, whose sum-capacity we denote by C_{A-IC}. It was shown in [5] that R^L(S, I) <= C_{A-IC} <= R^U(S, I). Hence our results apply directly and imply that, using mixed inputs of the type proposed here, we may approximately (in the sense of Theorem 3) achieve the capacity of the classical synchronous IC.

Another application is to the IC with oblivious receivers, in which both receivers lack knowledge of the codebook of the interfering transmitter. Lack of codebook knowledge of the interfering signal prevents the decoders from using joint decoding of the messages and successive interference cancellation, which are known to be capacity achieving in a classical IC where all codebooks are available at all nodes. Denote the sum-capacity of the oblivious IC by C_{O-IC}. It was demonstrated that R^L(S, I) = C_{O-IC}, i.e., that (3) is indeed the capacity of the oblivious IC; however, the input distribution that maximizes R^L(S, I) was not found. Our results again directly apply and demonstrate the surprising fact that, even if the receivers do not know the interfering codebooks and cannot perform joint decoding of the messages, by using a mixture of discrete and Gaussian inputs one can overcome this lack of codebook knowledge and approximately achieve the capacity of the IC where all codebooks are known.

VII.
CONCLUSION

We studied the performance of mixed inputs on the Gaussian IC. Its application to oblivious and asynchronous ICs somewhat surprisingly implies that much less global coordination between nodes is needed than one might expect: synchronism and codebook knowledge might not be critical if one is happy with approximate capacity results. We showed that TIN is gdof optimal and within O(log log(S)) of capacity.

APPENDIX A

Let S = h_x X + h_y Y. We want to find d_min(S) := min_{i != j} { |s_i - s_j| : s_i, s_j in S }, with s_i - s_j = h_x (x_i - x_j) + h_y (y_i - y_j). Since X and Y are PAM, we can write x_i - x_j = d_min(X) (z_xi - z_xj) and y_i - y_j = d_min(Y) (z_yi - z_yj) for integers z.

Case x_i = x_j and y_i != y_j (or x_i != x_j and y_i = y_j): |s_i - s_j| = |h_y| d_min(Y) |z_yi - z_yj| >= |h_y| d_min(Y) (or, respectively, |s_i - s_j| >= |h_x| d_min(X)).

Case x_i != x_j and y_i != y_j: |s_i - s_j| = |h_y| d_min(Y) | h (z_xi - z_xj) - (z_yj - z_yi) |, where h := h_x d_min(X) / (h_y d_min(Y)). Next we lower bound |h z_1 - z_2| for the non-zero integers z_1 := z_xi - z_xj and z_2 := z_yj - z_yi. Let E := { h : |h z_1 - z_2| >= gamma } for some gamma > 0. The Lebesgue measure of E^c (this will be our outage set) is m(E^c) = m{ h : (z_2 - gamma)/z_1 < h < (z_2 + gamma)/z_1 } = 2 gamma / |z_1| <= 2 gamma. Hence on the set E we have the lower bound |s_i - s_j| >= |h_y| d_min(Y) gamma; by symmetry, |s_i - s_j| >= |h_x| d_min(X) gamma.

Putting all the cases together, we have d_min(S) >= min( |h_y| d_min(Y), |h_x| d_min(X) ) min(gamma, 1).

Now assume |X| = |Y| = N and d_min(Y) = d_min(X). If |S| < N^2, then there exist s_i = h_x x_i + h_y y_i and s_j = h_x x_j + h_y y_j such that s_i = s_j for some i != j, which in turn implies that there exist (x_i, y_i) != (x_j, y_j) such that h_x / h_y = (y_j - y_i)/(x_i - x_j) = (z_yj - z_yi)/(z_xi - z_xj), i.e., h_x / h_y must be rational in order for |S| < N^2. Since the rationals have measure zero, this implies that |S| = N^2 almost everywhere.

REFERENCES

[1] A. El Gamal and Y.-H. Kim, Network Information Theory. Cambridge University Press, 2011.
[2] R. Ahlswede, "Multi-way communication channels," in Proc. IEEE Int. Symp. Inf. Theory, Tsahkadsor, Mar. 1973.
[3] R. Dobrushin, "General formulation of Shannon's main theorem in information theory," American Mathematical Society Translations.
[4] O. Simeone, E. Erkip, and S.
Shamai, "On codebook information for interference relay channels with out-of-band relaying," IEEE Trans. Inf. Theory, vol. 57, no. 5, May 2011.
[5] E. Calvo, J. Fonollosa, and J. Vidal, "On the totally asynchronous interference channel with single-user receivers," in Proc. IEEE Int. Symp. Inf. Theory, 2009.
[6] X. Shang, G. Kramer, and B. Chen, "A new outer bound and the noisy-interference sum-rate capacity for Gaussian interference channels," IEEE Trans. Inf. Theory, vol. 55, no. 2, 2009.
[7] R. Cheng and S. Verdu, "On limiting characterizations of memoryless multiuser capacity regions," IEEE Trans. Inf. Theory, vol. 39, 1993.
[8] E. Abbe and L. Zheng, "A coordinate system for Gaussian networks," IEEE Trans. Inf. Theory, vol. 58, no. 2, 2012.
[9] A. Dytso, D. Tuninetti, and N. Devroye, "On discrete alphabets for the two-user Gaussian interference channel with one receiver lacking knowledge of the interfering codebook."
[10] R. Etkin, D. Tse, and H. Wang, "Gaussian interference channel capacity to within one bit," IEEE Trans. Inf. Theory, vol. 54, no. 12, Dec. 2008.
[11] G. Kramer, "Outer bounds on the capacity of Gaussian interference channels," IEEE Trans. Inf. Theory, vol. 50, no. 3, Mar. 2004.
[12] Y. Wu and S. Verdu, "The impact of constellation cardinality on Gaussian channel capacity," in Proc. Allerton Conf. Commun., Control and Comp., 2010.
[13] A. Alvarado, F. Brannstrom, E. Agrell, and T. Koch, "High-SNR asymptotics of mutual information for discrete constellations with applications to BICM," IEEE Trans. Inf. Theory, vol. 60, 2014.
[14] L. Ozarow and A. Wyner, "On the capacity of the Gaussian channel with a finite number of input levels," IEEE Trans. Inf. Theory, vol. 36, no. 6, Nov. 1990.
[15] O. Ordentlich, U. Erez, and B. Nazer, "The approximate sum capacity of the symmetric Gaussian K-user interference channel," in Proc. IEEE Int. Symp. Inf. Theory, 2012.
More informationDecoupling of CDMA Multiuser Detection via the Replica Method
Decoupling of CDMA Multiuser Detection via the Replica Method Dongning Guo and Sergio Verdú Dept. of Electrical Engineering Princeton University Princeton, NJ 08544, USA email: {dguo,verdu}@princeton.edu
More informationECE Information theory Final (Fall 2008)
ECE 776 - Information theory Final (Fall 2008) Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power N and average power constraint P (i.e., 1/n X n i=1
More informationLecture 5 Channel Coding over Continuous Channels
Lecture 5 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 14, 2014 1 / 34 I-Hsiang Wang NIT Lecture 5 From
More information5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010
5958 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 12, DECEMBER 2010 Capacity Theorems for Discrete, Finite-State Broadcast Channels With Feedback and Unidirectional Receiver Cooperation Ron Dabora
More informationApproximate Capacity of Fast Fading Interference Channels with no CSIT
Approximate Capacity of Fast Fading Interference Channels with no CSIT Joyson Sebastian, Can Karakus, Suhas Diggavi Abstract We develop a characterization of fading models, which assigns a number called
More informationDegrees-of-Freedom Robust Transmission for the K-user Distributed Broadcast Channel
/33 Degrees-of-Freedom Robust Transmission for the K-user Distributed Broadcast Channel Presented by Paul de Kerret Joint work with Antonio Bazco, Nicolas Gresset, and David Gesbert ESIT 2017 in Madrid,
More informationInterference Channels with Source Cooperation
Interference Channels with Source Cooperation arxiv:95.319v1 [cs.it] 19 May 29 Vinod Prabhakaran and Pramod Viswanath Coordinated Science Laboratory University of Illinois, Urbana-Champaign Urbana, IL
More informationOn the Capacity of the Interference Channel with a Relay
On the Capacity of the Interference Channel with a Relay Ivana Marić, Ron Dabora and Andrea Goldsmith Stanford University, Stanford, CA {ivanam,ron,andrea}@wsl.stanford.edu Abstract Capacity gains due
More informationApproximate Ergodic Capacity of a Class of Fading Networks
Approximate Ergodic Capacity of a Class of Fading Networks Sang-Woon Jeon, Chien-Yi Wang, and Michael Gastpar School of Computer and Communication Sciences EPFL Lausanne, Switzerland {sangwoon.jeon, chien-yi.wang,
More informationOn Compound Channels With Side Information at the Transmitter
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 52, NO 4, APRIL 2006 1745 On Compound Channels With Side Information at the Transmitter Patrick Mitran, Student Member, IEEE, Natasha Devroye, Student Member,
More informationOn the Capacity and Degrees of Freedom Regions of MIMO Interference Channels with Limited Receiver Cooperation
On the Capacity and Degrees of Freedom Regions of MIMO Interference Channels with Limited Receiver Cooperation Mehdi Ashraphijuo, Vaneet Aggarwal and Xiaodong Wang 1 arxiv:1308.3310v1 [cs.it] 15 Aug 2013
More informationA Summary of Multiple Access Channels
A Summary of Multiple Access Channels Wenyi Zhang February 24, 2003 Abstract In this summary we attempt to present a brief overview of the classical results on the information-theoretic aspects of multiple
More informationDiversity-Multiplexing Tradeoff of the Two-User Interference Channel
Diversity-Multiplexing Tradeoff of the Two-User Interference Channel arxiv:0905.0385v2 [cs.it] 8 Sep 2009 Adnan Raja and Pramod Viswanath April 9, 2018 Abstract Diversity-Multiplexing tradeoff DMT) is
More informationA Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels
A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels Mehdi Mohseni Department of Electrical Engineering Stanford University Stanford, CA 94305, USA Email: mmohseni@stanford.edu
More informationInformation Theory. Lecture 10. Network Information Theory (CT15); a focus on channel capacity results
Information Theory Lecture 10 Network Information Theory (CT15); a focus on channel capacity results The (two-user) multiple access channel (15.3) The (two-user) broadcast channel (15.6) The relay channel
More informationError Exponent Region for Gaussian Broadcast Channels
Error Exponent Region for Gaussian Broadcast Channels Lihua Weng, S. Sandeep Pradhan, and Achilleas Anastasopoulos Electrical Engineering and Computer Science Dept. University of Michigan, Ann Arbor, MI
More informationarxiv: v1 [cs.it] 4 Jun 2018
State-Dependent Interference Channel with Correlated States 1 Yunhao Sun, 2 Ruchen Duan, 3 Yingbin Liang, 4 Shlomo Shamai (Shitz) 5 Abstract arxiv:180600937v1 [csit] 4 Jun 2018 This paper investigates
More informationConstellation Shaping for Communication Channels with Quantized Outputs
Constellation Shaping for Communication Channels with Quantized Outputs Chandana Nannapaneni, Matthew C. Valenti, and Xingyu Xiang Lane Department of Computer Science and Electrical Engineering West Virginia
More informationA Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying
A Half-Duplex Cooperative Scheme with Partial Decode-Forward Relaying Ahmad Abu Al Haija, and Mai Vu, Department of Electrical and Computer Engineering McGill University Montreal, QC H3A A7 Emails: ahmadabualhaija@mailmcgillca,
More informationInterference Decoding for Deterministic Channels Bernd Bandemer, Student Member, IEEE, and Abbas El Gamal, Fellow, IEEE
2966 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 5, MAY 2011 Interference Decoding for Deterministic Channels Bernd Bemer, Student Member, IEEE, Abbas El Gamal, Fellow, IEEE Abstract An inner
More informationLecture 1: The Multiple Access Channel. Copyright G. Caire 12
Lecture 1: The Multiple Access Channel Copyright G. Caire 12 Outline Two-user MAC. The Gaussian case. The K-user case. Polymatroid structure and resource allocation problems. Copyright G. Caire 13 Two-user
More informationMultiuser Successive Refinement and Multiple Description Coding
Multiuser Successive Refinement and Multiple Description Coding Chao Tian Laboratory for Information and Communication Systems (LICOS) School of Computer and Communication Sciences EPFL Lausanne Switzerland
More informationOn the Required Accuracy of Transmitter Channel State Information in Multiple Antenna Broadcast Channels
On the Required Accuracy of Transmitter Channel State Information in Multiple Antenna Broadcast Channels Giuseppe Caire University of Southern California Los Angeles, CA, USA Email: caire@usc.edu Nihar
More informationK User Interference Channel with Backhaul
1 K User Interference Channel with Backhaul Cooperation: DoF vs. Backhaul Load Trade Off Borna Kananian,, Mohammad A. Maddah-Ali,, Babak H. Khalaj, Department of Electrical Engineering, Sharif University
More informationIN this paper, we consider the capacity of sticky channels, a
72 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 54, NO. 1, JANUARY 2008 Capacity Bounds for Sticky Channels Michael Mitzenmacher, Member, IEEE Abstract The capacity of sticky channels, a subclass of insertion
More informationSoft Covering with High Probability
Soft Covering with High Probability Paul Cuff Princeton University arxiv:605.06396v [cs.it] 20 May 206 Abstract Wyner s soft-covering lemma is the central analysis step for achievability proofs of information
More informationOn Gaussian MIMO Broadcast Channels with Common and Private Messages
On Gaussian MIMO Broadcast Channels with Common and Private Messages Ersen Ekrem Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 20742 ersen@umd.edu
More informationA Novel Asynchronous Communication Paradigm: Detection, Isolation, and Coding
A Novel Asynchronous Communication Paradigm: Detection, Isolation, and Coding The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters. Citation
More informationFeedback Capacity of a Class of Symmetric Finite-State Markov Channels
Feedback Capacity of a Class of Symmetric Finite-State Markov Channels Nevroz Şen, Fady Alajaji and Serdar Yüksel Department of Mathematics and Statistics Queen s University Kingston, ON K7L 3N6, Canada
More informationStrong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach
Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach Silas Fong (Joint work with Vincent Tan) Department of Electrical & Computer Engineering National University
More informationSecret Key Agreement Using Asymmetry in Channel State Knowledge
Secret Key Agreement Using Asymmetry in Channel State Knowledge Ashish Khisti Deutsche Telekom Inc. R&D Lab USA Los Altos, CA, 94040 Email: ashish.khisti@telekom.com Suhas Diggavi LICOS, EFL Lausanne,
More informationInformation Dimension
Information Dimension Mina Karzand Massachusetts Institute of Technology November 16, 2011 1 / 26 2 / 26 Let X would be a real-valued random variable. For m N, the m point uniform quantized version of
More informationCapacity of Memoryless Channels and Block-Fading Channels With Designable Cardinality-Constrained Channel State Feedback
2038 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 50, NO. 9, SEPTEMBER 2004 Capacity of Memoryless Channels and Block-Fading Channels With Designable Cardinality-Constrained Channel State Feedback Vincent
More informationCognitive Multiple Access Networks
Cognitive Multiple Access Networks Natasha Devroye Email: ndevroye@deas.harvard.edu Patrick Mitran Email: mitran@deas.harvard.edu Vahid Tarokh Email: vahid@deas.harvard.edu Abstract A cognitive radio can
More informationBounds and Capacity Results for the Cognitive Z-interference Channel
Bounds and Capacity Results for the Cognitive Z-interference Channel Nan Liu nanliu@stanford.edu Ivana Marić ivanam@wsl.stanford.edu Andrea J. Goldsmith andrea@wsl.stanford.edu Shlomo Shamai (Shitz) Technion
More informationLecture 4 Noisy Channel Coding
Lecture 4 Noisy Channel Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 9, 2015 1 / 56 I-Hsiang Wang IT Lecture 4 The Channel Coding Problem
More informationMinimum Mean Squared Error Interference Alignment
Minimum Mean Squared Error Interference Alignment David A. Schmidt, Changxin Shi, Randall A. Berry, Michael L. Honig and Wolfgang Utschick Associate Institute for Signal Processing Technische Universität
More informationOn the Feedback Capacity of Stationary Gaussian Channels
On the Feedback Capacity of Stationary Gaussian Channels Young-Han Kim Information Systems Laboratory Stanford University Stanford, CA 94305-950 yhk@stanford.edu Abstract The capacity of stationary additive
More informationAppendix B Information theory from first principles
Appendix B Information theory from first principles This appendix discusses the information theory behind the capacity expressions used in the book. Section 8.3.4 is the only part of the book that supposes
More informationOn the Shamai-Laroia Approximation for the Information Rate of the ISI Channel
On the Shamai-Laroia Approximation for the Information Rate of the ISI Channel Yair Carmon and Shlomo Shamai (Shitz) Department of Electrical Engineering, Technion - Israel Institute of Technology 2014
More informationA digital interface for Gaussian relay and interference networks: Lifting codes from the discrete superposition model
A digital interface for Gaussian relay and interference networks: Lifting codes from the discrete superposition model M. Anand, Student Member, IEEE, and P. R. Kumar, Fellow, IEEE Abstract For every Gaussian
More informationA New Metaconverse and Outer Region for Finite-Blocklength MACs
A New Metaconverse Outer Region for Finite-Blocklength MACs Pierre Moulin Dept of Electrical Computer Engineering University of Illinois at Urbana-Champaign Urbana, IL 680 Abstract An outer rate region
More informationBounds on Mutual Information for Simple Codes Using Information Combining
ACCEPTED FOR PUBLICATION IN ANNALS OF TELECOMM., SPECIAL ISSUE 3RD INT. SYMP. TURBO CODES, 003. FINAL VERSION, AUGUST 004. Bounds on Mutual Information for Simple Codes Using Information Combining Ingmar
More informationLecture 6 I. CHANNEL CODING. X n (m) P Y X
6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder
More informationCapacity of a Two-way Function Multicast Channel
Capacity of a Two-way Function Multicast Channel 1 Seiyun Shin, Student Member, IEEE and Changho Suh, Member, IEEE Abstract We explore the role of interaction for the problem of reliable computation over
More informationMMSE Dimension. snr. 1 We use the following asymptotic notation: f(x) = O (g(x)) if and only
MMSE Dimension Yihong Wu Department of Electrical Engineering Princeton University Princeton, NJ 08544, USA Email: yihongwu@princeton.edu Sergio Verdú Department of Electrical Engineering Princeton University
More informationOptimal Power Allocation over Parallel Gaussian Broadcast Channels
Optimal Power Allocation over Parallel Gaussian Broadcast Channels David N.C. Tse Dept. of Electrical Engineering and Computer Sciences University of California at Berkeley email: dtse@eecs.berkeley.edu
More informationMinimum Repair Bandwidth for Exact Regeneration in Distributed Storage
1 Minimum Repair andwidth for Exact Regeneration in Distributed Storage Vivec R Cadambe, Syed A Jafar, Hamed Malei Electrical Engineering and Computer Science University of California Irvine, Irvine, California,
More informationEE 4TM4: Digital Communications II Scalar Gaussian Channel
EE 4TM4: Digital Communications II Scalar Gaussian Channel I. DIFFERENTIAL ENTROPY Let X be a continuous random variable with probability density function (pdf) f(x) (in short X f(x)). The differential
More informationAn Alternative Proof of Channel Polarization for Channels with Arbitrary Input Alphabets
An Alternative Proof of Channel Polarization for Channels with Arbitrary Input Alphabets Jing Guo University of Cambridge jg582@cam.ac.uk Jossy Sayir University of Cambridge j.sayir@ieee.org Minghai Qin
More informationThe Capacity Region for Multi-source Multi-sink Network Coding
The Capacity Region for Multi-source Multi-sink Network Coding Xijin Yan Dept. of Electrical Eng. - Systems University of Southern California Los Angeles, CA, U.S.A. xyan@usc.edu Raymond W. Yeung Dept.
More informationLecture 6 Channel Coding over Continuous Channels
Lecture 6 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 9, 015 1 / 59 I-Hsiang Wang IT Lecture 6 We have
More informationAn Achievable Error Exponent for the Mismatched Multiple-Access Channel
An Achievable Error Exponent for the Mismatched Multiple-Access Channel Jonathan Scarlett University of Cambridge jms265@camacuk Albert Guillén i Fàbregas ICREA & Universitat Pompeu Fabra University of
More informationMismatched Multi-letter Successive Decoding for the Multiple-Access Channel
Mismatched Multi-letter Successive Decoding for the Multiple-Access Channel Jonathan Scarlett University of Cambridge jms265@cam.ac.uk Alfonso Martinez Universitat Pompeu Fabra alfonso.martinez@ieee.org
More informationANALYSIS OF A PARTIAL DECORRELATOR IN A MULTI-CELL DS/CDMA SYSTEM
ANAYSIS OF A PARTIA DECORREATOR IN A MUTI-CE DS/CDMA SYSTEM Mohammad Saquib ECE Department, SU Baton Rouge, A 70803-590 e-mail: saquib@winlab.rutgers.edu Roy Yates WINAB, Rutgers University Piscataway
More informationGaussian Relay Channel Capacity to Within a Fixed Number of Bits
Gaussian Relay Channel Capacity to Within a Fixed Number of Bits Woohyuk Chang, Sae-Young Chung, and Yong H. Lee arxiv:.565v [cs.it] 23 Nov 2 Abstract In this paper, we show that the capacity of the threenode
More informationVariable Length Codes for Degraded Broadcast Channels
Variable Length Codes for Degraded Broadcast Channels Stéphane Musy School of Computer and Communication Sciences, EPFL CH-1015 Lausanne, Switzerland Email: stephane.musy@ep.ch Abstract This paper investigates
More informationEE5139R: Problem Set 7 Assigned: 30/09/15, Due: 07/10/15
EE5139R: Problem Set 7 Assigned: 30/09/15, Due: 07/10/15 1. Cascade of Binary Symmetric Channels The conditional probability distribution py x for each of the BSCs may be expressed by the transition probability
More informationCapacity Region of Reversely Degraded Gaussian MIMO Broadcast Channel
Capacity Region of Reversely Degraded Gaussian MIMO Broadcast Channel Jun Chen Dept. of Electrical and Computer Engr. McMaster University Hamilton, Ontario, Canada Chao Tian AT&T Labs-Research 80 Park
More information