Upper and Lower Bounds on the Capacity of Amplitude-Constrained MIMO Channels
Alex Dytso, Mario Goldenbaum, Shlomo Shamai (Shitz), and H. Vincent Poor
Department of Electrical Engineering, Princeton University
Department of Electrical Engineering, Technion, Israel Institute of Technology

Abstract: In this work, novel upper and lower bounds on the capacity of channels with arbitrary constraints on the support of the channel input symbols are derived. As an immediate practical application, the case of multiple-input multiple-output (MIMO) channels with amplitude constraints is considered. The bounds are shown to be within a constant gap if the channel matrix is invertible and are tight in the high amplitude regime for arbitrary channel matrices. Moreover, in the high amplitude regime, it is shown that the capacity scales linearly with the minimum of the number of transmit and receive antennas, similarly to the case of average power-constrained inputs.

I. INTRODUCTION

While the capacity of a multiple-input multiple-output (MIMO) channel with an average power constraint is well understood [1], surprisingly little is known about the capacity of the more practically relevant case in which the channel inputs are subject to amplitude constraints. The first major contribution to this problem was the seminal work of Smith [2], in which it was shown that for the scalar Gaussian noise channel with an amplitude-constrained input the capacity-achieving input is discrete with finite support. In [3], this result was extended to peak-power-limited quadrature Gaussian channels. Using the approach of [3], in [4] the optimal input distribution was shown to be discrete for MIMO channels with an identity channel matrix and a Euclidean norm constraint on the input vector. Even though the optimal input distribution is known to be discrete, very little is known about the number or the optimal positions of the corresponding constellation points.
To the best of our knowledge, the only exception is the work of [5], in which it was shown for the scalar Gaussian noise channel that two mass points are optimal for amplitude values smaller than 1.67 and three for amplitude values of up to 2.786. Using a dual capacity expression, McKellips [6] derived an upper bound on the capacity of the scalar amplitude-constrained channel that is asymptotically tight in the high amplitude regime. By a clever choice of an auxiliary channel output distribution in the dual capacity expression, the authors of [7] sharpened McKellips' bound and extended it to parallel MIMO channels with a Euclidean norm constraint on the input. The scalar version of the upper bound in [7] has been further sharpened in [8] by yet another choice of auxiliary output distribution. In [9], asymptotic lower and upper bounds for a MIMO system were presented and the gap between the bounds was specified. In this work, we make progress on this open problem by deriving several new upper and lower bounds that hold for channels with arbitrary constraints on the support of the input distribution. We then apply them to the special case of MIMO channels with amplitude-constrained inputs.

This work was supported in part by the U.S. National Science Foundation under Grants CCF and CNS, by the German Research Foundation under Grant GO 669/-, and by the European Union's Horizon 2020 Research and Innovation Programme, grant agreement no.
A. Contributions and Paper Outline

Our contributions and paper outline are as follows. The problem is stated in Section II. In Section III, we derive upper and lower bounds on the capacity of a MIMO channel with an arbitrary constraint on the support of the input. In Section IV, we evaluate the performance of our bounds by studying MIMO channels with invertible channel matrices. In particular, in Theorem 8 it is shown that our upper and lower bounds are within $\frac{1}{2}\log(\pi n) + \log\rho$ bits, where $\rho$ is the packing efficiency and $n$ is the number of antennas. For diagonal channel matrices, it is shown in Theorem 9 that the Cartesian product of pulse-amplitude modulation (PAM) constellations achieves the capacity to within $1.64n$ bits. Section V is devoted to MIMO channels with arbitrary channel matrices. It is shown that in the high amplitude regime, similarly to the average power-constrained channel, the capacity scales linearly with the minimum of the number of transmit and receive antennas. Section VI concludes the paper.

B. Notation

Vectors are denoted by bold lowercase letters, random vectors by bold uppercase letters, and matrices by bold uppercase sans-serif letters (e.g., $x$, $X$, $\mathsf{H}$). For any deterministic vector $x \in \mathbb{R}^n$, $n \in \mathbb{N}$, we denote the Euclidean norm of $x$ by $\|x\|$. For a random vector $X$ with $\mathrm{supp}(X) \subseteq \mathbb{R}^n$ and any $p > 0$ we define
$$\|X\|_p^p := \tfrac{1}{n}\,\mathbb{E}\big[\|X\|^p\big], \qquad (1)$$
where $\mathrm{supp}(X)$ denotes the support of $X$. Note that for $p \ge 1$, the quantity in (1) defines a norm. The norm of a matrix $\mathsf{H} \in \mathbb{R}^{n \times n}$ is defined as
$$\|\mathsf{H}\| := \sup_{x: x \neq 0} \frac{\|\mathsf{H} x\|}{\|x\|}.$$
Let $\mathcal{S}$ be a subset of $\mathbb{R}^n$. Then, $\mathrm{Vol}(\mathcal{S}) := \int_{\mathcal{S}} \mathrm{d}x$ denotes its volume. Let $\mathbb{R}_+ := \{x \in \mathbb{R} : x \ge 0\}$. We define an $n$-dimensional ball of radius $r \in \mathbb{R}_+$ centered at $x \in \mathbb{R}^n$ as the set $\mathcal{B}_x(r) := \{y : \|x - y\| \le r\}$. Recall that for any $x \in \mathbb{R}^n$ and $r \in \mathbb{R}_+$,
$$\mathrm{Vol}\big(\mathcal{B}_x(r)\big) = \frac{\pi^{\frac{n}{2}}}{\Gamma\left(\frac{n}{2} + 1\right)}\, r^n.$$
For any matrix $\mathsf{H} \in \mathbb{R}^{k \times n}$ and some $\mathcal{S} \subseteq \mathbb{R}^n$ we define
$$\mathsf{H}\mathcal{S} := \{y : y = \mathsf{H} x,\ x \in \mathcal{S}\}.$$
Note that for an invertible $\mathsf{H} \in \mathbb{R}^{n \times n}$ we have $\mathrm{Vol}(\mathsf{H}\mathcal{S}) = |\det(\mathsf{H})|\,\mathrm{Vol}(\mathcal{S})$.
We define the maximum and minimum radius of a set $\mathcal{S} \subseteq \mathbb{R}^n$ that contains the origin as
$$r_{\max}(\mathcal{S}) := \min\{r \in \mathbb{R}_+ : \mathcal{S} \subseteq \mathcal{B}_0(r)\}, \qquad r_{\min}(\mathcal{S}) := \max\{r \in \mathbb{R}_+ : \mathcal{B}_0(r) \subseteq \mathcal{S}\}.$$
For a given vector $a = (A_1, \ldots, A_n) \in \mathbb{R}_+^n$ we define $\mathrm{Box}(a) := \{x \in \mathbb{R}^n : |x_i| \le A_i,\ i = 1, \ldots, n\}$ and the smallest box containing a given set $\mathcal{S} \subseteq \mathbb{R}^n$ as $\mathrm{Box}(\mathcal{S}) := \inf\{\mathrm{Box}(a) : \mathcal{S} \subseteq \mathrm{Box}(a)\}$, respectively. Finally, all logarithms are taken to base 2, $\log^+(x) := \max\{\log(x), 0\}$, $Q(x)$, $x \in \mathbb{R}$, denotes the Q-function, and $\delta_x(y)$ the Kronecker delta, which is one for $x = y$ and zero otherwise.
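The ball-volume formula above is easy to check numerically. The following self-contained Python sketch (the function name is ours, purely illustrative) evaluates $\mathrm{Vol}(\mathcal{B}_0(r)) = \pi^{n/2} r^n / \Gamma(\frac{n}{2}+1)$ and compares it against the classical closed forms for $n = 2$ and $n = 3$.

```python
import math

def ball_volume(n: int, r: float) -> float:
    """Vol(B_0(r)) = pi^(n/2) / Gamma(n/2 + 1) * r^n."""
    return math.pi ** (n / 2) / math.gamma(n / 2 + 1) * r ** n

# Sanity checks against the classical closed forms:
#   n = 2: area of a disk, pi * r^2
#   n = 3: volume of a sphere, (4/3) * pi * r^3
print(ball_volume(2, 1.0))  # ~3.14159
print(ball_volume(3, 2.0))  # ~33.5103 (= 32*pi/3)
```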
II. PROBLEM STATEMENT

Consider a MIMO system with $n_t \in \mathbb{N}$ transmit and $n_r \in \mathbb{N}$ receive antennas. The corresponding $n_r$-dimensional channel output for a single channel use is of the form
$$Y = \mathsf{H} X + Z, \qquad (2)$$
for some fixed channel matrix $\mathsf{H} \in \mathbb{R}^{n_r \times n_t}$. Here and hereafter, we assume $Z \sim \mathcal{N}(0, \mathsf{I}_{n_r})$ is independent of the channel input $X \in \mathbb{R}^{n_t}$ and $\mathsf{H}$ is known to both the transmitter and the receiver, where $\mathsf{I}_{n_r}$ denotes the $n_r \times n_r$ identity matrix. (Considering a real-valued channel model is without loss of generality.)

Now, in all that follows, let $\mathcal{X} \subseteq \mathbb{R}^{n_t}$ be a convex and compact channel input space that contains the origin (i.e., the length-$n_t$ zero vector) and let $F_X$ denote the cumulative distribution function of $X$. As of the writing of this paper, the capacity
$$C(\mathcal{X}, \mathsf{H}) := \max_{F_X : X \in \mathcal{X}} I(X; \mathsf{H} X + Z)$$
of this channel is unknown and we are interested in finding novel lower and upper bounds. Even though most of the results in this paper hold for arbitrary $\mathcal{X}$, we are mainly interested in the two most important special cases:
(i) per-antenna amplitude constraints, that is, $\mathcal{X} = \mathrm{Box}(a)$ for some given $a = (A_1, \ldots, A_{n_t}) \in \mathbb{R}_+^{n_t}$;
(ii) an $n_t$-dimensional amplitude constraint, that is, $\mathcal{X} = \mathcal{B}_0(A)$ for some given $A \in \mathbb{R}_+$.

Remark 1. Note that determining the capacity of a MIMO channel with average per-antenna power constraints is also still an open problem and has been solved for some special cases only [10]-[12].

III. UPPER AND LOWER BOUNDS ON THE CAPACITY

A. Upper Bounds

To establish our first upper bound, we need the following result [13]:

Lemma 1 (Maximum Entropy Under p-th Moment Constraint). Let $n \in \mathbb{N}$ and $p > 0$ be arbitrary. Then, for any $U \in \mathbb{R}^n$ such that $h(U) < \infty$ and $\|U\|_p < \infty$, we have
$$h(U) \le n \log\left(k_{n,p}\, n^{\frac{1}{p}}\, \|U\|_p\right), \quad \text{where} \quad k_{n,p} := \sqrt{\pi}\left(\frac{p e}{n}\right)^{\frac{1}{p}}\left(\frac{\Gamma\left(\frac{n}{p} + 1\right)}{\Gamma\left(\frac{n}{2} + 1\right)}\right)^{\frac{1}{n}}.$$

Theorem 1 (Moment Upper Bound). For any channel input space $\mathcal{X}$ and any fixed channel matrix $\mathsf{H}$, we have
$$C(\mathcal{X}, \mathsf{H}) \le C_{\mathrm{M}}(\mathcal{X}, \mathsf{H}) := \inf_{p > 0}\, n_r \log\left(\frac{k_{n_r,p}}{\sqrt{2\pi e}}\, n_r^{\frac{1}{p}}\, \|\bar{x} + Z\|_p\right),$$
where $\bar{x} \in \mathsf{H}\mathcal{X}$ is chosen such that $\|\bar{x}\| = r_{\max}(\mathsf{H}\mathcal{X})$.

Proof: Expressing the mutual information in terms of differential entropies results in
$$C(\mathcal{X}, \mathsf{H}) = \max_{F_X : X \in \mathcal{X}} h(\mathsf{H} X + Z) - h(Z)$$
$$\stackrel{(a)}{\le} \max_{F_X : X \in \mathcal{X}} n_r \log\left(k_{n_r,p}\, n_r^{\frac{1}{p}}\, \|\mathsf{H} X + Z\|_p\right) - \frac{n_r}{2}\log(2\pi e)$$
$$\stackrel{(b)}{=} n_r \log\left(\frac{k_{n_r,p}}{\sqrt{2\pi e}}\, n_r^{\frac{1}{p}}\, \max_{F_X : X \in \mathcal{X}} \|\mathsf{H} X + Z\|_p\right), \qquad (3)$$
where (a) follows from Lemma 1 together with the fact that $h(Z) = \frac{n_r}{2}\log(2\pi e)$, and (b) from the monotonicity of the logarithm. Now, notice that $\|\mathsf{H} X + Z\|_p$ is linear in $F_X$ and therefore attains its maximum at an extreme point of the set $\mathcal{F}_{\mathcal{X}} := \{F_X : X \in \mathcal{X}\}$, i.e., the set of all cumulative distribution functions supported on $\mathcal{X}$. As a matter of fact [14], the extreme points of $\mathcal{F}_{\mathcal{X}}$ are given by the set of degenerate distributions on $\mathcal{X}$, that is, $\{F_X(y) = \delta_x(y),\ y \in \mathcal{X}\}_{x \in \mathcal{X}}$. This allows us to conclude
$$\max_{F_X : X \in \mathcal{X}} \|\mathsf{H} X + Z\|_p = \max_{x \in \mathcal{X}} \|\mathsf{H} x + Z\|_p.$$
Observe that the Euclidean norm is a convex function, which is therefore maximized at the boundary of the set $\mathsf{H}\mathcal{X}$. Combining this with (3) and taking the infimum over $p > 0$ completes the proof.

The following theorem provides two alternative upper bounds that are based on duality arguments.

Theorem 2 (Duality Upper Bounds). For any channel input space $\mathcal{X}$ and any fixed channel matrix $\mathsf{H}$,
$$C(\mathcal{X}, \mathsf{H}) \le C_{\mathrm{Dual},1}(\mathcal{X}, \mathsf{H}) := \log\frac{c_{n_r}(d) + \mathrm{Vol}\big(\mathcal{B}_0(d)\big)}{(2\pi e)^{\frac{n_r}{2}}}, \qquad (4)$$
where $d := r_{\max}(\mathsf{H}\mathcal{X})$ and
$$c_{n_r}(d) := \sum_{i=0}^{n_r - 1} \binom{n_r - 1}{i}\, (2\pi e)^{\frac{n_r - i}{2}}\, \frac{\Gamma\left(\frac{n_r}{2} + 1\right)}{\Gamma\left(\frac{n_r - i}{2} + 1\right)}\, d^{\,i},$$
and
$$C(\mathcal{X}, \mathsf{H}) \le C_{\mathrm{Dual},2}(\mathcal{X}, \mathsf{H}) := \sum_{i=1}^{n_r} \log\left(1 + \frac{2 A_i}{\sqrt{2\pi e}}\right), \qquad (5)$$
where $a = (A_1, \ldots, A_{n_r})$ is such that $\mathrm{Box}(a) = \mathrm{Box}(\mathsf{H}\mathcal{X})$.

Proof: Using duality bounds, it has been shown in [7] that for any centered $n$-dimensional ball of radius $r \in \mathbb{R}_+$
$$\max_{F_X : X \in \mathcal{B}_0(r)} I(X; X + Z) \le \log\frac{c_n(r) + \mathrm{Vol}\big(\mathcal{B}_0(r)\big)}{(2\pi e)^{\frac{n}{2}}}. \qquad (6)$$
Now, observe that
$$C(\mathcal{X}, \mathsf{H}) = \max_{F_X : X \in \mathcal{X}} h(\mathsf{H} X + Z) - h(\mathsf{H} X + Z \mid \mathsf{H} X) = \max_{F_X : X \in \mathcal{X}} I(\mathsf{H} X; \mathsf{H} X + Z)$$
$$= \max_{F_{\tilde{X}} : \tilde{X} \in \mathsf{H}\mathcal{X}} I(\tilde{X}; \tilde{X} + Z) \qquad (7)$$
$$\stackrel{(a)}{\le} \max_{F_{\tilde{X}} : \tilde{X} \in \mathcal{B}_0(d)} I(\tilde{X}; \tilde{X} + Z) \stackrel{(b)}{\le} \log\frac{c_{n_r}(d) + \mathrm{Vol}\big(\mathcal{B}_0(d)\big)}{(2\pi e)^{\frac{n_r}{2}}},$$
where $d := r_{\max}(\mathsf{H}\mathcal{X})$. Here, (a) follows from enlarging the optimization domain and (b) from the upper bound in (6). This proves (4).
In order to show the upper bound in (5), we proceed with an alternative upper bound to (7):
$$C(\mathcal{X}, \mathsf{H}) = \max_{F_{\tilde{X}} : \tilde{X} \in \mathsf{H}\mathcal{X}} I(\tilde{X}; \tilde{X} + Z)$$
$$\stackrel{(a)}{\le} \max_{F_{\tilde{X}} : \tilde{X} \in \mathrm{Box}(\mathsf{H}\mathcal{X})} I(\tilde{X}; \tilde{X} + Z)$$
$$\stackrel{(b)}{\le} \max_{F_{\tilde{X}} : \tilde{X} \in \mathrm{Box}(\mathsf{H}\mathcal{X})} \sum_{i=1}^{n_r} I(\tilde{X}_i; \tilde{X}_i + Z_i)$$
$$\stackrel{(c)}{=} \sum_{i=1}^{n_r} \max_{F_{\tilde{X}_i} : |\tilde{X}_i| \le A_i} I(\tilde{X}_i; \tilde{X}_i + Z_i)$$
$$\stackrel{(d)}{\le} \sum_{i=1}^{n_r} \log\left(1 + \frac{2 A_i}{\sqrt{2\pi e}}\right),$$
where the inequalities follow from: (a) enlarging the optimization domain; (b) single-letterizing the mutual information; (c) choosing individual amplitude constraints $(A_1, \ldots, A_{n_r}) =: a \in \mathbb{R}_+^{n_r}$ such that $\mathrm{Box}(a) = \mathrm{Box}(\mathsf{H}\mathcal{X})$; and (d) using the upper bound in (6) for $n = 1$. This concludes the proof.

B. Lower Bounds

A classical approach to bound mutual information from below is to use the entropy power inequality (EPI).

Theorem 3 (EPI Lower Bounds). For any fixed channel matrix $\mathsf{H}$ and any channel input space $\mathcal{X}$ with $X$ absolutely continuous, we have
$$C(\mathcal{X}, \mathsf{H}) \ge C_{\mathrm{EPI}}(\mathcal{X}, \mathsf{H}) := \max_{F_X : X \in \mathcal{X}} \frac{n_r}{2}\log\left(1 + \frac{2^{\frac{2}{n_r} h(\mathsf{H} X)}}{2\pi e}\right). \qquad (8)$$
Moreover, if $n_t = n_r = n$, $\mathsf{H} \in \mathbb{R}^{n \times n}$ is invertible, and $X$ is uniformly distributed over $\mathcal{X}$, then
$$C(\mathcal{X}, \mathsf{H}) \ge C_{\mathrm{EPI}}(\mathcal{X}, \mathsf{H}) = \frac{n}{2}\log\left(1 + \frac{|\det(\mathsf{H})|^{\frac{2}{n}}\, \mathrm{Vol}(\mathcal{X})^{\frac{2}{n}}}{2\pi e}\right). \qquad (9)$$

Proof: By means of the EPI we conclude
$$2^{\frac{2}{n_r} h(\mathsf{H} X + Z)} \ge 2^{\frac{2}{n_r} h(\mathsf{H} X)} + 2^{\frac{2}{n_r} h(Z)} = 2^{\frac{2}{n_r} h(\mathsf{H} X)} + 2\pi e,$$
so that
$$C(\mathcal{X}, \mathsf{H}) \ge \max_{F_X : X \in \mathcal{X}} \frac{n_r}{2}\log\left(1 + \frac{2^{\frac{2}{n_r} h(\mathsf{H} X)}}{2\pi e}\right), \qquad (10)$$
which establishes the lower bound in (8). To show the lower bound in (9), all we need is to recall that $h(\mathsf{H} X) = h(X) + \log|\det(\mathsf{H})|$, which is maximized for $X$ uniformly distributed over $\mathcal{X}$. But if $X$ is uniformly drawn from $\mathcal{X}$, we have
$$2^{\frac{2}{n} h(\mathsf{H} X)} = \mathrm{Vol}(\mathsf{H}\mathcal{X})^{\frac{2}{n}} = |\det(\mathsf{H})|^{\frac{2}{n}}\, \mathrm{Vol}(\mathcal{X})^{\frac{2}{n}},$$
which completes the proof.

The results in [2]-[4] suggest that the channel input distribution that maximizes the mutual information might be discrete. Therefore, there is a need for lower bounds that, unlike the bounds in Theorem 3, rely on discrete inputs.
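For a concrete feel for the closed-form EPI bound (9), the following Python sketch (the helper name is ours, not from the paper) evaluates it for a uniform input; for a box $\mathrm{Box}(a)$ the volume is $\prod_i 2A_i$.

```python
import math

def epi_lower_bound(det_H: float, volume: float, n: int) -> float:
    """(n/2) * log2(1 + (|det H| * Vol(X))^(2/n) / (2*pi*e)), in bits."""
    return n / 2 * math.log2(1 + (abs(det_H) * volume) ** (2 / n) / (2 * math.pi * math.e))

# Scalar sanity check: X = [-A, A] has volume 2A; with H = 1 and
# 4*A^2 = 2*pi*e the bound evaluates to (1/2) * log2(2) = 0.5 bits.
A = math.sqrt(math.pi * math.e / 2)
print(epi_lower_bound(1.0, 2 * A, 1))  # ~0.5
```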
Remark 2. We note that the problem of finding the optimal input distribution of a general MIMO channel with an amplitude constraint is still open. The technical difficulty lies in the fact that the identity theorem from complex analysis, a key tool in the method developed by Smith [2] for the scalar case and later used in [15] for the MIMO channel, does not extend to $\mathbb{R}^n$ and $\mathbb{C}^n$. The interested reader is referred to [16] for a detailed discussion of this issue, with examples of why the identity theorem fails in the MIMO setting.

Theorem 4 (Ozarow-Wyner Type Lower Bound). Let $X_D$ be a discrete random vector with $\mathrm{supp}(X_D) \subseteq \mathbb{R}^{n_t}$ and finite entropy, $g : \mathbb{R}^{n_r} \to \mathbb{R}^{n_t}$ a measurable function, and $p > 0$. Furthermore, let $\mathcal{K}_p$ be a set of continuous random vectors, independent of $X_D$, such that for every $U \in \mathcal{K}_p$ we have $h(U) < \infty$, $\|U\|_p < \infty$, and
$$\mathrm{supp}(U + x_i) \cap \mathrm{supp}(U + x_j) = \emptyset \quad \text{for all } x_i, x_j \in \mathrm{supp}(X_D),\ i \neq j. \qquad (11)$$
Then,
$$C(\mathcal{X}, \mathsf{H}) \ge C_{\mathrm{OW}}(\mathcal{X}, \mathsf{H}) := \big[H(X_D) - \mathrm{gap}\big]^+,$$
where
$$\mathrm{gap} := \inf_{\substack{U \in \mathcal{K}_p,\ g \text{ measurable},\ p > 0}} G_{1,p}(U, X_D, g) + G_{2,p}(U)$$
with
$$G_{1,p}(U, X_D, g) := n_t \log\frac{\|U + X_D - g(Y)\|_p}{\|U\|_p}, \qquad (12)$$
$$G_{2,p}(U) := n_t \log\left(k_{n_t,p}\, n_t^{\frac{1}{p}}\, \|U\|_p\right) - h(U), \qquad (13)$$
and $k_{n_t,p}$ as defined in Lemma 1.

Proof: The proof is identical to that of [13]. In order to make the manuscript more self-contained, we repeat it here. Let $U$ and $X_D$ be statistically independent. Then, the mutual information $I(X_D; Y)$ can be lower bounded as
$$I(X_D; Y) \stackrel{(a)}{\ge} I(X_D + U; Y) = h(X_D + U) - h(X_D + U \mid Y) \stackrel{(b)}{=} H(X_D) + h(U) - h(X_D + U \mid Y). \qquad (14)$$
Here, (a) follows from the data processing inequality, as $X_D + U \to X_D \to Y$ forms a Markov chain in that order, and (b) from the assumption in (11). By using Lemma 1, the last term in (14) can be bounded from above as
$$h(X_D + U \mid Y) \le n_t \log\left(k_{n_t,p}\, n_t^{\frac{1}{p}}\, \|X_D + U - g(Y)\|_p\right).$$
Combining this expression with (14) results in
$$I(X_D; Y) \ge H(X_D) - G_{1,p}(U, X_D, g) - G_{2,p}(U),$$
with $G_{1,p}$ and $G_{2,p}$ as defined in (12) and (13), respectively. Maximizing the right-hand side over all $U \in \mathcal{K}_p$, measurable functions $g : \mathbb{R}^{n_r} \to \mathbb{R}^{n_t}$, and $p > 0$ provides the bound.
Interestingly, the bound of Theorem 4 holds for arbitrary channels; the interested reader is referred to [13] for details.
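For the common choice $p = 2$ and a scalar uniform dither $U \sim \mathrm{Unif}[-\Delta, \Delta)$, the term $G_{2,2}(U)$ has a closed form: with $h(U) = \log(2\Delta)$, $\|U\|_2^2 = \Delta^2/3$, and $k_{1,2} = \sqrt{2\pi e}$, it reduces to $\frac{1}{2}\log\frac{\pi e}{6} \approx 0.2546$ bits, independent of $\Delta$. A small Python sketch (the helper name is ours) confirms this:

```python
import math

def g2_uniform(delta: float) -> float:
    """G_{2,2}(U) for U ~ Unif[-delta, delta): log2(k_{1,2} * ||U||_2) - h(U),
    with k_{1,2} = sqrt(2*pi*e), ||U||_2 = delta/sqrt(3), h(U) = log2(2*delta)."""
    u_norm = delta / math.sqrt(3)
    return math.log2(math.sqrt(2 * math.pi * math.e) * u_norm) - math.log2(2 * delta)

# The dither penalty is a universal constant: (1/2)*log2(pi*e/6) ~ 0.2546 bits.
print(g2_uniform(0.1), g2_uniform(7.3))
```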
We conclude the section by providing a lower bound that is based on Jensen's inequality and holds for arbitrary inputs.

Theorem 5 (Jensen's Inequality Lower Bound). For any channel input space $\mathcal{X}$ and fixed channel matrix $\mathsf{H}$, we have
$$C(\mathcal{X}, \mathsf{H}) \ge C_{\mathrm{Jensen}}(\mathcal{X}, \mathsf{H}) := \max_{F_X : X \in \mathcal{X}} \frac{n_r}{2}\log\frac{2}{e} - \log \mathbb{E}\left[\exp\left(-\frac{\|\mathsf{H}(X - \bar{X})\|^2}{4}\right)\right],$$
where $\bar{X}$ is an independent copy of $X$.

Proof: In order to show the lower bound, we follow an approach of [17]. Note that by Jensen's inequality
$$h(Y) = -\mathbb{E}[\log f_Y(Y)] \ge -\log \mathbb{E}[f_Y(Y)] = -\log \int_{\mathbb{R}^{n_r}} f_Y(y)^2\, \mathrm{d}y. \qquad (15)$$
Now, evaluating the integral in (15) results in
$$\int_{\mathbb{R}^{n_r}} f_Y(y)^2\, \mathrm{d}y = \frac{1}{(2\pi)^{n_r}} \int_{\mathbb{R}^{n_r}} \mathbb{E}\left[e^{-\frac{\|y - \mathsf{H} X\|^2}{2}}\right] \mathbb{E}\left[e^{-\frac{\|y - \mathsf{H}\bar{X}\|^2}{2}}\right] \mathrm{d}y$$
$$\stackrel{(a)}{=} \frac{1}{(2\pi)^{n_r}}\, \mathbb{E}\left[\int_{\mathbb{R}^{n_r}} e^{-\frac{\|y - \mathsf{H} X\|^2 + \|y - \mathsf{H}\bar{X}\|^2}{2}}\, \mathrm{d}y\right]$$
$$\stackrel{(b)}{=} \frac{1}{(2\pi)^{n_r}}\, \mathbb{E}\left[e^{-\frac{\|\mathsf{H}(X - \bar{X})\|^2}{4}} \int_{\mathbb{R}^{n_r}} e^{-\left\|y - \frac{\mathsf{H}(X + \bar{X})}{2}\right\|^2} \mathrm{d}y\right]$$
$$\stackrel{(c)}{=} \frac{\pi^{\frac{n_r}{2}}}{(2\pi)^{n_r}}\, \mathbb{E}\left[e^{-\frac{\|\mathsf{H}(X - \bar{X})\|^2}{4}}\right], \qquad (16)$$
where (a) follows from the independence of $X$ and $\bar{X}$ and Tonelli's theorem, (b) from completing the square, and (c) from the fact that $\int_{\mathbb{R}^{n_r}} e^{-\|y\|^2} \mathrm{d}y = \pi^{\frac{n_r}{2}}$. Finally, combining (15) and (16), subtracting $h(Z) = \frac{n_r}{2}\log(2\pi e)$, and maximizing over $F_X$ proves the result.

IV. INVERTIBLE CHANNEL MATRICES

Consider the case of $n_t = n_r = n$ antennas with $\mathsf{H} \in \mathbb{R}^{n \times n}$ invertible. In this section, we evaluate some of the lower and upper bounds given in the previous section for the special case of $\mathsf{H}$ being also diagonal, and then characterize the gap to the capacity for arbitrary invertible channel matrices.

A. Diagonal Channel Matrices

Suppose the channel inputs are subject to per-antenna or an $n$-dimensional amplitude constraint. Then, the duality upper bound $C_{\mathrm{Dual},2}(\mathcal{X}, \mathsf{H})$ of Theorem 2 takes the following form.

Theorem 6 (Upper Bounds). Let $\mathsf{H} = \mathrm{diag}(h_{11}, \ldots, h_{nn}) \in \mathbb{R}^{n \times n}$ be fixed. If $\mathcal{X} = \mathrm{Box}(a)$ for some $a = (A_1, \ldots, A_n) \in \mathbb{R}_+^n$, then
$$C_{\mathrm{Dual},2}\big(\mathrm{Box}(a), \mathsf{H}\big) = \sum_{i=1}^n \log\left(1 + \frac{2 |h_{ii}| A_i}{\sqrt{2\pi e}}\right). \qquad (17)$$
Moreover, if $\mathcal{X} = \mathcal{B}_0(A)$ for some $A \in \mathbb{R}_+$, then
$$C_{\mathrm{Dual},2}\big(\mathcal{B}_0(A), \mathsf{H}\big) = \sum_{i=1}^n \log\left(1 + \frac{2 |h_{ii}| A}{\sqrt{2\pi e}}\right). \qquad (18)$$
Proof: The bound in (17) immediately follows from Theorem 2 by observing that $\mathrm{Box}(\mathsf{H}\,\mathrm{Box}(a)) = \mathrm{Box}(\mathsf{H} a)$. The bound in (18) follows from Theorem 2 and the fact that
$$\mathrm{Box}\big(\mathsf{H}\mathcal{B}_0(A)\big) \subseteq \mathrm{Box}\big(\mathsf{H}\,\mathrm{Box}(\mathcal{B}_0(A))\big) = \mathrm{Box}(\tilde{h}), \quad \text{where } \tilde{h} := A\,(|h_{11}|, \ldots, |h_{nn}|).$$
This concludes the proof.

For an arbitrary channel input space $\mathcal{X}$, the EPI lower bound of Theorem 3 and Jensen's inequality lower bound of Theorem 5 evaluate to the following.

Theorem 7 (Lower Bounds). Let $\mathsf{H} = \mathrm{diag}(h_{11}, \ldots, h_{nn}) \in \mathbb{R}^{n \times n}$ be fixed and $\mathcal{X}$ arbitrary. Then,
$$C_{\mathrm{Jensen}}(\mathcal{X}, \mathsf{H}) = \frac{n}{2}\log\frac{2}{e} - \log \psi(\mathsf{H}, b), \qquad (19)$$
where, with $b := (B_1, \ldots, B_n)$ and $\varphi : \mathbb{R}_+ \to \mathbb{R}_+$,
$$\psi(\mathsf{H}, b) := \min_{b \in \mathcal{X}} \prod_{i=1}^n \varphi\big(h_{ii}^2 B_i^2\big), \qquad \varphi(x) := \frac{e^{-x} - 1}{x} + \sqrt{\frac{\pi}{x}}\left(1 - 2Q\big(\sqrt{2x}\big)\right), \qquad (20)$$
and
$$C_{\mathrm{EPI}}(\mathcal{X}, \mathsf{H}) = \frac{n}{2}\log\left(1 + \frac{\mathrm{Vol}(\mathcal{X})^{\frac{2}{n}} \left(\prod_{i=1}^n |h_{ii}|\right)^{\frac{2}{n}}}{2\pi e}\right). \qquad (21)$$

Proof: For given values $B_i \in \mathbb{R}_+$, $i = 1, \ldots, n$, let the $i$-th component of $X = (X_1, \ldots, X_n)$ be independent and uniformly distributed over the interval $[-B_i, B_i]$. Thus, the expected value appearing in the bound of Theorem 5 can be written as
$$\mathbb{E}\left[e^{-\frac{\|\mathsf{H}(X - \bar{X})\|^2}{4}}\right] = \mathbb{E}\left[\prod_{i=1}^n e^{-\frac{h_{ii}^2 (X_i - \bar{X}_i)^2}{4}}\right] = \prod_{i=1}^n \mathbb{E}\left[e^{-\frac{h_{ii}^2 (X_i - \bar{X}_i)^2}{4}}\right]. \qquad (22)$$
Now, if $\bar{X}$ is an independent copy of $X$, it can be shown that the expected value on the right-hand side of (22) has the explicit form
$$\mathbb{E}\left[e^{-\frac{h_{ii}^2 (X_i - \bar{X}_i)^2}{4}}\right] = \varphi\big(h_{ii}^2 B_i^2\big),$$
with $\varphi$ as defined in (20). Finally, optimizing over all $b = (B_1, \ldots, B_n) \in \mathcal{X}$ results in the bound (19). The bound in (21) follows by inserting $|\det(\mathsf{H})| = \prod_{i=1}^n |h_{ii}|$ into (9), which concludes the proof.

In Fig. 1, the upper bounds of Theorems 1 and 6 and the lower bounds of Theorem 7 are depicted for a diagonal MIMO channel with per-antenna amplitude constraints. It turns out that the moment upper bound and the EPI lower bound perform well in the small amplitude regime, while the duality upper bound and Jensen's inequality lower bound perform well in the high amplitude regime.

B. Gap to the Capacity

Our first result bounds the gap between the capacity and the lower bound in (9).

Theorem 8. Let $\mathsf{H} \in \mathbb{R}^{n \times n}$ be of full rank and
$$\rho(\mathcal{X}, \mathsf{H}) := \frac{\mathrm{Vol}\big(\mathcal{B}_0(r_{\max}(\mathsf{H}\mathcal{X}))\big)}{\mathrm{Vol}(\mathsf{H}\mathcal{X})}.$$
Then,
$$C(\mathcal{X}, \mathsf{H}) - C_{\mathrm{EPI}}(\mathcal{X}, \mathsf{H}) \le \frac{n}{2}\log\left((\pi n)^{\frac{1}{n}}\, \rho(\mathcal{X}, \mathsf{H})^{\frac{2}{n}}\right).$$
[Fig. 1: Rate in bits/s/Hz versus amplitude constraint $A$ in dB, comparing the moment upper bound $C_{\mathrm{M}}$, the duality upper bound $C_{\mathrm{Dual},2}$, Jensen's lower bound $C_{\mathrm{Jensen}}$, and the EPI lower bound $C_{\mathrm{EPI}}$ of Theorems 1, 6, and 7, evaluated for a MIMO system with per-antenna amplitude constraints $A_1 = A_2 = A$ (i.e., $a = (A, A)$) and a fixed diagonal channel matrix $\mathsf{H}$.]

Proof: For notational convenience, let the volume of an $n$-dimensional ball of radius $r > 0$ be denoted by
$$V_n(r) := \mathrm{Vol}\big(\mathcal{B}_0(r)\big) = V_n r^n, \quad \text{where } V_n := \frac{\pi^{\frac{n}{2}}}{\Gamma\left(\frac{n}{2} + 1\right)}.$$
Now, observe that by choosing $p = 2$, the upper bound of Theorem 1 can further be upper bounded as
$$C_{\mathrm{M}}(\mathcal{X}, \mathsf{H}) \le n \log\left(\frac{k_{n,2}}{\sqrt{2\pi e}}\, \sqrt{n}\, \|\bar{x} + Z\|_2\right) \stackrel{(a)}{=} \frac{n}{2}\log\left(\frac{1}{n}\,\mathbb{E}\big[\|\bar{x} + Z\|^2\big]\right) \stackrel{(b)}{=} \frac{n}{2}\log\left(1 + \frac{\|\bar{x}\|^2}{n}\right),$$
where (a) follows since $k_{n,2} = \sqrt{\frac{2\pi e}{n}}$ and (b) since $\mathbb{E}[\|Z\|^2] = n$. Therefore, the gap between (9) and the moment upper bound of Theorem 1 can be upper bounded as follows:
$$C_{\mathrm{M}}(\mathcal{X}, \mathsf{H}) - C_{\mathrm{EPI}}(\mathcal{X}, \mathsf{H}) = \frac{n}{2}\log\frac{1 + \frac{\|\bar{x}\|^2}{n}}{1 + \frac{\mathrm{Vol}(\mathsf{H}\mathcal{X})^{\frac{2}{n}}}{2\pi e}}$$
$$\stackrel{(a)}{=} \frac{n}{2}\log\frac{1 + \frac{1}{n}\left(\frac{\rho(\mathcal{X},\mathsf{H})\,\mathrm{Vol}(\mathsf{H}\mathcal{X})}{V_n}\right)^{\frac{2}{n}}}{1 + \frac{\mathrm{Vol}(\mathsf{H}\mathcal{X})^{\frac{2}{n}}}{2\pi e}}$$
$$\stackrel{(b)}{\le} \frac{n}{2}\log\left(\frac{2\pi e\, \rho(\mathcal{X}, \mathsf{H})^{\frac{2}{n}}}{n\, V_n^{\frac{2}{n}}}\right)$$
$$\stackrel{(c)}{\le} \frac{n}{2}\log\left((\pi n)^{\frac{1}{n}}\, \rho(\mathcal{X}, \mathsf{H})^{\frac{2}{n}}\right).$$
Here, (a) is due to the fact that $\|\bar{x}\|$ is the radius of an $n$-dimensional ball of volume $\rho(\mathcal{X},\mathsf{H})\,\mathrm{Vol}(\mathsf{H}\mathcal{X})$, (b) follows from the inequality $\frac{1 + cx}{1 + x} \le c$ for $c \ge 1$ and $x \in \mathbb{R}_+$, and (c) follows from using Stirling's approximation to obtain $V_n^{\frac{2}{n}} \ge \frac{2\pi e}{n}\,\frac{1}{(\pi n)^{\frac{1}{n}}}$. The claim then follows since $C(\mathcal{X},\mathsf{H}) \le C_{\mathrm{M}}(\mathcal{X},\mathsf{H})$.
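The packing efficiency and the resulting gap of Theorem 8 are easy to evaluate numerically. The following Python sketch (helper names are ours) computes the box packing efficiency $\rho(\mathrm{Box}(a), \mathsf{I}_n) = \pi^{n/2}\|a\|^n / (\Gamma(\frac{n}{2}+1)\, 2^n \prod_i A_i)$ and the Theorem 8 gap bound; for $n = 1$ a box and a ball coincide, so $\rho = 1$.

```python
import math

def rho_box(a: list[float]) -> float:
    """Packing efficiency of Box(a) under an identity channel matrix:
    Vol(B_0(||a||)) / Vol(Box(a))."""
    n = len(a)
    norm_a = math.sqrt(sum(x * x for x in a))
    box_vol = math.prod(2 * x for x in a)
    return math.pi ** (n / 2) * norm_a ** n / (math.gamma(n / 2 + 1) * box_vol)

def gap_bound_bits(n: int, rho: float) -> float:
    """Theorem 8 style gap: (n/2) * log2((pi*n)^(1/n) * rho^(2/n))."""
    return n / 2 * math.log2((math.pi * n) ** (1 / n) * rho ** (2 / n))

print(rho_box([1.0]))        # 1.0 (an interval is its own circumscribed ball)
print(rho_box([3.0, 3.0]))   # pi/2: disk of radius 3*sqrt(2) over a 6x6 square
print(gap_bound_bits(2, rho_box([3.0, 3.0])))  # ~1.98 bits
```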
[Fig. 2: Example of a pulse-amplitude modulation constellation with $N = 4$ points and amplitude constraint $A$, i.e., $\mathrm{PAM}(4, A)$, where $\Delta := A/N$ denotes half the Euclidean distance between two adjacent constellation points. In case $N$ is odd, $0$ is a constellation point.]

The term $\rho(\mathcal{X}, \mathsf{H})$ is referred to as the packing efficiency of the set $\mathsf{H}\mathcal{X}$. In the following proposition, we present the packing efficiencies for important special cases.

Proposition 1 (Packing Efficiencies). Let $\mathsf{H} \in \mathbb{R}^{n \times n}$ be of full rank, $A \in \mathbb{R}_+$, and $a := (A_1, \ldots, A_n) \in \mathbb{R}_+^n$. Then,
$$\rho\big(\mathcal{B}_0(A), \mathsf{I}_n\big) = 1, \qquad (23)$$
$$\rho\big(\mathcal{B}_0(A), \mathsf{H}\big) = \frac{\|\mathsf{H}\|^n}{|\det(\mathsf{H})|}, \qquad (24)$$
$$\rho\big(\mathrm{Box}(a), \mathsf{I}_n\big) = \frac{\pi^{\frac{n}{2}}\, \|a\|^n}{\Gamma\left(\frac{n}{2} + 1\right)\, 2^n \prod_{i=1}^n A_i}, \qquad (25)$$
$$\rho\big(\mathrm{Box}(a), \mathsf{H}\big) \le \frac{\pi^{\frac{n}{2}}\, \|\mathsf{H}\|^n\, \|a\|^n}{\Gamma\left(\frac{n}{2} + 1\right)\, |\det(\mathsf{H})|\, 2^n \prod_{i=1}^n A_i}. \qquad (26)$$

Proof: The packing efficiency in (23) follows immediately. Note that
$$r_{\max}\big(\mathsf{H}\mathcal{B}_0(A)\big) = \max_{x \in \mathcal{B}_0(A)} \|\mathsf{H} x\| = \|\mathsf{H}\| A.$$
Thus, as $\mathsf{H}$ is assumed to be invertible, we have $\mathrm{Vol}(\mathsf{H}\mathcal{B}_0(A)) = |\det(\mathsf{H})|\,\mathrm{Vol}(\mathcal{B}_0(A))$, which results in (24). To show (25), observe that
$$\mathrm{Vol}\Big(\mathcal{B}_0\big(r_{\max}(\mathsf{I}_n \mathrm{Box}(a))\big)\Big) = \mathrm{Vol}\big(\mathcal{B}_0(\|a\|)\big) = \frac{\pi^{\frac{n}{2}}}{\Gamma\left(\frac{n}{2} + 1\right)}\, \|a\|^n.$$
The proof of (25) is concluded by observing that $\mathrm{Vol}(\mathsf{I}_n \mathrm{Box}(a)) = 2^n \prod_{i=1}^n A_i$. Finally, observe that $\mathrm{Box}(a) \subseteq \mathcal{B}_0(\|a\|)$ implies $r_{\max}(\mathsf{H}\,\mathrm{Box}(a)) \le r_{\max}(\mathsf{H}\mathcal{B}_0(\|a\|))$, so that
$$\rho\big(\mathrm{Box}(a), \mathsf{H}\big) \le \frac{\mathrm{Vol}\big(\mathcal{B}_0(\|\mathsf{H}\|\,\|a\|)\big)}{\mathrm{Vol}\big(\mathsf{H}\,\mathrm{Box}(a)\big)} = \frac{\pi^{\frac{n}{2}}\, \|\mathsf{H}\|^n\, \|a\|^n}{\Gamma\left(\frac{n}{2} + 1\right)\, |\det(\mathsf{H})|\, 2^n \prod_{i=1}^n A_i},$$
which is the bound in (26).

We conclude this section by characterizing the gap to the capacity when $\mathsf{H}$ is diagonal and the channel input space is the Cartesian product of $n$ PAM constellations. In this context, $\mathrm{PAM}(N, A)$ refers to the set of $N \in \mathbb{N}$ equidistant PAM constellation points with amplitude constraint $A \in \mathbb{R}_+$ (see Fig. 2 for an illustration), whereas $X \sim \mathrm{PAM}(N, A)$ means that $X$ is uniformly distributed over $\mathrm{PAM}(N, A)$ [13].
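A $\mathrm{PAM}(N, A)$ constellation as in Fig. 2 is straightforward to construct: with half-distance $\Delta = A/N$, the points are $-A + (2k - 1)\Delta$ for $k = 1, \ldots, N$. A small Python sketch (the function name is ours):

```python
def pam(N: int, A: float) -> list[float]:
    """Equidistant PAM(N, A): N points in [-A, A], adjacent points 2*Delta apart,
    where Delta = A / N is half the distance between neighbors."""
    delta = A / N
    return [-A + (2 * k - 1) * delta for k in range(1, N + 1)]

print(pam(4, 1.0))  # [-0.75, -0.25, 0.25, 0.75]
print(pam(3, 1.0))  # middle point is 0.0 when N is odd
```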
Theorem 9. Let $\mathsf{H} = \mathrm{diag}(h_{11}, \ldots, h_{nn}) \in \mathbb{R}^{n \times n}$ be fixed and $X = (X_1, \ldots, X_n)$. Then, if $X_i \sim \mathrm{PAM}(N_i, A_i)$, $i = 1, \ldots, n$, for some given $a = (A_1, \ldots, A_n) \in \mathbb{R}_+^n$, it holds that
$$C_{\mathrm{Dual},2}\big(\mathrm{Box}(a), \mathsf{H}\big) - C_{\mathrm{OW}}\big(\mathrm{Box}(a), \mathsf{H}\big) \le c\, n \ \text{bits}, \qquad (27)$$
where $N_i := \left\lceil 1 + \frac{A_i |h_{ii}|}{\sqrt{2\pi e}} \right\rceil$ and
$$c := \log 2 + \frac{1}{2}\log\frac{\pi e}{6} + \frac{1}{2}\log\left(1 + \frac{6}{\pi e}\right) \approx 1.64.$$
Moreover, if $X_i \sim \mathrm{PAM}(N_i, A/\sqrt{n})$, $i = 1, \ldots, n$, for some given $A \in \mathbb{R}_+$, it holds that
$$C_{\mathrm{Dual},2}\big(\mathcal{B}_0(A), \mathsf{H}\big) - C_{\mathrm{OW}}\big(\mathcal{B}_0(A), \mathsf{H}\big) \le c\, n \ \text{bits}, \qquad (28)$$
where $N_i := \left\lceil 1 + \frac{A |h_{ii}|}{\sqrt{2\pi e\, n}} \right\rceil$.

Proof: Since the channel matrix is diagonal, letting the channel input $X$ be such that its elements $X_i$, $i = 1, \ldots, n$, are independent, we have that
$$I(X; \mathsf{H} X + Z) = \sum_{i=1}^n I(X_i; h_{ii} X_i + Z_i).$$
Let $X_i \sim \mathrm{PAM}(N_i, A_i)$ with $N_i := \left\lceil 1 + \frac{A_i |h_{ii}|}{\sqrt{2\pi e}} \right\rceil$ and observe that half the Euclidean distance between any pair of adjacent points in $\mathrm{PAM}(N_i, A_i)$ is equal to $\Delta_i := A_i / N_i$ (see Fig. 2), $i = 1, \ldots, n$. In order to lower bound the mutual information $I(X_i; h_{ii} X_i + Z_i)$, we use the bound of Theorem 4 with $p = 2$ and $n_t = 1$. Thus, for some continuous random variable $U$ that is uniformly distributed over the interval $[-\Delta_i, \Delta_i)$ and independent of $X_i$, we have that
$$I(X_i; h_{ii} X_i + Z_i) \ge H(X_i) - \frac{1}{2}\log\frac{\pi e}{6} - \frac{1}{2}\log\frac{\mathbb{E}\big[(U + X_i - g(Y_i))^2\big]}{\mathbb{E}[U^2]}. \qquad (29)$$
Now, note that the entropy term in (29) can be lower bounded as
$$H(X_i) = \log N_i \ge \log\left(1 + \frac{A_i |h_{ii}|}{\sqrt{2\pi e}}\right) \ge \log\left(1 + \frac{2 A_i |h_{ii}|}{\sqrt{2\pi e}}\right) - \log 2, \qquad (30)$$
where we have used that $1 + 2x \le 2(1 + x)$ for every $x \ge 0$. On the other hand, the last term in (29) can be upper bounded by bounding its argument as follows:
$$\frac{\mathbb{E}\big[(U + X_i - g(Y_i))^2\big]}{\mathbb{E}[U^2]} \stackrel{(a)}{=} 1 + \frac{3\, \mathbb{E}\big[(X_i - g(Y_i))^2\big]}{\Delta_i^2} \stackrel{(b)}{\le} 1 + \frac{3\, \mathbb{E}[Z_i^2]\, N_i^2}{A_i^2 h_{ii}^2} = 1 + \frac{3 N_i^2}{A_i^2 h_{ii}^2} \stackrel{(c)}{\le} 1 + \frac{6}{\pi e}. \qquad (31)$$
Here, (a) follows from the independence of $X_i$ and $U$ together with $\mathbb{E}[U^2] = \Delta_i^2 / 3$, (b) from using the estimator $g(y_i) = y_i / h_{ii}$, and (c) from $N_i = \left\lceil 1 + \frac{A_i |h_{ii}|}{\sqrt{2\pi e}} \right\rceil \le \frac{2 A_i |h_{ii}|}{\sqrt{2\pi e}}$. Combining (29), (30), and (31) results in the gap (27). The proof of the gap in (28) follows along similar lines, which concludes the proof.
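Numerically, the per-antenna gap constant of Theorem 9 (in the form reconstructed above) evaluates to roughly 1.64 bits. The following Python sketch computes it:

```python
import math

# Gap constant from Theorem 9 (as reconstructed in this transcription):
# c = log2(2) + (1/2)*log2(pi*e/6) + (1/2)*log2(1 + 6/(pi*e))
pe = math.pi * math.e
c = 1.0 + 0.5 * math.log2(pe / 6) + 0.5 * math.log2(1 + 6 / pe)
print(round(c, 3))  # 1.638
```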
V. ARBITRARY CHANNEL MATRICES

For a MIMO channel with an arbitrary channel matrix and an average power constraint, the capacity is achieved by performing a singular value decomposition (SVD) of the channel matrix, i.e., $\mathsf{H} = \mathsf{U} \Lambda \mathsf{V}^{\mathrm{T}}$, and considering the equivalent channel model $\tilde{Y} = \Lambda \tilde{X} + \tilde{Z}$, where $\tilde{Y} := \mathsf{U}^{\mathrm{T}} Y$, $\tilde{X} := \mathsf{V}^{\mathrm{T}} X$, and $\tilde{Z} := \mathsf{U}^{\mathrm{T}} Z$, respectively. To provide lower bounds for channels with amplitude constraints and SVD precoding, we need the following lemma.

Lemma 2. For any given orthogonal matrix $\mathsf{V} \in \mathbb{R}^{n_t \times n_t}$ and constraint vector $a = (A_1, \ldots, A_{n_t}) \in \mathbb{R}_+^{n_t}$, there exists a distribution $F_X$ of $X$ such that $\tilde{X} = \mathsf{V}^{\mathrm{T}} X$ is uniformly distributed over $\mathrm{Box}(a)$. Moreover, the components $\tilde{X}_1, \ldots, \tilde{X}_{n_t}$ of $\tilde{X}$ are mutually independent, with $\tilde{X}_i$ uniformly distributed over $[-A_i, A_i]$, $i = 1, \ldots, n_t$.

Proof: Suppose that $X$ is uniformly distributed over $\mathsf{V}\,\mathrm{Box}(a)$; that is, the density of $X$ is of the form $f_X(x) = \frac{1}{\mathrm{Vol}(\mathrm{Box}(a))}$, $x \in \mathsf{V}\,\mathrm{Box}(a)$. Since $\mathsf{V}$ is orthogonal, we have $|\det(\mathsf{V})| = 1$ and, by the change-of-variables theorem, for $\tilde{x} \in \mathrm{Box}(a)$
$$f_{\tilde{X}}(\tilde{x}) = |\det(\mathsf{V})|\, f_X(\mathsf{V}\tilde{x}) = \frac{1}{\mathrm{Vol}(\mathrm{Box}(a))}.$$
Therefore, such a distribution of $X$ exists.

Theorem 10 (Lower Bounds with SVD Precoding). Let $\mathsf{H} \in \mathbb{R}^{n_r \times n_t}$ be fixed, $n_{\min} := \min(n_r, n_t)$, and $\mathcal{X} = \mathrm{Box}(a)$ for some $a = (A_1, \ldots, A_{n_t}) \in \mathbb{R}_+^{n_t}$. Furthermore, let $\sigma_i$, $i = 1, \ldots, n_{\min}$, be the $i$-th singular value of $\mathsf{H}$. Then,
$$C_{\mathrm{Jensen}}\big(\mathrm{Box}(a), \mathsf{H}\big) = \frac{n_{\min}}{2}\log\frac{2}{e} - \log\psi(\mathsf{H}, b), \qquad (32)$$
with
$$\psi(\mathsf{H}, b) := \min_{b \in \mathrm{Box}(a)} \prod_{i=1}^{n_{\min}} \varphi\big(\sigma_i^2 B_i^2\big),$$
where $b := (B_1, \ldots, B_{n_t})$ and $\varphi$ as defined in (20), and
$$C_{\mathrm{EPI}}\big(\mathrm{Box}(a), \mathsf{H}\big) = \frac{n_{\min}}{2}\log\left(1 + \frac{4\left(\prod_{i=1}^{n_{\min}} A_i \sigma_i\right)^{\frac{2}{n_{\min}}}}{2\pi e}\right). \qquad (33)$$

Proof: Performing the SVD, the expected value in Theorem 5 can be written as
$$\mathbb{E}\left[e^{-\frac{\|\mathsf{H}(X - \bar{X})\|^2}{4}}\right] = \mathbb{E}\left[e^{-\frac{\|\mathsf{U}\Lambda\mathsf{V}^{\mathrm{T}}(X - \bar{X})\|^2}{4}}\right] = \mathbb{E}\left[e^{-\frac{\|\Lambda\mathsf{V}^{\mathrm{T}}(X - \bar{X})\|^2}{4}}\right] = \mathbb{E}\left[e^{-\frac{\|\Lambda(\tilde{X} - \tilde{\bar{X}})\|^2}{4}}\right].$$
By Lemma 2, there exists a distribution $F_X$ such that the components of $\tilde{X}$ are independent and uniformly distributed. Since $\Lambda$ is a diagonal matrix, we can use Theorem 7 to arrive at (32). Note that, by Lemma 2, there exists a distribution on $X$ such that $\tilde{X}$ is uniform over $\mathrm{Box}(a) \subseteq \mathbb{R}^{n_t}$ and $\Lambda\tilde{X}$ is uniform over $\Lambda\,\mathrm{Box}(a) \subseteq \mathbb{R}^{n_{\min}}$, respectively. Therefore, by the EPI lower bound given in (8), we obtain
$$C_{\mathrm{EPI}}\big(\mathrm{Box}(a), \mathsf{H}\big) = \frac{n_{\min}}{2}\log\left(1 + \frac{2^{\frac{2}{n_{\min}} h(\Lambda\tilde{X})}}{2\pi e}\right) = \frac{n_{\min}}{2}\log\left(1 + \frac{\mathrm{Vol}\big(\Lambda\,\mathrm{Box}(a)\big)^{\frac{2}{n_{\min}}}}{2\pi e}\right) = \frac{n_{\min}}{2}\log\left(1 + \frac{4\left(\prod_{i=1}^{n_{\min}} A_i \sigma_i\right)^{\frac{2}{n_{\min}}}}{2\pi e}\right),$$
which is exactly the expression in (33). This concludes the proof.

[Fig. 3: Rate in bits/s/Hz versus amplitude constraint $A$ in dB, comparing the moment upper bound of Theorem 1 to the SVD-precoded Jensen and EPI lower bounds of Theorem 10, for a MIMO system with $n_t = 3$ and amplitude constraints $A_1 = A_2 = A_3 = A$ (i.e., $a = (A, A, A)$) and a fixed channel matrix $\mathsf{H}$.]

Remark 3. Notice that choosing the optimal $b$ for the lower bound (32) is an amplitude allocation problem, which is reminiscent of waterfilling in the average power constraint case. It would be interesting to study whether the bound in (32) is connected to what is called mercury/waterfilling in [18], [19].

In Fig. 3, the lower bounds of Theorem 10 are compared to the moment upper bound of Theorem 1 for the special case of a MIMO channel with $n_t = 3$. Similarly to the example presented in Fig. 1, the EPI lower bound performs well in the low amplitude regime, while Jensen's inequality lower bound performs well in the high amplitude regime.

We conclude this section by showing that for an arbitrary channel input space $\mathcal{X}$, in the large amplitude regime the capacity pre-log is given by $\min(n_r, n_t)$.

Theorem 11. Let $\mathcal{X}$ be arbitrary and $\mathsf{H} \in \mathbb{R}^{n_r \times n_t}$ fixed. Then,
$$\lim_{r_{\min}(\mathcal{X}) \to \infty} \frac{C(\mathcal{X}, \mathsf{H})}{\frac{1}{2}\log\left(1 + \frac{r_{\min}^2(\mathcal{X})}{2\pi e}\right)} = \min(n_r, n_t).$$

Proof: Notice that there always exist $a \in \mathbb{R}_+^{n_t}$ and $c \in \mathbb{R}_+$ such that $\mathrm{Box}(a) \subseteq \mathcal{X} \subseteq c\,\mathrm{Box}(a)$. Thus, without loss of generality, we can consider $\mathcal{X} = \mathrm{Box}(a)$, $a = (A, \ldots, A)$, for sufficiently large $A \in \mathbb{R}_+$. To prove the result, we therefore start with enlarging the constraint set of the bound in (5):
$$\mathrm{Box}\big(\mathsf{H}\,\mathrm{Box}(a)\big) \subseteq \mathcal{B}_0\big(r_{\max}(\mathsf{H}\,\mathrm{Box}(a))\big) \subseteq \mathcal{B}_0\big(r_{\max}(\mathsf{H}\mathcal{B}_0(\sqrt{n_t}\, A))\big) = \mathcal{B}_0\big(r_{\max}(\mathsf{U}\Lambda\mathsf{V}^{\mathrm{T}}\mathcal{B}_0(\sqrt{n_t}\, A))\big) = \mathcal{B}_0\big(r_{\max}(\Lambda\mathcal{B}_0(\sqrt{n_t}\, A))\big) \subseteq \mathcal{B}_0(\bar{r}),$$
where $\bar{r} := \sqrt{n_t}\, A \sum_{i=1}^{n_{\min}} \sigma_i$. Therefore, by using the upper bound in (5) with $\bar{a} := \bar{r}\,(1, \ldots, 1) \in \mathbb{R}_+^{n_{\min}}$, it follows that
$$C\big(\mathrm{Box}(a), \mathsf{H}\big) \le n_{\min} \log\left(1 + \frac{2\sqrt{n_t}\, A \sum_{i=1}^{n_{\min}} \sigma_i}{\sqrt{2\pi e}}\right),$$
and hence
$$\lim_{A \to \infty} \frac{C\big(\mathrm{Box}(a), \mathsf{H}\big)}{\frac{1}{2}\log\left(1 + \frac{A^2}{2\pi e}\right)} \le n_{\min} \lim_{A \to \infty} \frac{\log\left(1 + \frac{2\sqrt{n_t}\, A \sum_{i=1}^{n_{\min}} \sigma_i}{\sqrt{2\pi e}}\right)}{\frac{1}{2}\log\left(1 + \frac{A^2}{2\pi e}\right)} = n_{\min}.$$
Next, using the EPI lower bound in (33), we have that
$$\lim_{A \to \infty} \frac{C_{\mathrm{EPI}}\big(\mathrm{Box}(a), \Lambda\big)}{\frac{1}{2}\log\left(1 + \frac{A^2}{2\pi e}\right)} = \lim_{A \to \infty} \frac{\frac{n_{\min}}{2}\log\left(1 + \frac{4\left(A^{n_{\min}} \prod_{i=1}^{n_{\min}} \sigma_i\right)^{\frac{2}{n_{\min}}}}{2\pi e}\right)}{\frac{1}{2}\log\left(1 + \frac{A^2}{2\pi e}\right)} = n_{\min}.$$
This concludes the proof.

VI. CONCLUSION

In this work, we have focused on studying the capacity of MIMO systems with bounded channel input spaces. Several new upper and lower bounds have been proposed, and it has been shown that the lower and upper bounds are tight in the high amplitude regime. An interesting direction for future work is to determine the exact scaling in the massive MIMO regime, i.e., $n_{\min} \to \infty$. Another interesting future direction is to study generalizations of our techniques to MIMO wireless optical channels [20].

REFERENCES
[1] İ. E. Telatar, "Capacity of multi-antenna Gaussian channels," Europ. Trans. Telecommun., vol. 10, no. 6, pp. 585-595, Nov./Dec. 1999.
[2] J. G. Smith, "The information capacity of amplitude- and variance-constrained scalar Gaussian channels," Inf. Control, vol. 18, no. 3, pp. 203-219, Apr. 1971.
[3] S. Shamai (Shitz) and I. Bar-David, "The capacity of average and peak-power-limited quadrature Gaussian channels," IEEE Trans. Inf. Theory, vol. 41, no. 4, pp. 1060-1071, Jul. 1995.
[4] B. Rassouli and B. Clerckx, "On the capacity of vector Gaussian channels with bounded inputs," IEEE Trans. Inf. Theory, vol. 62, no. 12, Dec. 2016.
[5] N. Sharma and S. Shamai (Shitz), "Transition points in the capacity-achieving distribution for the peak-power limited AWGN and free-space optical intensity channels," Probl. Inf. Transm., vol. 46, no. 4, 2010.
[6] A. L. McKellips, "Simple tight bounds on capacity for the peak-limited discrete-time channel," in Proc. IEEE Int. Symp. Inf. Theory (ISIT), Chicago, IL, Jul. 2004.
[7] A. Thangaraj, G. Kramer, and G. Böcherer, "Capacity bounds for discrete-time, amplitude-constrained, additive white Gaussian noise channels," 2015. [Online].
[8] B. Rassouli and B. Clerckx, "An upper bound for the capacity of amplitude-constrained scalar AWGN channel," IEEE Commun. Lett., vol. 20, no. 10, Oct. 2016.
[9] A. ElMoslimany and T. M. Duman, "On the capacity of multiple-antenna systems and parallel Gaussian channels with amplitude-limited inputs," IEEE Trans. Commun., vol. 64, no. 7, Jul. 2016.
[10] M. Vu, "MISO capacity with per-antenna power constraint," IEEE Trans. Commun., vol. 59, no. 5, May 2011.
[11] D. Tuninetti, "On the capacity of the AWGN MIMO channel under per-antenna power constraints," in Proc. IEEE Int. Conf. Commun. (ICC), Sydney, Australia, Jun. 2014.
[12] S. Loyka, "The capacity of Gaussian MIMO channels under total and per-antenna power constraints," IEEE Trans. Commun., vol. 65, no. 3, Mar. 2017.
[13] A. Dytso, M. Goldenbaum, H. V. Poor, and S. Shamai (Shitz), "A generalized Ozarow-Wyner capacity bound with applications," in Proc. IEEE Int. Symp. Inf. Theory (ISIT), Aachen, Germany, Jun. 2017, to appear.
[14] H. S. Witsenhausen, "Some aspects of convexity useful in information theory," IEEE Trans. Inf. Theory, vol. 26, no. 3, pp. 265-271, May 1980.
[15] T. H. Chan, S. Hranilovic, and F. R. Kschischang, "Capacity-achieving probability measure for conditionally Gaussian channels with bounded inputs," IEEE Trans. Inf. Theory, vol. 51, no. 6, Mar. 2005.
[16] J. Sommerfeld, I. Bjelaković, and H. Boche, "On the boundedness of the support of optimal input measures for Rayleigh fading channels," in Proc. IEEE Int. Symp. Inf. Theory (ISIT), Toronto, Canada, Jul. 2008.
[17] A. Dytso, D. Tuninetti, and N. Devroye, "Interference as noise: Friend or foe?" IEEE Trans. Inf. Theory, vol. 62, no. 6, Jun. 2016.
[18] A. Lozano, A. M. Tulino, and S. Verdú, "Optimum power allocation for parallel Gaussian channels with arbitrary input distributions," IEEE Trans. Inf. Theory, vol. 52, no. 7, Jul. 2006.
[19] F. Pérez-Cruz, M. R. Rodrigues, and S. Verdú, "MIMO Gaussian channels with arbitrary inputs: Optimal precoding and power allocation," IEEE Trans. Inf. Theory, vol. 56, no. 3, Mar. 2010.
[20] S. M. Moser, M. Mylonakis, L. Wang, and M. Wigger, "Capacity results for MIMO wireless optical communication," 2017, submitted to Proc. IEEE Int. Symp. Inf. Theory (ISIT). [Online].
Energy State Amplification in an Energy Harvesting Communication System Omur Ozel Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland College Park, MD 20742 omur@umd.edu
More informationMultiple Antennas in Wireless Communications
Multiple Antennas in Wireless Communications Luca Sanguinetti Department of Information Engineering Pisa University luca.sanguinetti@iet.unipi.it April, 2009 Luca Sanguinetti (IET) MIMO April, 2009 1 /
More informationA Single-letter Upper Bound for the Sum Rate of Multiple Access Channels with Correlated Sources
A Single-letter Upper Bound for the Sum Rate of Multiple Access Channels with Correlated Sources Wei Kang Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College
More informationThe Capacity Region of the Gaussian Cognitive Radio Channels at High SNR
The Capacity Region of the Gaussian Cognitive Radio Channels at High SNR 1 Stefano Rini, Daniela Tuninetti and Natasha Devroye srini2, danielat, devroye @ece.uic.edu University of Illinois at Chicago Abstract
More informationarxiv: v1 [cs.it] 4 Jun 2018
State-Dependent Interference Channel with Correlated States 1 Yunhao Sun, 2 Ruchen Duan, 3 Yingbin Liang, 4 Shlomo Shamai (Shitz) 5 Abstract arxiv:180600937v1 [csit] 4 Jun 2018 This paper investigates
More informationDispersion of the Gilbert-Elliott Channel
Dispersion of the Gilbert-Elliott Channel Yury Polyanskiy Email: ypolyans@princeton.edu H. Vincent Poor Email: poor@princeton.edu Sergio Verdú Email: verdu@princeton.edu Abstract Channel dispersion plays
More informationOn the Optimality of Multiuser Zero-Forcing Precoding in MIMO Broadcast Channels
On the Optimality of Multiuser Zero-Forcing Precoding in MIMO Broadcast Channels Saeed Kaviani and Witold A. Krzymień University of Alberta / TRLabs, Edmonton, Alberta, Canada T6G 2V4 E-mail: {saeed,wa}@ece.ualberta.ca
More informationUpper Bounds on MIMO Channel Capacity with Channel Frobenius Norm Constraints
Upper Bounds on IO Channel Capacity with Channel Frobenius Norm Constraints Zukang Shen, Jeffrey G. Andrews, Brian L. Evans Wireless Networking Communications Group Department of Electrical Computer Engineering
More informationOn Gaussian Interference Channels with Mixed Gaussian and Discrete Inputs
On Gaussian nterference Channels with Mixed Gaussian and Discrete nputs Alex Dytso Natasha Devroye and Daniela Tuninetti University of llinois at Chicago Chicago L 60607 USA Email: odytso devroye danielat
More informationOn the Secrecy Capacity of the Z-Interference Channel
On the Secrecy Capacity of the Z-Interference Channel Ronit Bustin Tel Aviv University Email: ronitbustin@post.tau.ac.il Mojtaba Vaezi Princeton University Email: mvaezi@princeton.edu Rafael F. Schaefer
More informationUSING multiple antennas has been shown to increase the
IEEE TRANSACTIONS ON COMMUNICATIONS, VOL. 55, NO. 1, JANUARY 2007 11 A Comparison of Time-Sharing, DPC, and Beamforming for MIMO Broadcast Channels With Many Users Masoud Sharif, Member, IEEE, and Babak
More informationCapacity Pre-Log of SIMO Correlated Block-Fading Channels
Capacity Pre-Log of SIMO Correlated Block-Fading Channels Wei Yang, Giuseppe Durisi, Veniamin I. Morgenshtern, Erwin Riegler 3 Chalmers University of Technology, 496 Gothenburg, Sweden ETH Zurich, 809
More informationLecture 5: Antenna Diversity and MIMO Capacity Theoretical Foundations of Wireless Communications 1. Overview. CommTh/EES/KTH
: Antenna Diversity and Theoretical Foundations of Wireless Communications Wednesday, May 4, 206 9:00-2:00, Conference Room SIP Textbook: D. Tse and P. Viswanath, Fundamentals of Wireless Communication
More informationGeneralized Writing on Dirty Paper
Generalized Writing on Dirty Paper Aaron S. Cohen acohen@mit.edu MIT, 36-689 77 Massachusetts Ave. Cambridge, MA 02139-4307 Amos Lapidoth lapidoth@isi.ee.ethz.ch ETF E107 ETH-Zentrum CH-8092 Zürich, Switzerland
More informationTitle. Author(s)Tsai, Shang-Ho. Issue Date Doc URL. Type. Note. File Information. Equal Gain Beamforming in Rayleigh Fading Channels
Title Equal Gain Beamforming in Rayleigh Fading Channels Author(s)Tsai, Shang-Ho Proceedings : APSIPA ASC 29 : Asia-Pacific Signal Citationand Conference: 688-691 Issue Date 29-1-4 Doc URL http://hdl.handle.net/2115/39789
More informationto be almost surely min{n 0, 3
1 The DoF Region of the Three-Receiver MIMO Broadcast Channel with Side Information and Its Relation to Index Coding Capacity Behzad Asadi, Lawrence Ong, and Sarah J Johnson arxiv:160803377v1 [csit] 11
More informationDirty Paper Coding vs. TDMA for MIMO Broadcast Channels
TO APPEAR IEEE INTERNATIONAL CONFERENCE ON COUNICATIONS, JUNE 004 1 Dirty Paper Coding vs. TDA for IO Broadcast Channels Nihar Jindal & Andrea Goldsmith Dept. of Electrical Engineering, Stanford University
More informationELEC E7210: Communication Theory. Lecture 10: MIMO systems
ELEC E7210: Communication Theory Lecture 10: MIMO systems Matrix Definitions, Operations, and Properties (1) NxM matrix a rectangular array of elements a A. an 11 1....... a a 1M. NM B D C E ermitian transpose
More informationLecture 2. Capacity of the Gaussian channel
Spring, 207 5237S, Wireless Communications II 2. Lecture 2 Capacity of the Gaussian channel Review on basic concepts in inf. theory ( Cover&Thomas: Elements of Inf. Theory, Tse&Viswanath: Appendix B) AWGN
More informationSingle-User MIMO systems: Introduction, capacity results, and MIMO beamforming
Single-User MIMO systems: Introduction, capacity results, and MIMO beamforming Master Universitario en Ingeniería de Telecomunicación I. Santamaría Universidad de Cantabria Contents Introduction Multiplexing,
More informationDegrees of Freedom Region of the Gaussian MIMO Broadcast Channel with Common and Private Messages
Degrees of Freedom Region of the Gaussian MIMO Broadcast hannel with ommon and Private Messages Ersen Ekrem Sennur Ulukus Department of Electrical and omputer Engineering University of Maryland, ollege
More informationSum-Rate Capacity of Poisson MIMO Multiple-Access Channels
1 Sum-Rate Capacity of Poisson MIMO Multiple-Access Channels Ain-ul-Aisha, Lifeng Lai, Yingbin Liang and Shlomo Shamai (Shitz Abstract In this paper, we analyze the sum-rate capacity of two-user Poisson
More informationThreshold Optimization for Capacity-Achieving Discrete Input One-Bit Output Quantization
Threshold Optimization for Capacity-Achieving Discrete Input One-Bit Output Quantization Rudolf Mathar Inst. for Theoretical Information Technology RWTH Aachen University D-556 Aachen, Germany mathar@ti.rwth-aachen.de
More informationi.i.d. Mixed Inputs and Treating Interference as Noise are gdof Optimal for the Symmetric Gaussian Two-user Interference Channel
iid Mixed nputs and Treating nterference as Noise are gdof Optimal for the Symmetric Gaussian Two-user nterference Channel Alex Dytso Daniela Tuninetti and Natasha Devroye University of llinois at Chicago
More informationConstellation Shaping for Communication Channels with Quantized Outputs
Constellation Shaping for Communication Channels with Quantized Outputs Chandana Nannapaneni, Matthew C. Valenti, and Xingyu Xiang Lane Department of Computer Science and Electrical Engineering West Virginia
More informationCapacity of Block Rayleigh Fading Channels Without CSI
Capacity of Block Rayleigh Fading Channels Without CSI Mainak Chowdhury and Andrea Goldsmith, Fellow, IEEE Department of Electrical Engineering, Stanford University, USA Email: mainakch@stanford.edu, andrea@wsl.stanford.edu
More informationLecture 7 MIMO Communica2ons
Wireless Communications Lecture 7 MIMO Communica2ons Prof. Chun-Hung Liu Dept. of Electrical and Computer Engineering National Chiao Tung University Fall 2014 1 Outline MIMO Communications (Chapter 10
More informationA Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels
A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels Mehdi Mohseni Department of Electrical Engineering Stanford University Stanford, CA 94305, USA Email: mmohseni@stanford.edu
More informationDuality, Achievable Rates, and Sum-Rate Capacity of Gaussian MIMO Broadcast Channels
2658 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 49, NO 10, OCTOBER 2003 Duality, Achievable Rates, and Sum-Rate Capacity of Gaussian MIMO Broadcast Channels Sriram Vishwanath, Student Member, IEEE, Nihar
More informationThe Capacity Region of the Gaussian MIMO Broadcast Channel
0-0 The Capacity Region of the Gaussian MIMO Broadcast Channel Hanan Weingarten, Yossef Steinberg and Shlomo Shamai (Shitz) Outline Problem statement Background and preliminaries Capacity region of the
More informationJoint FEC Encoder and Linear Precoder Design for MIMO Systems with Antenna Correlation
Joint FEC Encoder and Linear Precoder Design for MIMO Systems with Antenna Correlation Chongbin Xu, Peng Wang, Zhonghao Zhang, and Li Ping City University of Hong Kong 1 Outline Background Mutual Information
More informationOptimal Power Control in Decentralized Gaussian Multiple Access Channels
1 Optimal Power Control in Decentralized Gaussian Multiple Access Channels Kamal Singh Department of Electrical Engineering Indian Institute of Technology Bombay. arxiv:1711.08272v1 [eess.sp] 21 Nov 2017
More informationECE 4400:693 - Information Theory
ECE 4400:693 - Information Theory Dr. Nghi Tran Lecture 8: Differential Entropy Dr. Nghi Tran (ECE-University of Akron) ECE 4400:693 Lecture 1 / 43 Outline 1 Review: Entropy of discrete RVs 2 Differential
More informationFeedback Capacity of the Gaussian Interference Channel to Within Bits: the Symmetric Case
1 arxiv:0901.3580v1 [cs.it] 23 Jan 2009 Feedback Capacity of the Gaussian Interference Channel to Within 1.7075 Bits: the Symmetric Case Changho Suh and David Tse Wireless Foundations in the Department
More informationNOMA: An Information Theoretic Perspective
NOMA: An Information Theoretic Perspective Peng Xu, Zhiguo Ding, Member, IEEE, Xuchu Dai and H. Vincent Poor, Fellow, IEEE arxiv:54.775v2 cs.it] 2 May 25 Abstract In this letter, the performance of non-orthogonal
More informationCapacity Bounds for Power- and Band-Limited Wireless Infrared Channels Corrupted by Gaussian Noise
40th Annual Allerton Conference on Communications, Control, and Computing, U.Illinois at Urbana-Champaign, Oct. 2-4, 2002, Allerton House, Monticello, USA. Capacity Bounds for Power- and Band-Limited Wireless
More informationUpper Bounds on the Capacity of Binary Intermittent Communication
Upper Bounds on the Capacity of Binary Intermittent Communication Mostafa Khoshnevisan and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame, Indiana 46556 Email:{mhoshne,
More informationEstimation of the Capacity of Multipath Infrared Channels
Estimation of the Capacity of Multipath Infrared Channels Jeffrey B. Carruthers Department of Electrical and Computer Engineering Boston University jbc@bu.edu Sachin Padma Department of Electrical and
More informationCapacity of the Discrete Memoryless Energy Harvesting Channel with Side Information
204 IEEE International Symposium on Information Theory Capacity of the Discrete Memoryless Energy Harvesting Channel with Side Information Omur Ozel, Kaya Tutuncuoglu 2, Sennur Ulukus, and Aylin Yener
More informationIEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 2, FEBRUARY Uplink Downlink Duality Via Minimax Duality. Wei Yu, Member, IEEE (1) (2)
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 2, FEBRUARY 2006 361 Uplink Downlink Duality Via Minimax Duality Wei Yu, Member, IEEE Abstract The sum capacity of a Gaussian vector broadcast channel
More informationCapacity of the Discrete-Time AWGN Channel Under Output Quantization
Capacity of the Discrete-Time AWGN Channel Under Output Quantization Jaspreet Singh, Onkar Dabeer and Upamanyu Madhow Abstract We investigate the limits of communication over the discrete-time Additive
More informationVector Channel Capacity with Quantized Feedback
Vector Channel Capacity with Quantized Feedback Sudhir Srinivasa and Syed Ali Jafar Electrical Engineering and Computer Science University of California Irvine, Irvine, CA 9697-65 Email: syed@ece.uci.edu,
More informationOptimal Natural Encoding Scheme for Discrete Multiplicative Degraded Broadcast Channels
Optimal Natural Encoding Scheme for Discrete Multiplicative Degraded Broadcast Channels Bike ie, Student Member, IEEE and Richard D. Wesel, Senior Member, IEEE Abstract Certain degraded broadcast channels
More informationStability and Sensitivity of the Capacity in Continuous Channels. Malcolm Egan
Stability and Sensitivity of the Capacity in Continuous Channels Malcolm Egan Univ. Lyon, INSA Lyon, INRIA 2019 European School of Information Theory April 18, 2019 1 / 40 Capacity of Additive Noise Models
More informationOn the Shamai-Laroia Approximation for the Information Rate of the ISI Channel
On the Shamai-Laroia Approximation for the Information Rate of the ISI Channel Yair Carmon and Shlomo Shamai (Shitz) Department of Electrical Engineering, Technion - Israel Institute of Technology 2014
More informationThe Capacity of the Semi-Deterministic Cognitive Interference Channel and its Application to Constant Gap Results for the Gaussian Channel
The Capacity of the Semi-Deterministic Cognitive Interference Channel and its Application to Constant Gap Results for the Gaussian Channel Stefano Rini, Daniela Tuninetti, and Natasha Devroye Department
More informationLecture 6 Channel Coding over Continuous Channels
Lecture 6 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 9, 015 1 / 59 I-Hsiang Wang IT Lecture 6 We have
More informationJoint Multi-Cell Processing for Downlink Channels with Limited-Capacity Backhaul
Joint Multi-Cell Processing for Downlink Channels with Limited-Capacity Backhaul Shlomo Shamai (Shitz) Department of Electrical Engineering Technion Haifa, 3000, Israel sshlomo@ee.technion.ac.il Osvaldo
More informationOptimal Data and Training Symbol Ratio for Communication over Uncertain Channels
Optimal Data and Training Symbol Ratio for Communication over Uncertain Channels Ather Gattami Ericsson Research Stockholm, Sweden Email: athergattami@ericssoncom arxiv:50502997v [csit] 2 May 205 Abstract
More informationLecture 9: Diversity-Multiplexing Tradeoff Theoretical Foundations of Wireless Communications 1. Overview. Ragnar Thobaben CommTh/EES/KTH
: Diversity-Multiplexing Tradeoff Theoretical Foundations of Wireless Communications 1 Rayleigh Wednesday, June 1, 2016 09:15-12:00, SIP 1 Textbook: D. Tse and P. Viswanath, Fundamentals of Wireless Communication
More informationCHANNEL FEEDBACK QUANTIZATION METHODS FOR MISO AND MIMO SYSTEMS
CHANNEL FEEDBACK QUANTIZATION METHODS FOR MISO AND MIMO SYSTEMS June Chul Roh and Bhaskar D Rao Department of Electrical and Computer Engineering University of California, San Diego La Jolla, CA 9293 47,
More informationSchur-convexity of the Symbol Error Rate in Correlated MIMO Systems with Precoding and Space-time Coding
Schur-convexity of the Symbol Error Rate in Correlated MIMO Systems with Precoding and Space-time Coding RadioVetenskap och Kommunikation (RVK 08) Proceedings of the twentieth Nordic Conference on Radio
More informationNUMERICAL COMPUTATION OF THE CAPACITY OF CONTINUOUS MEMORYLESS CHANNELS
NUMERICAL COMPUTATION OF THE CAPACITY OF CONTINUOUS MEMORYLESS CHANNELS Justin Dauwels Dept. of Information Technology and Electrical Engineering ETH, CH-8092 Zürich, Switzerland dauwels@isi.ee.ethz.ch
More informationSolutions to Homework Set #4 Differential Entropy and Gaussian Channel
Solutions to Homework Set #4 Differential Entropy and Gaussian Channel 1. Differential entropy. Evaluate the differential entropy h(x = f lnf for the following: (a Find the entropy of the exponential density
More informationTransmit Directions and Optimality of Beamforming in MIMO-MAC with Partial CSI at the Transmitters 1
2005 Conference on Information Sciences and Systems, The Johns Hopkins University, March 6 8, 2005 Transmit Directions and Optimality of Beamforming in MIMO-MAC with Partial CSI at the Transmitters Alkan
More informationWhat is the Value of Joint Processing of Pilots and Data in Block-Fading Channels?
What is the Value of Joint Processing of Pilots Data in Block-Fading Channels? Nihar Jindal University of Minnesota Minneapolis, MN 55455 Email: nihar@umn.edu Angel Lozano Universitat Pompeu Fabra Barcelona
More informationMinimum Feedback Rates for Multi-Carrier Transmission With Correlated Frequency Selective Fading
Minimum Feedback Rates for Multi-Carrier Transmission With Correlated Frequency Selective Fading Yakun Sun and Michael L. Honig Department of ECE orthwestern University Evanston, IL 60208 Abstract We consider
More informationInterference, Cooperation and Connectivity A Degrees of Freedom Perspective
Interference, Cooperation and Connectivity A Degrees of Freedom Perspective Chenwei Wang, Syed A. Jafar, Shlomo Shamai (Shitz) and Michele Wigger EECS Dept., University of California Irvine, Irvine, CA,
More informationTowards control over fading channels
Towards control over fading channels Paolo Minero, Massimo Franceschetti Advanced Network Science University of California San Diego, CA, USA mail: {minero,massimo}@ucsd.edu Invited Paper) Subhrakanti
More informationSuperposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels
Superposition Encoding and Partial Decoding Is Optimal for a Class of Z-interference Channels Nan Liu and Andrea Goldsmith Department of Electrical Engineering Stanford University, Stanford CA 94305 Email:
More informationELEC546 Review of Information Theory
ELEC546 Review of Information Theory Vincent Lau 1/1/004 1 Review of Information Theory Entropy: Measure of uncertainty of a random variable X. The entropy of X, H(X), is given by: If X is a discrete random
More informationLecture 9: Diversity-Multiplexing Tradeoff Theoretical Foundations of Wireless Communications 1
: Diversity-Multiplexing Tradeoff Theoretical Foundations of Wireless Communications 1 Rayleigh Friday, May 25, 2018 09:00-11:30, Kansliet 1 Textbook: D. Tse and P. Viswanath, Fundamentals of Wireless
More informationUnder sum power constraint, the capacity of MIMO channels
IEEE TRANSACTIONS ON COMMUNICATIONS, VOL 6, NO 9, SEPTEMBER 22 242 Iterative Mode-Dropping for the Sum Capacity of MIMO-MAC with Per-Antenna Power Constraint Yang Zhu and Mai Vu Abstract We propose an
More informationChannel Dispersion and Moderate Deviations Limits for Memoryless Channels
Channel Dispersion and Moderate Deviations Limits for Memoryless Channels Yury Polyanskiy and Sergio Verdú Abstract Recently, Altug and Wagner ] posed a question regarding the optimal behavior of the probability
More informationCommunication Over MIMO Broadcast Channels Using Lattice-Basis Reduction 1
Communication Over MIMO Broadcast Channels Using Lattice-Basis Reduction 1 Mahmoud Taherzadeh, Amin Mobasher, and Amir K. Khandani Coding & Signal Transmission Laboratory Department of Electrical & Computer
More informationMorning Session Capacity-based Power Control. Department of Electrical and Computer Engineering University of Maryland
Morning Session Capacity-based Power Control Şennur Ulukuş Department of Electrical and Computer Engineering University of Maryland So Far, We Learned... Power control with SIR-based QoS guarantees Suitable
More informationOn Compound Channels With Side Information at the Transmitter
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 52, NO 4, APRIL 2006 1745 On Compound Channels With Side Information at the Transmitter Patrick Mitran, Student Member, IEEE, Natasha Devroye, Student Member,
More informationOn the Low-SNR Capacity of Phase-Shift Keying with Hard-Decision Detection
On the Low-SNR Capacity of Phase-Shift Keying with Hard-Decision Detection ustafa Cenk Gursoy Department of Electrical Engineering University of Nebraska-Lincoln, Lincoln, NE 68588 Email: gursoy@engr.unl.edu
More informationAn Alternative Proof for the Capacity Region of the Degraded Gaussian MIMO Broadcast Channel
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 58, NO. 4, APRIL 2012 2427 An Alternative Proof for the Capacity Region of the Degraded Gaussian MIMO Broadcast Channel Ersen Ekrem, Student Member, IEEE,
More informationEE 4TM4: Digital Communications II Scalar Gaussian Channel
EE 4TM4: Digital Communications II Scalar Gaussian Channel I. DIFFERENTIAL ENTROPY Let X be a continuous random variable with probability density function (pdf) f(x) (in short X f(x)). The differential
More informationBinary Transmissions over Additive Gaussian Noise: A Closed-Form Expression for the Channel Capacity 1
5 Conference on Information Sciences and Systems, The Johns Hopkins University, March 6 8, 5 inary Transmissions over Additive Gaussian Noise: A Closed-Form Expression for the Channel Capacity Ahmed O.
More informationLecture 14 February 28
EE/Stats 376A: Information Theory Winter 07 Lecture 4 February 8 Lecturer: David Tse Scribe: Sagnik M, Vivek B 4 Outline Gaussian channel and capacity Information measures for continuous random variables
More informationIEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 61, NO. 3, MARCH
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 6, NO., MARCH 05 57 On the Two-User Interference Channel With Lack of Knowledge of the Interference Codebook at One Receiver Alex Dytso, Daniela Tuninetti,
More informationECE Information theory Final
ECE 776 - Information theory Final Q1 (1 point) We would like to compress a Gaussian source with zero mean and variance 1 We consider two strategies In the first, we quantize with a step size so that the
More informationSpace-Time Coding for Multi-Antenna Systems
Space-Time Coding for Multi-Antenna Systems ECE 559VV Class Project Sreekanth Annapureddy vannapu2@uiuc.edu Dec 3rd 2007 MIMO: Diversity vs Multiplexing Multiplexing Diversity Pictures taken from lectures
More informationSteiner s formula and large deviations theory
Steiner s formula and large deviations theory Venkat Anantharam EECS Department University of California, Berkeley May 19, 2015 Simons Conference on Networks and Stochastic Geometry Blanton Museum of Art
More informationOn the Duality of Gaussian Multiple-Access and Broadcast Channels
On the Duality of Gaussian ultiple-access and Broadcast Channels Xiaowei Jin I. INTODUCTION Although T. Cover has been pointed out in [] that one would have expected a duality between the broadcast channel(bc)
More informationTight Lower Bounds on the Ergodic Capacity of Rayleigh Fading MIMO Channels
Tight Lower Bounds on the Ergodic Capacity of Rayleigh Fading MIMO Channels Özgür Oyman ), Rohit U. Nabar ), Helmut Bölcskei 2), and Arogyaswami J. Paulraj ) ) Information Systems Laboratory, Stanford
More informationSoft Covering with High Probability
Soft Covering with High Probability Paul Cuff Princeton University arxiv:605.06396v [cs.it] 20 May 206 Abstract Wyner s soft-covering lemma is the central analysis step for achievability proofs of information
More informationCapacity optimization for Rician correlated MIMO wireless channels
Capacity optimization for Rician correlated MIMO wireless channels Mai Vu, and Arogyaswami Paulraj Information Systems Laboratory, Department of Electrical Engineering Stanford University, Stanford, CA
More informationLinear Precoding for Mutual Information Maximization in MIMO Systems
Linear Precoding for Mutual Information Maximization in MIMO Systems Meritxell Lamarca,2 Signal Theory and Communications Department, Technical University of Catalonia, Barcelona (Spain) 2 ECE Department,
More informationPracticable MIMO Capacity in Ideal Channels
Practicable MIMO Capacity in Ideal Channels S. Amir Mirtaheri,Rodney G. Vaughan School of Engineering Science, Simon Fraser University, British Columbia, V5A 1S6 Canada Abstract The impact of communications
More informationOn Capacity Under Received-Signal Constraints
On Capacity Under Received-Signal Constraints Michael Gastpar Dept. of EECS, University of California, Berkeley, CA 9470-770 gastpar@berkeley.edu Abstract In a world where different systems have to share
More informationChapter 4: Continuous channel and its capacity
meghdadi@ensil.unilim.fr Reference : Elements of Information Theory by Cover and Thomas Continuous random variable Gaussian multivariate random variable AWGN Band limited channel Parallel channels Flat
More information