Multiuser Capacity in Block Fading Channel


April 2003

1 Introduction and Model

We use a block-fading model with coherence interval $T$, in which $M$ independent users simultaneously transmit to a single receiver equipped with $N$ antennas in a flat-fading environment. Each user has sole access to one of the $M$ transmit antennas, and nobody (neither the transmitters nor the receiver) has any channel state information (CSI). The fading is described by an $M \times N$ complex-valued propagation matrix $H$, which remains constant for $T$ symbol periods, after which it jumps to a new independent value for another $T$ symbols, and so on. During a coherence interval, the $M$ users collectively transmit a $T \times M$ complex matrix $S$, whose columns, representing the different users, are statistically independent, and the receiver records a $T \times N$ complex matrix $X$. The basic equation describing this channel is

$$X = \sqrt{\frac{\rho}{M}}\, S H + W,$$

where $W$ is a $T \times N$ matrix of additive receiver noise whose components are independent, zero-mean, unit-variance complex Gaussian ($\mathcal{CN}(0,1)$). The components of $H$ are assumed independent and zero-mean with unit variance. The independence of the columns of $S$ is necessary if the users are to act with no cooperation among themselves. We enforce the expected power constraint

$$\operatorname{tr} E\{S S^\dagger\} = T M.$$

This constraint, combined with the normalization $1/\sqrt{M}$, implies that the total transmitted power remains constant as the number of users changes, and $\rho$ represents the expected SNR at each receive antenna.
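
As a quick numerical illustration (my addition, not part of the original note), the following Python sketch draws coherence blocks from this model and checks the power normalization; the dimensions, the scaling $\sqrt{\rho/M}$, and the targets $TM$ and $\rho+1$ follow the equations above, while the i.i.d. Gaussian choice for $S$ and the specific parameter values are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
T, M, N, rho = 8, 4, 6, 10.0   # coherence interval, users, receive antennas, expected SNR

def crandn(*shape):
    """i.i.d. CN(0,1) entries."""
    return (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

def one_block():
    """One coherence block of the model X = sqrt(rho/M) S H + W."""
    S = crandn(T, M)   # placeholder input: independent columns, E{tr S S^H} = T*M
    H = crandn(M, N)   # propagation matrix, constant over the block
    W = crandn(T, N)   # additive receiver noise
    return np.sqrt(rho / M) * S @ H + W, S

blocks = [one_block() for _ in range(2000)]
print("E{tr S S^H} ≈", np.mean([np.trace(S @ S.conj().T).real for _, S in blocks]), "target:", T * M)
# Average received power per antenna element is rho (signal) + 1 (noise), so rho is the per-antenna SNR.
print("E{|X|^2}    ≈", np.mean([np.abs(X) ** 2 for X, _ in blocks]), "target:", rho + 1.0)
```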

We wish to choose the joint probability density of the components of $S$, subject to the independence of its $M$ columns and to the power constraint, so as to maximize the mutual information when no CSI is available to anyone (neither the receiver nor the transmitters),

$$I(X; S) := E\left\{\log \frac{p(X \mid S)}{p(X)}\right\}.$$

This maximization yields the total throughput, or capacity. It is convenient to assume Rayleigh fading, where the independent components of $H$ are distributed as $\mathcal{CN}(0,1)$, although this assumption can be relaxed for some of the asymptotic results. For Rayleigh fading, the conditional density takes the form

$$p(X \mid S) = \frac{\exp\left(-\operatorname{tr}\{X^\dagger \Lambda^{-1} X\}\right)}{\pi^{TN} \det^{N} \Lambda}, \qquad (1.1)$$

where $\Lambda = I_T + (\rho/M)\, S S^\dagger$ and $I_T$ denotes the $T \times T$ identity matrix.
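
For concreteness, here is a small Python sketch (my own illustration) that evaluates the log of the conditional density (1.1); it relies on the fact that, given $S$, the columns of $X$ are i.i.d. $\mathcal{CN}(0, \Lambda)$ with $\Lambda = I_T + (\rho/M) S S^\dagger$. All parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
T, M, N, rho = 6, 3, 2, 5.0

def crandn(*shape):
    return (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

def log_p_X_given_S(X, S, rho):
    """log p(X|S) from (1.1), with Lambda = I_T + (rho/M) S S^H."""
    T, M = S.shape
    N = X.shape[1]
    Lam = np.eye(T) + (rho / M) * S @ S.conj().T
    _, logdet = np.linalg.slogdet(Lam)
    quad = np.trace(X.conj().T @ np.linalg.solve(Lam, X)).real   # tr{X^H Lam^{-1} X}
    return -quad - N * logdet - T * N * np.log(np.pi)

# Draw one block from the model and evaluate the noncoherent likelihood
S = crandn(T, M)
X = np.sqrt(rho / M) * S @ crandn(M, N) + crandn(T, N)
print("log p(X|S) =", log_p_X_given_S(X, S, rho))
```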

2 Known Results about the Single-User Channel

2.1 Coherent channel

If the receiver somehow knew the random propagation coefficients, the capacity would be greater than in the case of interest, where the receiver does not know them. Telatar [3] computes this perfect-knowledge capacity for the case $T = 1$; it is straightforward to extend his analysis to $T > 1$. The perfect-knowledge fading link is completely described by the conditional probability density $p(X, H \mid S) = p(X \mid H, S)\, p(H)$, and the perfect-knowledge capacity is obtained by maximizing the mutual information between $(X, H)$ and $S$ with respect to $p(S)$:

$$I(X, H; S) = E\left\{\log \frac{p(X, H \mid S)}{p(X, H)}\right\} = E\left\{\log \frac{p(X \mid H, S)}{p(X \mid H)}\right\} = E\left\{E\left\{\log \frac{p(X \mid H, S)}{p(X \mid H)} \,\Big|\, H\right\}\right\}.$$

The inner expectation, conditioned on $H$, is simply the mutual information of the classical additive Gaussian noise channel and is maximized by making the components of $S$ independent $\mathcal{CN}(0,1)$. The resulting perfect-knowledge capacity is

$$C = E\left\{\log\det\left[I_N + \frac{\rho}{M} H^\dagger H\right]\right\}.$$

As the number of transmit antennas grows, this capacity approaches

$$\lim_{M \to \infty} C = N \log(1 + \rho).$$

Using the identity $\det[I_N + \frac{\rho}{M} H^\dagger H] = \det[I_M + \frac{\rho}{M} H H^\dagger]$, we can derive the corresponding limiting result for a large number of receive antennas,

$$\lim_{N \to \infty} \left[\, C - M \log\left(1 + \frac{\rho N}{M}\right)\right] = 0.$$

2.2 Non-coherent channel

The following lemma states the Cholesky factorization:

Lemma 2.1. For a positive definite matrix $A \in \mathbb{C}^{T \times T}$ there is a factorization $A = L L^\dagger$, where $L$ is a lower triangular matrix.

Theorem 2.1. [1] For any coherence interval $T$ and any number of receiver antennas, the capacity obtained with $M > T$ transmitter antennas is the same as the capacity obtained with $M = T$ transmitter antennas.

Proof. Suppose that a particular joint probability density of the elements of $S S^\dagger$ achieves capacity with $M > T$ antennas. We can perform the Cholesky factorization $S S^\dagger = L L^\dagger$, where $L$ is a $T \times T$ lower triangular matrix. Using $T$ transmitter antennas with a signal matrix whose joint probability density equals that of $L$, we achieve the same probability density on $S S^\dagger$, and hence the same channel conditional density (1.1). Moreover, if $S$ satisfies the power constraint, then so does $L$.
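
The perfect-knowledge capacity and its two limits are easy to check numerically. The following Monte Carlo sketch (my addition, with arbitrary parameter choices) estimates $C = E\{\log\det[I_N + \frac{\rho}{M} H^\dagger H]\}$ in nats and compares it with $N\log(1+\rho)$ for large $M$ and with $M\log(1+\rho N/M)$ for large $N$.

```python
import numpy as np

rng = np.random.default_rng(2)

def crandn(*shape):
    return (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

def coherent_capacity(M, N, rho, trials=2000):
    """Monte Carlo estimate of E{log det[I_N + (rho/M) H^H H]}, in nats per channel use."""
    vals = []
    for _ in range(trials):
        H = crandn(M, N)
        G = np.eye(N) + (rho / M) * H.conj().T @ H
        vals.append(np.linalg.slogdet(G)[1])
    return np.mean(vals)

rho = 10.0
N, M = 2, 64
print("C(M=64, N=2)  ≈", coherent_capacity(M, N, rho))
print("N log(1+rho)   =", N * np.log(1 + rho))              # large-M limit

N, M = 64, 2
print("C(M=2, N=64)  ≈", coherent_capacity(M, N, rho))
print("M log(1+rho*N/M) =", M * np.log(1 + rho * N / M))    # large-N limit
```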

Definition 2.1. A $T \times T$ unitary matrix $\Phi$ is called isotropically distributed if its probability density is unchanged when it is pre-multiplied by any deterministic unitary matrix:

$$p(\Phi) = p(\Theta \Phi) \quad \text{for all } \Theta \text{ with } \Theta^\dagger \Theta = I.$$

Lemma 2.2 (singular value decomposition). Let $A \in \mathbb{C}^{m \times n}$ have rank $r$. Then there exist unitary matrices $U \in \mathbb{C}^{m \times m}$ and $V \in \mathbb{C}^{n \times n}$ such that

$$A = U \begin{bmatrix} \Sigma_r & 0 \\ 0 & 0 \end{bmatrix} V^\dagger, \qquad \Sigma_r = \operatorname{diag}(\sigma_1, \sigma_2, \dots, \sigma_r), \quad \sigma_1 \ge \sigma_2 \ge \dots \ge \sigma_r > 0.$$

Theorem 2.2. [1] The signal matrix that achieves capacity can be written as $S = \Phi V$, where $\Phi$ is a $T \times T$ isotropically distributed unitary matrix and $V$ is an independent $T \times M$ real, nonnegative, diagonal matrix. Furthermore, we can choose the joint density of the diagonal elements of $V$ to be unchanged by rearrangements of its arguments.

Proof. Combining the singular value decomposition of $S$ with the channel conditional density (1.1), one can show that the same capacity is achieved by a signal of the form $S = \Phi V$. Suppose a signal $S'$ has probability density $p(S')$ and generates mutual information $I_0$. Let $\Theta$ be an isotropically distributed unitary matrix, statistically independent of $\Phi$ and $V$, and define a new signal matrix $S = \Theta S'$, generating mutual information $I_1$. One can check that, conditioned on $\Theta$, the mutual information generated by $S$ equals $I_0$. The concavity of the mutual information as a function of $p(S)$ and Jensen's inequality then imply $I_1 \ge I_0$. The same argument can be applied to the distribution of the diagonal matrix.

Theorem 2.3. [5]

$$\lim_{\rho \to \infty} \frac{C(M, N, T; \rho)}{\log \rho} = T M^* \left(1 - \frac{M^*}{T}\right),$$

where $M^* = \min\{M, N, \lfloor T/2 \rfloor\}$.
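
To make Theorem 2.2 concrete, here is a short Python sketch (my own illustration) that builds a signal matrix of the form $S = \Phi V$: $\Phi$ is drawn Haar-uniform (hence isotropically distributed) via a QR decomposition with the standard phase correction, and $V$ is a nonnegative $T \times M$ diagonal matrix scaled to meet the power constraint. It also evaluates the high-SNR pre-log of Theorem 2.3 for the chosen dimensions, using the per-block normalization stated above; the parameter values and the distribution of the diagonal entries are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(3)
T, M, N = 8, 3, 4

def haar_unitary(T):
    """Isotropically distributed (Haar) T x T unitary matrix via QR of a complex Gaussian."""
    Z = (rng.standard_normal((T, T)) + 1j * rng.standard_normal((T, T))) / np.sqrt(2)
    Q, R = np.linalg.qr(Z)
    return Q * (np.diag(R) / np.abs(np.diag(R)))   # phase fix so Q is exactly Haar distributed

# V: T x M, real, nonnegative, "diagonal" (nonzero only on entries (i, i), i < M)
v = rng.uniform(0.5, 1.5, size=M)
v *= np.sqrt(T * M / np.sum(v ** 2))               # scale so tr(S S^H) = tr(V V^T) = T*M
V = np.zeros((T, M))
V[np.arange(M), np.arange(M)] = v

Phi = haar_unitary(T)
S = Phi @ V                                        # structure of Theorem 2.2
print("tr(S S^H) =", np.trace(S @ S.conj().T).real, " target:", T * M)

# High-SNR pre-log of Theorem 2.3 (per coherence block)
M_star = min(M, N, T // 2)
print("pre-log T*M*(1 - M*/T) =", T * M_star * (1 - M_star / T))
```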

3 Capacity of the Multiuser Channel

From [3] we conclude that, if perfect CSI were available to the receiver, the components of $S$ would be independent $\mathcal{CN}(0,1)$ and the capacity would increase monotonically with $\min\{M, N\}$. With perfect CSI at the receiver, this single-user capacity could be achieved by $M$ independent users. The absence of CSI drastically changes the problem: the total $M$-user capacity is generally less than the single-user, $M$-antenna capacity.

Conjecture 3.1. [2] For the multiuser antenna channel, the total capacity for any $M > T$ is equal to the total capacity for $M = T$.

Supporting case 1: $T = 1$. According to Theorem 2.1, for the special case $T = 1$ a single user having $M$ antennas can attain the same performance with a single antenna, so the total capacity for $M$ users is equal to the capacity for one user.

Supporting case 2: large $T, M$. By the chain rule,

$$I(X; S) = I(X; S, H) - I(X; H \mid S) \le I(X; S, H).$$

Again by the chain rule,

$$I(X; S, H) = I(X; S \mid H) + I(X; H) \ge I(X; S \mid H).$$

The combination of the above inequalities gives

$$I(X; S \mid H) - I(X; H \mid S) \le I(X; S) \le I(X; S, H).$$

For the upper bound we have

$$I(X; S) \le I(X, H; S) = I(H; S) + I(X; S \mid H) = I(X; S \mid H) \le I(X; S_G \mid H) = T\, E\{\log\det(I_N + \tfrac{\rho}{M} H^\dagger H)\} \le T \log\det(I_N + \tfrac{\rho}{M} E\{H^\dagger H\}) = T N \log(1 + \rho),$$

where $S_G$ denotes a signal matrix with zero-mean, independent, unit-variance complex Gaussian components (note that $I(H; S) = 0$ because $S$ and $H$ are independent, and the last inequality is Jensen's).

The mutual information $I(X; H \mid S)$ corresponds to the fictitious situation of a user sending the signal $H$ through a random propagation matrix $S$ that is known at the receiver. This mutual information is therefore maximized by making the elements of $H$ complex Gaussian, which gives

$$I(X; H \mid S) \le N\, E\{\log\det(I_M + \tfrac{\rho}{M} S^\dagger S)\} \le N \log\det(I_M + \tfrac{\rho}{M} E\{S^\dagger S\}) = N \log\det(I_M + \tfrac{\rho T}{M} I_M) = M N \log\left(1 + \frac{\rho T}{M}\right).$$

We now fix $N$ and simultaneously let $T$ and $M$ become large while maintaining a constant ratio $\beta = T/M$. In this regime $H^\dagger H / M \to I_N$ almost surely, and evaluating the lower bound at the Gaussian input $S_G$ gives

$$I(X; S_G) \ge I(X; S_G \mid H) - I(X; H \mid S_G) = T\, E\{\log\det(I_N + \tfrac{\rho}{M} H^\dagger H)\} - I(X; H \mid S_G),$$

which, as $M \to \infty$ with $\beta$ fixed, behaves like $T N \log(1+\rho) - \frac{T N}{\beta}\log(1+\rho\beta)$. Combining this with the upper bound gives, per symbol and in this limit,

$$N \log(1+\rho) - \frac{N}{\beta} \log(1+\rho\beta) \;\le\; \frac{I(X; S_G)}{T} \;\le\; N \log(1+\rho).$$

Finally, letting the ratio $\beta = T/M$ grow large yields the asymptotic result

$$\lim_{\beta \to \infty} \frac{I(X; S)}{T} = N \log(1+\rho).$$

The expression $N \log(1+\rho)$ is equal to the capacity of a single user having an unlimited number of transmit antennas, whose receiver has perfect knowledge of the propagation matrix. We have shown that this same capacity can collectively be attained by $M$ independent users, with no CSI available to anyone, in the limit as $T$ and $M$ become large with $T/M \to \infty$; and the capacity cannot be increased by having $M > T$ users.
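
As a sanity check on this sandwich (my own illustration, parameter values arbitrary), the following Python snippet evaluates the per-symbol lower bound $N\log(1+\rho) - \frac{N}{\beta}\log(1+\rho\beta)$ against the upper bound $N\log(1+\rho)$ for growing $\beta$, showing the gap closing.

```python
import numpy as np

rho, N = 10.0, 4
upper = N * np.log(1 + rho)                    # per-symbol upper bound, N log(1 + rho)

for beta in [1, 2, 5, 10, 50, 200, 1000]:      # beta = T / M
    lower = N * np.log(1 + rho) - (N / beta) * np.log(1 + rho * beta)
    print(f"beta = {beta:5d}: lower = {lower:7.4f}, upper = {upper:7.4f}, gap = {upper - lower:.4f}")
```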

Supporting case 3: large $\rho$. Consider the capacity of the non-coherent channel as described by Theorem 2.3. When $N = 1$ (and $T \ge 2$) we have $M^* = \min\{M, 1, \lfloor T/2 \rfloor\} = 1$ regardless of $M$, so a single operating transmitter already yields the best possible asymptotic behaviour with respect to $\rho$.

4 Vector Multiple-Access Channels and Water-Filling [4]

A baseband model for a synchronous multiple-access antenna-array channel is

$$\mathbf{y}(n) = d(N) \sum_{i=1}^{K} x_i(n)\, h^s_i(n)\, \mathbf{h}^f_i(n) + \mathbf{w}(n).$$

Here $n$ denotes the time of the channel use, $x_i(n)$ is the transmitted symbol of user $i$ at time $n$, and $\mathbf{y}(n)$ is the $N$-dimensional vector of received symbols at the $N$ antenna elements of the receiver array. The vector $h^s_i(n)\, \mathbf{h}^f_i(n)$ represents the channel from the $i$th user to the antenna array at time $n$: the scalar $h^s_i(n)$ captures the slowly varying component of the fading channel, and the vector $\mathbf{h}^f_i(n)$ is its fast-varying component. We assume that $\{h^s_i(n)\}_n$ and $\{\mathbf{h}^f_i(n)\}_n$ are independent, complex, stationary and ergodic processes and are perfectly known. The sum capacity of the multiple-antenna MAC can be described as

$$C_{\mathrm{opt}} = \sup_{\mathcal{P}} C_{\mathrm{sum}}(\mathcal{P}) = \sup_{\mathcal{P}} E\left[\frac{1}{2N} \log\det\left(I + \frac{d^2(N)\, N}{\sigma^2} \sum_{i=1}^{K} h^s_i\, P_i(h^s_1, \dots, h^s_K, \mathbf{S})\, \mathbf{s}_i \mathbf{s}_i^\dagger\right)\right],$$

where the power-allocation policy $\mathcal{P}$ depends on both the slow-fading components $h^s_1, \dots, h^s_K$ and the fast-varying components collected in $\mathbf{S} = \frac{1}{\sqrt{N}}\,[\mathbf{h}^f_1, \dots, \mathbf{h}^f_K]$ with columns $\mathbf{s}_i$. The computation of the optimal policy can be cast as a constrained optimization problem; analyzing its Kuhn-Tucker points yields the optimal solution, which is a water-filling policy. The main result of [4] is that a water-filling power policy that depends only on the slow-fading component of the channel is asymptotically optimal, where the asymptotics are in the number of users and the correspondingly large number of antennas at the receiver.
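
Since the optimal policy is of water-filling type, a generic water-filling routine helps make the idea concrete. The sketch below is my own illustration, not the exact policy of [4]: it allocates a total power budget across parallel channel gains by the standard water-level bisection, and in the spirit of the main result the gains passed in would be functions of the slow-fading components only. The function name, gains, and budget are hypothetical.

```python
import numpy as np

def water_fill(gains, P_total, tol=1e-9):
    """Classic water-filling: maximize sum(log(1 + g_i * p_i)) s.t. sum(p_i) = P_total, p_i >= 0.
    The optimum is p_i = max(0, mu - 1/g_i) for a water level mu found here by bisection."""
    gains = np.asarray(gains, dtype=float)
    lo, hi = 0.0, P_total + 1.0 / gains.min()      # bracket for the water level mu
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        used = np.maximum(0.0, mu - 1.0 / gains).sum()
        lo, hi = (mu, hi) if used < P_total else (lo, mu)
    return np.maximum(0.0, 0.5 * (lo + hi) - 1.0 / gains)

# Example: allocate power over slow-fading gains only (spirit of the asymptotic result in [4])
slow_gains = np.array([2.0, 1.0, 0.5, 0.1])
p = water_fill(slow_gains, P_total=4.0)
print("powers:", np.round(p, 4), " sum =", round(p.sum(), 4))
print("rates :", np.round(np.log(1 + slow_gains * p), 4))
```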

References

[1] Thomas L. Marzetta and Bertrand M. Hochwald. Capacity of a mobile multiple-antenna communication link in Rayleigh flat fading. IEEE Trans. Inform. Theory, 45(1):139-157, 1999.

[2] Shlomo Shamai and Thomas L. Marzetta. Multiuser capacity in block fading with no channel state information. IEEE Trans. Inform. Theory, 48(4):938-942, 2002.

[3] İ. E. Telatar. Capacity of multi-antenna Gaussian channels. European Trans. Telecommun., 10(6):585-595, 1999.

[4] Pramod Viswanath, David N. C. Tse, and Venkat Anantharam. Asymptotically optimal water-filling in vector multiple-access channels. IEEE Trans. Inform. Theory, 47(1):241-267, 2001.

[5] L. Zheng and D. N. C. Tse. Communication on the Grassmann manifold: A geometric approach to the noncoherent multiple-antenna channel. IEEE Trans. Inform. Theory, 48(2):359-383, 2002.