Lecture 4: Capacity of Wireless Channels
I-Hsiang Wang, ihwang@ntu.edu.tw, 3/20, 2014
What we have learned
So far: looked at specific schemes and techniques
Lecture 2: point-to-point wireless channel
- Diversity: combat fading by exploiting inherent diversity
- Coding: combat noise, and further exploit degrees of freedom
Lecture 3: cellular system
- Multiple access: TDMA, CDMA, OFDMA
- Interference management: orthogonalization (partial frequency reuse), treat-interference-as-noise (interference averaging)
Information Theory
Is there a framework to
- Compare all schemes and techniques fairly?
- Assert the fundamental limit on how much rate can be reliably delivered over a wireless channel?
Information theory!
- Provides a fundamental limit to (coded) performance
- Identifies the impact of channel resources on performance
- Suggests novel techniques to communicate over wireless channels
Information theory provides the basis for the modern development of wireless communication
Historical Perspective
G. Marconi (1901), C. Shannon (1948)
First radio built 100+ years ago: great strides in technology, but design was somewhat ad hoc
Information theory: every channel has a capacity
- Provides a systematic view of all communication problems
- Engineering meets science; new points of view arise
Modern View on Multipath Fading
[Figure: channel quality fluctuating over time]
Classical view: fading channels are unreliable
- Diversity techniques: average out the variation
Modern view: exploit fading to gain spectral efficiency
- Thanks to the study of fading channels through the lens of information theory!
Plot
Use a heuristic (geometric) argument to introduce the capacity of the AWGN channel
Discuss the two key resources in the AWGN channel:
- Power
- Bandwidth
The AWGN channel capacity serves as a building block towards fading channel capacity:
- Slow fading channel: outage capacity
- Fast fading channel: ergodic capacity
Outline
AWGN Channel Capacity
Resources of the AWGN Channel
Capacity of Some LTI Gaussian Channels
Capacity of Fading Channels
AWGN Channel Capacity
Channel Capacity
Capacity := the highest data rate that can be delivered reliably over a channel
- "Reliably": vanishing error probability
Before Shannon, it was widely believed that:
- to communicate with error probability → 0,
- the data rate must also → 0
Repetition coding (with M-level PAM) over N time slots on the AWGN channel:
- Error probability ≈ Q(√(3N·SNR/(M² − 1)))
- Data rate = (log₂ M)/N bits per symbol time
- As long as M ≤ N^(1/3), the error probability → 0 as N → ∞
- But then the data rate = (log₂ N)/(3N) still → 0 as N → ∞
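The tradeoff above can be checked numerically. A minimal sketch, following the slide's repetition-coding argument with M = ⌊N^(1/3)⌋ PAM levels (the function names are my own):

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def repetition_pam(N, snr):
    """Error probability and data rate of N-fold repetition of M-level PAM,
    with M = floor(N**(1/3)) as on the slide (but at least 2 levels)."""
    M = max(2, int(N ** (1 / 3)))
    p_err = q_func(math.sqrt(3 * N * snr / (M ** 2 - 1)))  # adjacent-level error
    rate = math.log2(M) / N                                # bits per symbol time
    return p_err, rate

for N in (10, 1000, 100000):
    p, r = repetition_pam(N, snr=1.0)
    print(N, p, r)
```

Both the error probability and the rate shrink as N grows, illustrating why repetition coding cannot give reliable communication at a positive rate.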
Channel Coding Theorem
For every memoryless channel, there is a definite, computable number C such that:
- If the data rate R < C, then there exists a coding scheme that can deliver rate-R data over the channel with error probability → 0 as the coding block length N → ∞
- Conversely, if the data rate R > C, then no matter what coding scheme is used, the error probability → 1 as N → ∞
We shall focus on the additive white Gaussian noise (AWGN) channel
- Give a heuristic argument to derive the AWGN channel capacity
AWGN Channel
y[n] = x[n] + z[n], z[n] ~ N(0, σ²)
Power constraint: Σ_{n=1}^{N} x[n]² ≤ NP
We consider the real-valued Gaussian channel.
As mentioned earlier, repetition coding yields zero rate if the error probability is required to vanish as N → ∞, because all codewords are spread along a single dimension in an N-dimensional space.
How to do better?
Sphere Packing Interpretation
y = x + z ∈ ℝ^N
By the law of large numbers, as N → ∞, most y will lie inside the N-dimensional sphere of radius √(N(P + σ²))
Also by the LLN, as N → ∞, y will lie near the surface of the N-dimensional sphere centered at x with radius √(Nσ²)
Vanishing error probability ⟺ non-overlapping spheres
How many non-overlapping spheres can be packed into the large sphere?
Why Repetition Coding is Bad
y = x + z ∈ ℝ^N: it only uses one dimension out of N!
Capacity Upper Bound
Maximum # of non-overlapping spheres = maximum # of codewords that can be reliably delivered:
2^(NR) ≤ (√(N(P + σ²)))^N / (√(Nσ²))^N
⟹ R ≤ (1/N) log₂[(√(N(P + σ²)))^N / (√(Nσ²))^N] = (1/2) log₂(1 + P/σ²)
This is hence an upper bound on the capacity C. How to achieve it?
Achieving Capacity (1/3)
(Random) encoding: randomly generate 2^(NR) codewords {x₁, x₂, ...} lying inside the x-sphere of radius √(NP)
Decoding: y → MMSE scaling α → αy → nearest neighbor → x̂
Performance analysis: WLOG assume x₁ is sent; let α := P/(P + σ²)
- By the LLN, ‖αy − x₁‖² = ‖αz − (1 − α)x₁‖² ≈ α²Nσ² + (1 − α)²NP = N·Pσ²/(P + σ²)
- As long as αy lies inside the uncertainty sphere centered at x₁ with radius √(N·Pσ²/(P + σ²)), decoding will be correct
- Pairwise error probability (see next slide) = (σ²/(P + σ²))^(N/2)
Achieving Capacity (2/3)
When does an error occur?
Ans: when another codeword falls inside the uncertainty sphere of x₁ (radius √(N·Pσ²/(P + σ²))) within the x-sphere of radius √(NP)
What is that probability (pairwise error probability)?
Ans: the ratio of the volumes of the two spheres:
Pr{x₁ → x₂} = (√(N·Pσ²/(P + σ²)) / √(NP))^N = (σ²/(P + σ²))^(N/2)
Union bound: total error probability ≤ 2^(NR) · (σ²/(P + σ²))^(N/2)
Achieving Capacity (3/3)
Total error probability (by union bound):
Pr{E} ≤ 2^(NR) · (σ²/(P + σ²))^(N/2) = 2^(−N((1/2) log₂(1 + P/σ²) − R))
As long as R < (1/2) log₂(1 + P/σ²), Pr{E} → 0 as N → ∞
Hence the capacity is indeed C = (1/2) log₂(1 + P/σ²) bits per symbol time
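The union-bound exponent above can be evaluated directly: the bound equals 2^(−N(C − R)), which vanishes exactly when R < C. A small sketch (function names and sample numbers are my own):

```python
import math

def awgn_capacity(snr):
    """Real AWGN channel capacity in bits per symbol: C = (1/2) log2(1 + SNR)."""
    return 0.5 * math.log2(1 + snr)

def union_bound(N, R, snr):
    """Union bound 2^{NR} * (sigma^2/(P+sigma^2))^{N/2} = 2^{-N(C-R)}
    on the error probability of the random code."""
    C = awgn_capacity(snr)
    return 2.0 ** (-N * (C - R))

snr = 10.0                                  # P / sigma^2
C = awgn_capacity(snr)
print(C)                                    # about 1.73 bits per symbol
print(union_bound(100, 0.9 * C, snr))       # R < C: bound decays exponentially
print(union_bound(100, 1.1 * C, snr))       # R > C: bound exceeds 1, says nothing
```

Doubling the block length N squares the bound when R < C, matching the exponential decay claimed on the slide.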
Resources of the AWGN Channel
Continuous-Time AWGN Channel
System parameters:
- Power constraint: P watts; bandwidth: W Hz
- Spectral density of the white Gaussian noise: N₀/2
Equivalent discrete-time baseband channel (complex):
y[n] = x[n] + z[n], z[n] ~ CN(0, N₀W)
- 1 complex symbol = 2 real symbols
- Power constraint: Σ_{n=1}^{N} |x[n]|² ≤ NP
Capacity:
C_AWGN(P, W) = log₂(1 + P/(N₀W)) bits per symbol time, where SNR := P/(N₀W) is the SNR per complex symbol
= W log₂(1 + P/(N₀W)) bits/s
= log₂(1 + SNR) bits/s/Hz
Complex AWGN Channel Capacity
C_AWGN(P, W) = W log₂(1 + P/(N₀W)) bits/s = log₂(1 + SNR) bits/s/Hz (spectral efficiency)
The capacity formula provides a high-level way of thinking about how the performance fundamentally depends on the basic resources available in the channel
- No need to go into details of specific coding and modulation schemes
Basic resources: power P and bandwidth W
Power
SNR = P/(N₀W). Fix W:
High SNR: C = log₂(1 + SNR) ≈ log₂ SNR
- Logarithmic growth with power
Low SNR: C = log₂(1 + SNR) ≈ SNR · log₂ e
- Linear growth with power
[Figure: log₂(1 + SNR) vs. SNR]
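The two approximations can be verified numerically; a small sketch with arbitrary sample SNR values:

```python
import math

def capacity(snr):
    """Spectral efficiency log2(1 + SNR) in bits/s/Hz."""
    return math.log2(1 + snr)

# High SNR: log2(1 + SNR) is close to log2(SNR), logarithmic in power
for snr in (100.0, 1000.0):
    print(snr, capacity(snr), math.log2(snr))

# Low SNR: log2(1 + SNR) is close to SNR * log2(e), linear in power
for snr in (0.01, 0.001):
    print(snr, capacity(snr), snr * math.log2(math.e))
```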
Bandwidth
Fix P:
C(W) = W log₂(1 + P/(N₀W)) ≈ W · (P/(N₀W)) log₂ e = (P/N₀) log₂ e as W → ∞
[Figure: C(W) in Mbps vs. bandwidth W in MHz, approaching the capacity limit (P/N₀) log₂ e as W → ∞; small W: bandwidth-limited region; large W: power-limited region]
Bandwidth-Limited vs. Power-Limited
SNR = P/(N₀W)
When SNR ≪ 1 (power-limited regime):
C_AWGN(P, W) = W log₂(1 + P/(N₀W)) ≈ W · (P/(N₀W)) log₂ e = (P/N₀) log₂ e
- Linear in power; insensitive to bandwidth
When SNR ≫ 1 (bandwidth-limited regime):
C_AWGN(P, W) ≈ W log₂(P/(N₀W)) bits/s
- Logarithmic in power; approximately linear in bandwidth
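Sweeping the bandwidth at fixed power shows the saturation toward the wideband limit (P/N₀) log₂ e. A sketch with hypothetical parameter values (1 μW of power, N₀ = 10⁻⁹ W/Hz):

```python
import math

def c_awgn(P, W, N0):
    """AWGN capacity C(P, W) = W log2(1 + P/(N0 W)) in bits/s."""
    return W * math.log2(1 + P / (N0 * W))

P, N0 = 1e-6, 1e-9                       # hypothetical power and noise density
limit = (P / N0) * math.log2(math.e)     # wideband capacity limit, bits/s
for W in (1e2, 1e4, 1e6, 1e8):
    print(W, c_awgn(P, W, N0))
print(limit)
```

Capacity increases monotonically in W but never exceeds the limit; by W = 100 MHz it is already within a fraction of a percent of (P/N₀) log₂ e.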
Capacity of Some LTI Gaussian Channels
SIMO Channel
y = hx + w, h ∈ ℂ^L, w ~ CN(0, σ²I_L); power constraint: P
MRC: ỹ = (h*/‖h‖)y = ‖h‖x + w̃, w̃ ~ CN(0, σ²)
MRC is a lossless operation: we can regenerate y from ỹ: y = ỹ·(h/‖h‖)
Hence the SIMO channel capacity is equal to the capacity of the equivalent AWGN channel, which is
C_SIMO = log₂(1 + ‖h‖²P/σ²)
Power gain ‖h‖² due to Rx beamforming
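As a numerical illustration, the MRC power gain ‖h‖² lifts the capacity above the single-antenna baseline; the 4-antenna channel vector below is an arbitrary example, not from the slides:

```python
import numpy as np

# Hypothetical 4-antenna receive channel, unit power and noise (illustrative)
h = np.array([1 + 1j, 2.0, 0.5j, 1.0])
P, sigma2 = 1.0, 1.0

# MRC projects y = h x + w onto h/||h||, giving a scalar AWGN channel with
# signal amplitude ||h|| x and noise variance sigma2 (the projection is unitary)
g = np.linalg.norm(h) ** 2                   # Rx beamforming power gain ||h||^2
c_simo = np.log2(1 + g * P / sigma2)         # capacity of the equivalent channel
c_siso = np.log2(1 + P / sigma2)             # single-antenna baseline
print(g, c_simo, c_siso)
```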
MISO Channel
y = h*x + w, x, h ∈ ℂ^L, h = (h₁, ..., h_L)
Power constraint: Σ_{n=1}^{N} ‖x[n]‖² ≤ NP
Goal: maximize the received signal power |h*x|²
- The answer is ‖h‖²P (check! Hint: Cauchy–Schwarz inequality)
Achieved by Tx beamforming: x = x·h/‖h‖
- Send a scalar symbol x in the direction of h
- Power constraint on x: still P
Capacity: C_MISO = log₂(1 + ‖h‖²P/σ²)
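The Cauchy–Schwarz claim can be spot-checked: beamforming along h/‖h‖ delivers received power ‖h‖²P, and any other unit-norm direction delivers less. The channel vector and the competing direction below are arbitrary examples of mine:

```python
import numpy as np

h = np.array([1 + 1j, 2.0, 0.5j, 1.0])   # hypothetical channel vector
P = 1.0

# Tx beamforming: transmit x * h/||h||, so the received amplitude is ||h|| x
bf = h / np.linalg.norm(h)
rx_bf = abs(np.vdot(h, bf)) ** 2 * P      # = ||h||^2 P, the Cauchy-Schwarz maximum

# A competing unit-norm direction (equal power on all antennas) does worse
v = np.ones(4) / 2.0                      # ||v|| = 1
rx_v = abs(np.vdot(h, v)) ** 2 * P
print(rx_bf, rx_v)
```

Note that `np.vdot` conjugates its first argument, matching the inner product h*x on the slide.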
Frequency-Selective Channel
y[m] = Σ_{l=0}^{L−1} h_l x[m − l] + w[m]
Key idea 1: use OFDM to convert the channel with ISI into a bunch of parallel AWGN channels
- But there is loss/overhead due to the cyclic prefix
Key idea 2: the CP overhead → 0 as N_c → ∞
- First focus on finding the capacity of parallel AWGN channels for any finite N_c
- Then take N_c → ∞ to find the capacity of the frequency-selective channel
Recap: OFDM
[Block diagram: data symbols d̃₀, ..., d̃_{N_c−1} → IDFT → d[0], ..., d[N_c − 1] → add cyclic prefix (x[1 : L − 1] = d[N_c − L + 1 : N_c − 1], x[L : N_c + L − 1] = d[0 : N_c − 1]) → channel → remove prefix → DFT → ỹ₀, ..., ỹ_{N_c−1}]
Let y := y[L : N_c + L − 1], w := w[L : N_c + L − 1], h := (h₀, h₁, ..., h_{L−1}, 0, ..., 0)^T
ỹ_n = h̃_n d̃_n + w̃_n, n = 0, 1, ..., N_c − 1 ⟹ N_c parallel AWGN channels
where ỹ_n := DFT(y)_n, d̃_n := DFT(d)_n, w̃_n := DFT(w)_n, h̃_n := √N_c · DFT(h)_n
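The key fact behind this slide, that the cyclic prefix turns the ISI channel into a circular convolution which the DFT diagonalizes, can be verified directly. A sketch (random taps and data are arbitrary; note that `numpy.fft` uses the non-unitary DFT, whereas the slide's h̃_n carries an extra √N_c factor from the unitary convention):

```python
import numpy as np

rng = np.random.default_rng(2)
Nc, L = 8, 3
h = rng.standard_normal(L)        # channel taps h_0, ..., h_{L-1}
d = rng.standard_normal(Nc)       # one OFDM symbol in the time domain

# With a cyclic prefix, the channel acts as a circular convolution:
y = np.array([sum(h[l] * d[(m - l) % Nc] for l in range(L)) for m in range(Nc)])

# The DFT diagonalizes it: DFT(y)_n = DFT(h)_n * DFT(d)_n for every subcarrier n
h_pad = np.concatenate([h, np.zeros(Nc - L)])
assert np.allclose(np.fft.fft(y), np.fft.fft(h_pad) * np.fft.fft(d))
print("circular convolution diagonalized by the DFT")
```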
Parallel AWGN Channels
Parallel channels: ỹ_n[m] = h̃_n d̃_n[m] + w̃_n[m], n ∈ [0 : N_c − 1], m = 1, 2, ..., M (M channel uses)
Equivalent vector channel: ỹ = H̃d̃ + w̃, w̃ ~ CN(0, σ²I), H̃ = diag(h̃₀, ..., h̃_{N_c−1})
Power constraint: Σ_{m=1}^{M} ‖d̃[m]‖² ≤ MN_cP (due to the Parseval theorem of the DFT)
Independent Uses of Parallel Channels
One way to code over such parallel channels (a special case of a vector channel): treat each channel separately
- It turns out that coding across parallel channels does not help!
Power allocation:
- Each of the N_c channels gets a portion of the total power
- Channel n gets power P_n, which must satisfy Σ_{n=0}^{N_c−1} P_n ≤ N_cP
For a given power allocation {P_n}, the following rate can be achieved:
R = Σ_{n=0}^{N_c−1} log₂(1 + |h̃_n|²P_n/σ²)
Optimal Power Allocation
Power allocation problem:
max_{P₀,...,P_{N_c−1}} Σ_{n=0}^{N_c−1} log₂(1 + |h̃_n|²P_n/σ²)
subject to Σ_{n=0}^{N_c−1} P_n = N_cP, P_n ≥ 0, n = 0, ..., N_c − 1
It can be solved explicitly by Lagrangian methods!
Final solution: let (x)⁺ := max(x, 0); then
P*_n = (1/λ − σ²/|h̃_n|²)⁺, where λ satisfies Σ_{n=0}^{N_c−1} (1/λ − σ²/|h̃_n|²)⁺ = N_cP
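The water level 1/λ can be found numerically, e.g. by bisection; a sketch (the bisection approach and the sample gains are my own illustrative choices, not from the slides):

```python
import numpy as np

def waterfill(gains, total_power, sigma2=1.0):
    """Waterfilling: P_n = (mu - sigma2/|h_n|^2)^+ with sum P_n = total_power,
    solved by bisection on the water level mu = 1/lambda."""
    floors = sigma2 / np.abs(gains) ** 2          # noise-to-gain floors
    lo, hi = 0.0, floors.min() + total_power + 1.0  # mu is bracketed in [lo, hi]
    for _ in range(100):
        mu = (lo + hi) / 2
        if np.sum(np.maximum(mu - floors, 0.0)) > total_power:
            hi = mu                                # too much water: lower the level
        else:
            lo = mu                                # too little: raise the level
    powers = np.maximum(mu - floors, 0.0)
    rate = np.sum(np.log2(1 + np.abs(gains) ** 2 * powers / sigma2))
    return powers, rate

gains = np.array([2.0, 1.0, 0.5, 0.1])            # sample subchannel gains
powers, rate = waterfill(gains, total_power=4.0)
print(powers, powers.sum(), rate)
```

With these gains, the two strong subchannels absorb all the power and the weakest one gets none, and the resulting sum rate beats a uniform allocation.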
Waterfilling
[Figure: water level 1/λ poured over the per-subcarrier floor σ²/|h̃_n|²; the fill depth above subcarrier n is P*_n, and subcarriers whose floor exceeds the water level get P*_n = 0]
Note: h̃_n = H_b(nW/N_c), the baseband frequency response at f = nW/N_c
Frequency-Selective Channel Capacity
Final step: take N_c → ∞
- Replace all h̃_n = H_b(nW/N_c) by H(f); summation over [0 : N_c − 1] becomes integration from 0 to W
Power allocation problem becomes:
max_{P(·)} ∫₀^W log₂(1 + |H(f)|²P(f)/N₀) df
subject to ∫₀^W P(f) df = P, P(f) ≥ 0, ∀f ∈ [0, W]
Optimal solution:
P*(f) = (1/λ − N₀/|H(f)|²)⁺, where λ satisfies ∫₀^W (1/λ − N₀/|H(f)|²)⁺ df = P
Waterfilling over the Frequency Spectrum
[Figure: channel gain |H(f)|² and optimal allocation P*(f) vs. frequency f ∈ [−0.4W, 0.4W]: power is poured into the frequencies where the channel is strong]