Lecture 15: Thu Feb 28, 2019
Announce: HW5 posted
Lecture: The AWGN waveform channel
- Projecting temporally AWGN leads to spatially AWGN
- Sufficiency of projection: irrelevancy theorem
- In waveform AWGN: ML = min-distance
- Power-vs-bandwidth: introduction
- Shannon capacity
368
M-ary Detection in AWGN
Transmitter sends s(t) ∈ {s_1(t), ..., s_M(t)}.
Receiver observes r(t) = s(t) + n(t).
[Figure: candidate waveforms s_1(t), s_2(t), s_3(t); the selected s(t) passes through an AWGN channel to produce r(t)]
What's new: assume n(t) is white and Gaussian with PSD S_n(f) = N0/2.
369
AWGN = additive white Gaussian noise
PSD: S_n(f) = N0/2 for all f.
Autocorrelation: E[n(t)n(t+τ)] = (N0/2)δ(τ).
[Figure: flat PSD at height N0/2]
370
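As a quick numerical aside (not from the slides): in discrete time, the analogue of white noise with PSD N0/2 is an i.i.d. N(0, N0/2) sample sequence, whose sample autocorrelation is approximately N0/2 at lag 0 and 0 at every other lag. A minimal sketch, with an assumed N0 = 2:

```python
import numpy as np

# Discrete-time analogue of white Gaussian noise with PSD N0/2:
# i.i.d. samples of variance N0/2 (sampling interval normalized to 1).
rng = np.random.default_rng(0)
N0 = 2.0
n = rng.normal(0.0, np.sqrt(N0 / 2), size=200_000)

# Empirical autocorrelation at a few lags: ~N0/2 at lag 0, ~0 elsewhere.
def autocorr(x, lag):
    return np.mean(x[: len(x) - lag] * x[lag:]) if lag else np.mean(x * x)

print(autocorr(n, 0))   # close to N0/2 = 1.0
print(autocorr(n, 1))   # close to 0
print(autocorr(n, 5))   # close to 0
```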
Is This White Noise?
[Figure: a sample path n(t) versus t]
371
Is This White Noise?
[Figure: another sample path n(t) versus t]
372
This is White Noise
[Figure: a sample path n(t) versus t]
373
Irrelevancy Theorem: Projection onto S Yields Sufficient Statistics
[Figure: r(t) = s(t) + n(t), with n(t) AWGN of PSD S_n(f) = N0/2, is correlated against each basis function φ_1(t), ..., φ_N(t) and sampled at t = 0 to produce r_1, ..., r_N]
374
Equivalent Vector Channel
The received vector is r = s + n, where n = [n_1, ..., n_N]^T and
  n_k = ⟨n(t), φ_k(t)⟩ = ∫ n(t)φ_k(t) dt.
Projection onto signal space transforms the waveform channel into a vector channel.
375
Statistics of the Noise Vector
Fact: When the following three conditions are met:
- n(t) is white and Gaussian with PSD N0/2
- {φ_1(t), ..., φ_N(t)} are orthonormal
- n_i = ⟨n(t), φ_i(t)⟩
then {n_1, ..., n_N} are i.i.d. N(0, N0/2).
Temporally white Gaussian noise leads to a spatially white Gaussian noise vector.
376
Proof
1. E(n_i) = E[∫ n(t)φ_i(t) dt] = ∫ E[n(t)]φ_i(t) dt = 0.
2. E(n_i n_j) = E[∫ n(t)φ_i(t) dt ∫ n(τ)φ_j(τ) dτ]
             = ∫∫ E[n(t)n(τ)] φ_i(t)φ_j(τ) dt dτ
             = (N0/2) ∫∫ δ(τ − t) φ_i(t)φ_j(τ) dt dτ
             = (N0/2) ∫ φ_i(t)φ_j(t) dt
             = (N0/2) δ_{i,j}.
3. Inner products are linear ⇒ jointly Gaussian.
4. Uncorrelated and jointly Gaussian ⇒ independent.
377
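The proof can also be checked numerically. A sketch under assumptions not in the slides: a grid approximation of [0, 1), the orthonormal set √2·sin(πkt) for k = 1, 2, 3, and N0 = 2. Projecting discrete white noise onto this basis should give an empirical covariance close to (N0/2)·I.

```python
import numpy as np

# Numerical check: projecting white Gaussian noise onto an orthonormal
# set yields uncorrelated Gaussians of variance N0/2. On a grid of step
# dt, white noise of PSD N0/2 has per-sample variance (N0/2)/dt.
rng = np.random.default_rng(1)
N0, dt = 2.0, 2e-3
t = np.arange(0.0, 1.0, dt)

# Orthonormal functions on [0, 1): phi_k(t) = sqrt(2) sin(pi k t).
phis = np.array([np.sqrt(2) * np.sin(np.pi * k * t) for k in (1, 2, 3)])

trials = 10_000
n = rng.normal(0.0, np.sqrt(N0 / (2 * dt)), size=(trials, len(t)))
proj = n @ phis.T * dt          # n_i = ∫ n(t) phi_i(t) dt, per trial

print(np.cov(proj.T).round(2))  # ≈ (N0/2) I = diag(1, 1, 1)
```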
Irrelevancy Theorem: Projecting onto S is Sufficient
Write r(t) = s(t) + n(t) = s(t) + n̂(t) + ñ(t),
where n̂(t) = Σ_{j=1}^{N} n_j φ_j(t) and ñ(t) = n(t) − n̂(t) is the noise projection error:
- It is independent of the transmitted signal.
- It is independent of each n_i:
  E[ñ(t) n_i] = E[(n(t) − Σ_{j=1}^{N} n_j φ_j(t)) n_i]
             = E[n(t) ∫ n(τ)φ_i(τ) dτ] − Σ_{j=1}^{N} E[n_i n_j] φ_j(t)
             = (N0/2) φ_i(t) − (N0/2) φ_i(t) = 0.
⇒ ñ(t) is irrelevant: P(s_i | r, ñ(t)) = P(s_i | r).
378
Geometric Picture
[Figure: r(t) decomposed as r̂(t) ∈ S plus ñ(t) ⊥ S, with s_m(t) and n̂(t) lying in the signal space S]
ñ(t) is irrelevant; r̂(t) provides sufficient statistics.
379
ML for AWGN Waveform Channel
Projection onto signal space produces sufficient statistics:
  r = s + n, where {n_1, ..., n_N} are i.i.d. N(0, N0/2).
The ML detector chooses s_i ∈ {s_1, ..., s_M} to maximize
  f(r | s_i) = (πN0)^{−N/2} exp(−‖r − s_i‖² / N0).
Equivalently, to minimize ‖r − s_i‖².
⇒ Min-distance is ML when the noise is AWGN, for the scalar channel, the vector channel, and the waveform channel.
380
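A numerical sketch of min-distance detection; the 4-point constellation and noise level below are illustrative choices, not from the lecture:

```python
import numpy as np

# Min-distance (= ML in AWGN) detection in signal space, N = 2 dimensions.
rng = np.random.default_rng(2)
N0 = 0.1
S = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], dtype=float)  # s_1..s_4

sent = rng.integers(0, 4, size=10_000)
r = S[sent] + rng.normal(0.0, np.sqrt(N0 / 2), size=(10_000, 2))

# ML decision: index minimizing ||r - s_i||^2 over the constellation.
dists = ((r[:, None, :] - S[None, :, :]) ** 2).sum(axis=2)
decided = dists.argmin(axis=1)

print("symbol error rate:", np.mean(decided != sent))  # near 0 at this N0
```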
Coming Next: Power vs Bandwidth 381
Communication Across a Bandlimited Noisy Channel
[Figure: s(t) passes through an ideal lowpass filter of bandwidth W, then AWGN with S_n = N0/2, producing r(t)]
Nyquist ⇒ can feed symbols x_k to a DAC (at rate 2W) to create s(t).
A suboptimal strategy: finite alphabet x_k ∈ A, independent and uniform, gives
  R_b = 2W log2 |A|.
Size of alphabet is limited by:
- target reliability, i.e. P_e
- transmit power
- noise power
382
Shannon Capacity
Better to avoid the independence assumption and code in blocks of length N:
  x = x_0 x_1 x_2 ... x_{N−1}
[Figure: x_k → unit-energy DAC → s(t) → ideal lowpass filter of bandwidth W → AWGN (S_n = N0/2) → r(t) → anti-aliasing filter → unit-energy ADC at sample rate 1/T = 2W → r_k]
N-dimensional vector channel: r = x + n.
Pop Quiz: How does the power constraint on s(t) translate to x?
383
Answer
Each x_k φ(t − kT) has energy x_k².
Starting with the definition of power, with 𝒯 = NT:
  P = lim_{𝒯→∞} (1/𝒯) ∫_{−𝒯/2}^{𝒯/2} s²(t) dt
    = lim_{N→∞} (1/(NT)) Σ_i E(x_i²)
    = E(x_i²)/T
(not surprising, since power is energy per unit time)
⇒ S_x = E(x_i²) = PT = P/(2W).
384
Noise Norm Becomes Deterministic!
As N → ∞, what happens to the norm of n = [n_0, n_1, n_2, ..., n_{N−1}]^T?
By the LLN, the squared norm
  ‖n‖² = Σ_i n_i² → N·S_n.
⇒ n lives on the surface of an N-dimensional sphere of radius √(N·S_n).
385–387
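The concentration is easy to see numerically. A sketch with an assumed N0 = 2 (so S_n = N0/2 = 1):

```python
import numpy as np

# Sketch: as N grows, ||n||^2 / N concentrates at S_n = N0/2 (LLN).
rng = np.random.default_rng(3)
N0 = 2.0                      # so S_n = 1.0
for N in (10, 1_000, 100_000):
    n = rng.normal(0.0, np.sqrt(N0 / 2), size=(100, N))
    norm2 = (n ** 2).sum(axis=1) / N        # ||n||^2 / N over 100 trials
    print(N, norm2.mean().round(3), norm2.std().round(3))
# The spread shrinks like 1/sqrt(N): the noise vector lands ever closer
# to the sphere of radius sqrt(N * S_n).
```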
Spheres in Large Dimensions
An N = 100-dimensional golden sphere is worth $1M. Would you rather have the inner sphere (radius r = 9) or the outer shell (from r = 9 to r = 10)?
  V_inner / V_outer = (c_N · 9^N) / (c_N · 10^N) = 0.9^N = 0.9^100 ⇒ $26.56
99.997% of the volume ($999,973.44) is in the outer shell!
388–389
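The arithmetic, as a two-line sketch:

```python
# Fraction of an N-ball's volume inside radius 9 out of 10: (9/10)^N.
N = 100
inner_fraction = (9 / 10) ** N
print(round(1_000_000 * inner_fraction, 2))   # ≈ 26.56 dollars inside
print(round(100 * (1 - inner_fraction), 3))   # ≈ 99.997 percent in the shell
```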
How Many Ping Pong Balls in a Beach Ball?
[Figure: "ping pong ball" of radius √(N·S_n) inside "beach ball" of radius √(N(S_x + S_n))]
390
Ratio of Volumes
  maximum #codewords = V_beach / V_pingpong
    = c_N (N(S_x + S_n))^{N/2} / (c_N (N·S_n)^{N/2})
    = (1 + S_x/S_n)^{N/2}.
Bit rate:
  R_b ≤ C = log2(#codewords) / (NT)
          = (1/(2T)) log2(1 + S_x/S_n)
          = W log2(1 + P/(N0·W)).
Spectral efficiency:
  R_b/W ≤ log2(1 + P/(N0·W)).
391
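A sketch evaluating C = W log2(1 + P/(N0·W)) for an assumed P/N0 (the value below is illustrative); it also shows the W → ∞ limit (P/N0)·log2(e), previewing the power-versus-bandwidth trade-off:

```python
import math

# C = W log2(1 + P/(N0 W)); with P/N0 fixed, capacity grows with W but
# saturates at (P/N0) * log2(e).
P_over_N0 = 1e4                 # hypothetical P/N0, in Hz
for W in (1e3, 1e4, 1e5, 1e6):
    C = W * math.log2(1 + P_over_N0 / W)
    print(f"W = {W:>9.0f} Hz   C = {C:,.0f} bit/s")
limit = P_over_N0 * math.log2(math.e)
print("limit:", round(limit))   # ≈ 14427 bit/s
```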
Shannon Capacity
From R_b = W log2(1 + P/(N0·W)):
  P/(N0·W) = 2^{R_b/W} − 1
  E_b/N0 = P/(N0·R_b) = (2^{R_b/W} − 1)/(R_b/W) = Shannon limit on E_b/N0.
E.g.:
  E_b/N0 = 1 = 0 dB for 1 bps/Hz;
  E_b/N0 = 102.3 = 20.1 dB for 10 bps/Hz;
  E_b/N0 = ln 2 = −1.59 dB as R_b/W → 0 bps/Hz.
Interpretations:
  W/R_b = normalized bandwidth requirement
  R_b/W = spectral efficiency [bps/Hz]
392
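The three example points can be reproduced directly; a small sketch:

```python
import math

# Shannon limit on Eb/N0 as a function of spectral efficiency rho = Rb/W.
def ebn0_min_db(rho):
    """(2^rho - 1)/rho, in dB; as rho -> 0 this tends to ln 2 = -1.59 dB."""
    return 10 * math.log10((2 ** rho - 1) / rho)

print(ebn0_min_db(1))      # 0.0 dB for 1 bps/Hz
print(ebn0_min_db(10))     # ≈ 20.1 dB for 10 bps/Hz
print(ebn0_min_db(1e-9))   # ≈ -1.59 dB, approaching ln 2
```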
Power versus Bandwidth Trade-Off
[Figure: E_b/N0 (dB, 0 to 40) versus W/R_b (0 to 1); the SHANNON LIMIT curve bounds the achievable region from below]
393