Lecture 15: Thu Feb 28, 2019

Announce: HW5 posted

Lecture: The AWGN waveform channel
- Projecting: temporally white AWGN leads to spatially white AWGN
- Sufficiency of projection: the irrelevancy theorem
- In waveform AWGN: ML = min-distance
- Power vs. bandwidth: introduction
- Shannon capacity

M-ary Detection in AWGN

Transmitter sends s(t) ∈ {s_1(t), ..., s_M(t)}; the receiver observes r(t) = s(t) + n(t).

[Figure: one of the signals s_1(t), s_2(t), s_3(t), ... is selected as s(t) and passes through an AWGN channel to produce r(t).]

What's new: Assume n(t) is white and Gaussian with PSD S_n(f) = N_0/2.

AWGN = Additive White Gaussian Noise

PSD: S_n(f) = N_0/2 for all f (flat).
Autocorrelation: E[n(t) n(t + τ)] = (N_0/2) δ(τ).

[Figure: flat PSD at height N_0/2 versus f; impulse autocorrelation at τ = 0.]

Is This White Noise?

[Figure: a candidate sample waveform n(t) versus t.]

Is This White Noise?

[Figure: a second candidate sample waveform n(t) versus t.]

This is White Noise

[Figure: a sample waveform n(t) versus t.]

Irrelevancy Theorem: Projection onto S Yields Sufficient Statistics

[Figure: s(t) passes through an AWGN channel (noise n(t) with S_n(f) = N_0/2) to give r(t); a bank of correlators projects r(t) onto each basis function φ_1(t), ..., φ_N(t), sampling at t = 0 to produce r_1, ..., r_N.]

Equivalent Vector Channel

The received vector is r = s + n, where n = [n_1, ..., n_N]^T and
n_k = ⟨n(t), φ_k(t)⟩ = ∫ n(t) φ_k(t) dt.

Projection onto the signal space transforms the waveform channel into a vector channel.

Statistics of the Noise Vector

Fact: When the following three conditions are met:
- n(t) is white and Gaussian with PSD N_0/2
- {φ_1(t), ..., φ_N(t)} are orthonormal
- n_i = ⟨n(t), φ_i(t)⟩

then {n_1, ..., n_N} are i.i.d. N(0, N_0/2).

Temporally white Gaussian noise leads to a spatially white Gaussian noise vector.

Proof

1. E(n_i) = E[∫ n(t) φ_i(t) dt] = ∫ E[n(t)] φ_i(t) dt = 0.
2. E(n_i n_j) = E[∫ n(t) φ_i(t) dt ∫ n(τ) φ_j(τ) dτ]
             = ∫∫ E[n(t) n(τ)] φ_i(t) φ_j(τ) dt dτ
             = (N_0/2) ∫∫ δ(τ − t) φ_i(t) φ_j(τ) dt dτ
             = (N_0/2) ∫ φ_i(t) φ_j(t) dt
             = (N_0/2) δ_{i,j}.
3. Inner products are linear ⇒ the n_i are jointly Gaussian.
4. Uncorrelated and jointly Gaussian ⇒ independent.
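A quick Monte Carlo sanity check of this fact. This is only a sketch: N_0, the observation window, the three-function sinusoidal basis, and the discrete-time surrogate for white noise are all illustrative assumptions, not part of the lecture.

```python
# Sketch: temporally white Gaussian noise projected onto an orthonormal basis
# yields (approximately) i.i.d. N(0, N0/2) coefficients.
import numpy as np

rng = np.random.default_rng(0)
N0 = 2.0                       # so the per-dimension variance N0/2 = 1
T, dt = 1.0, 1e-3              # observation window and time step (assumed)
t = np.arange(0, T, dt)

# Orthonormal functions on [0, T): sqrt(2/T) times sinusoids (assumed basis)
phi = np.stack([np.sqrt(2/T) * np.cos(2*np.pi*t/T),
                np.sqrt(2/T) * np.sin(2*np.pi*t/T),
                np.sqrt(2/T) * np.cos(4*np.pi*t/T)])

# Discrete surrogate for white noise: independent samples of variance (N0/2)/dt
trials = 5000
n = rng.normal(scale=np.sqrt((N0/2) / dt), size=(trials, t.size))

coeffs = (n @ phi.T) * dt      # Riemann-sum approximation of <n(t), phi_i(t)>
print(np.cov(coeffs, rowvar=False).round(2))   # ~ (N0/2) * I  (identity here)
```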

Irrelevancy Theorem: Projecting onto S is Sufficient

Write r(t) = s(t) + n(t) = s(t) + n̂(t) + ñ(t), where ñ(t) = n(t) − n̂(t) is the noise projection error:
- It is independent of the transmitted signal.
- It is independent of each n_i:
  E[ñ(t) n_i] = E[(n(t) − Σ_{j=1}^{N} n_j φ_j(t)) n_i]
              = E[n(t) ∫ n(τ) φ_i(τ) dτ] − Σ_{j=1}^{N} E[n_i n_j] φ_j(t)
              = (N_0/2) φ_i(t) − (N_0/2) φ_i(t) = 0.

Therefore ñ(t) is irrelevant: P(s_i | r, ñ(t)) = P(s_i | r).

Geometric Picture

[Figure: r(t) decomposed into its projection r̂(t) onto the signal space S (which contains s_m(t) and n̂(t)) plus the orthogonal noise component ñ(t).]

ñ(t) is irrelevant; r̂(t) provides sufficient statistics.

ML for the AWGN Waveform Channel

Projection onto the signal space produces sufficient statistics: r = s + n, where {n_1, ..., n_N} are i.i.d. N(0, N_0/2).

The ML detector chooses s_i ∈ {s_1, ..., s_M} to maximize
f(r | s_i) = (π N_0)^{−N/2} e^{−‖r − s_i‖² / N_0},
or equivalently to minimize ‖r − s_i‖².

Min-distance is ML when the noise is AWGN, for the scalar channel, the vector channel, and the waveform channel alike.
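A minimal sketch of this detector. The QPSK-like signal set, the noise level, and the helper name `ml_detect` are assumed for illustration and are not from the lecture.

```python
# ML detection in AWGN = choose the signal-space point closest to r.
import numpy as np

def ml_detect(r, signal_points):
    """Return the index of the signal point minimizing ||r - s_i||^2."""
    d2 = np.sum((signal_points - r) ** 2, axis=1)
    return int(np.argmin(d2))

# Example: 4 signal points in a 2-D signal space, noise variance N0/2 = 0.1
rng = np.random.default_rng(1)
S = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], dtype=float)
sent = 2
r = S[sent] + rng.normal(scale=np.sqrt(0.1), size=2)
print(ml_detect(r, S), "(sent:", sent, ")")
```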

Coming Next: Power vs. Bandwidth

Communication Across a Bandlimited Noisy Channel

[Figure: s(t) passes through an ideal lowpass channel of bandwidth W (|f| ≤ W) with AWGN of PSD S_n = N_0/2, producing r(t).]

Nyquist: we can feed symbols x_k to a DAC (at rate 2W) to create s(t).

A suboptimal strategy: finite alphabet x_k ∈ A, independent and uniform ⇒ R_b = 2W log_2 |A|.

The size of the alphabet is limited by:
- the target reliability, i.e. P_e
- the transmit power
- the noise power
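A quick numeric instantiation of the rate formula R_b = 2W log_2 |A|; the bandwidth and alphabet sizes below are assumed examples, not from the lecture.

```python
# Uncoded symbol rate of the suboptimal strategy: Rb = 2*W*log2(|A|)
import numpy as np

W = 1.0e6                     # channel bandwidth, Hz (assumed)
for M in (2, 4, 16, 64):      # alphabet size |A| (assumed)
    Rb = 2 * W * np.log2(M)
    print(f"|A| = {M:3d}:  Rb = {Rb / 1e6:.0f} Mbps")
```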

Shannon Capacity

Better: avoid the independence assumption and code in blocks of length N: x = [x_0, x_1, x_2, ..., x_{N−1}].

[Figure: x_k → unit-energy DAC → s(t) → bandlimited AWGN channel (bandwidth W, S_n = N_0/2) → r(t) → anti-aliasing filter and unit-energy ADC → r_k, with sample rate 1/T = 2W.]

This gives an N-dimensional vector channel r = x + n.

Pop quiz: How does a power constraint on s(t) translate to a constraint on x?

Answer

Each pulse x_k φ(t − kT) has energy x_k². Starting from the definition of power, with τ = NT:

P = lim_{τ→∞} (1/τ) ∫_{−τ/2}^{τ/2} s²(t) dt = lim_{N→∞} (1/(NT)) Σ_i E(x_i²) = E(x_i²)/T

(not surprising, since power is energy per unit time). Therefore

S_x ≜ E(x_i²) = PT = P/(2W).
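A quick numeric check of S_x = PT = P/(2W); the power and bandwidth values are illustrative assumptions, not from the lecture.

```python
# Energy per dimension implied by an average-power constraint (assumed numbers)
P = 1.0            # average transmit power, watts
W = 1.0e6          # channel bandwidth, Hz
T = 1 / (2 * W)    # Nyquist-rate sample period, seconds
S_x = P * T        # energy per dimension, joules: equals P/(2W)
print(T, S_x)      # 5e-07 s, 5e-07 J
```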

Noise Norm Becomes Deterministic!

As N → ∞, what happens to the norm of n = [n_0, n_1, n_2, ..., n_{N−1}]^T?

By the LLN, the squared norm ‖n‖² = Σ_i n_i² → N S_n, so n lives on the surface of an N-dimensional sphere of radius √(N S_n).
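A small simulation of this concentration effect; the per-dimension variance S_n and the block lengths are assumed for illustration.

```python
# LLN concentration: ||n||^2 / (N * S_n) -> 1 as the dimension N grows.
import numpy as np

rng = np.random.default_rng(2)
S_n = 0.5                                      # N0/2, variance per dimension (assumed)
for N in (10, 100, 10000):
    n = rng.normal(scale=np.sqrt(S_n), size=(1000, N))
    ratio = np.sum(n ** 2, axis=1) / (N * S_n)
    # mean stays ~1 while the spread shrinks roughly like 1/sqrt(N)
    print(N, round(float(ratio.mean()), 3), round(float(ratio.std()), 3))
```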

Spheres in Large Dimensions

An N = 100-dimensional golden sphere is worth $1M. Would you rather have the inner sphere (radius r = 9) or the outer shell (between r = 9 and r = 10)?

V_inner / V_outer = c_N 9^N / (c_N 10^N) = 0.9^N = 0.9^100 ≈ 2.66 × 10^{−5}, i.e. the inner sphere is worth only about $26.56.

99.997% of the volume ($999,973.44) is in the outer shell!
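The slide's dollar figures follow directly from the volume ratio; the two-line computation below just reproduces that arithmetic.

```python
# Fraction of a 100-dimensional ball's volume inside radius 9 out of 10
inner_fraction = 0.9 ** 100                   # (9/10)^N with N = 100
print(round(1e6 * inner_fraction, 2))         # ~ $26.56 in the inner sphere
print(round(1e6 * (1 - inner_fraction), 2))   # ~ $999,973.44 in the outer shell
```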

How Many Ping-Pong Balls Fit in a Beach Ball?

[Figure: small noise spheres of radius √(N S_n) packed inside a large sphere of radius √(N(S_x + S_n)).]

Ratio of Volumes

maximum #codewords ≤ V_beach / V_pingpong = c_N (N(S_x + S_n))^{N/2} / (c_N (N S_n)^{N/2}) = (1 + S_x/S_n)^{N/2}

Bit rate: R_b ≤ C = log_2(#codewords) / (NT) = (1/(2T)) log_2(1 + S_x/S_n) = W log_2(1 + P/(N_0 W)).

Spectral efficiency: R_b/W ≤ log_2(1 + P/(N_0 W)).
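A sketch that evaluates the resulting capacity formula; the power, noise density, and bandwidth values are assumed for illustration, not from the lecture.

```python
# Shannon capacity of a bandlimited AWGN channel: C = W*log2(1 + P/(N0*W))
import numpy as np

def awgn_capacity(P, N0, W):
    """Capacity in bits per second for average power P, noise PSD N0/2, bandwidth W."""
    return W * np.log2(1 + P / (N0 * W))

# Illustrative numbers: SNR = P/(N0*W) = 1000 (30 dB) over 1 MHz
print(awgn_capacity(P=1e-3, N0=1e-9, W=1e6))   # ~9.97e6 bps
```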

Shannon Capacity

From R_b = W log_2(1 + P/(N_0 W)):

P/(N_0 W) = 2^{R_b/W} − 1

E_b/N_0 = P/(N_0 R_b) = (2^{R_b/W} − 1)/(R_b/W) = Shannon limit on E_b/N_0.

E.g.:
- E_b/N_0 = 1 = 0 dB for 1 bps/Hz
- E_b/N_0 = 102.3 = 20.1 dB for 10 bps/Hz
- E_b/N_0 = ln 2 = −1.59 dB as R_b/W → 0 bps/Hz

Interpretations:
- W/R_b = normalized bandwidth requirement
- R_b/W = spectral efficiency [bps/Hz]
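A short sketch that reproduces the slide's Shannon-limit numbers; the helper name `shannon_limit_db` and the tiny spectral efficiency used to approximate the R_b/W → 0 limit are assumptions for illustration.

```python
# Minimum Eb/N0 (in dB) at spectral efficiency rho = Rb/W:  (2^rho - 1)/rho
import numpy as np

def shannon_limit_db(rho):
    """Shannon limit on Eb/N0 in dB for spectral efficiency rho in bps/Hz."""
    rho = np.asarray(rho, dtype=float)
    return 10 * np.log10((2 ** rho - 1) / rho)

# rho -> 0 approaches ln(2), i.e. -1.59 dB (approximated here by rho = 1e-6)
print(shannon_limit_db([1.0, 10.0, 1e-6]).round(2))   # ~ [ 0.   20.1  -1.59]
```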

Power versus Bandwidth Trade-Off

[Figure: E_b/N_0 (dB), from 0 to 40 dB, versus normalized bandwidth W/R_b, from 0 to 1, with the Shannon limit curve marked.]