Lecture 18: Gaussian Channel

Outline: Gaussian channel; Gaussian channel capacity
Dr. Yao Xie, ECE587, Information Theory, Duke University

[Figure: the Mona Lisa and a noisy version of it after transmission through an AWGN channel.]

Gaussian channel

- the most important continuous-alphabet channel: additive white Gaussian noise (AWGN),

  Y_i = X_i + Z_i,

  with noise Z_i ~ N(0, N) independent of X_i
- a model for many communication channels: satellite links, wireless phones
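A minimal NumPy sketch of this channel model; the values of P and N below are illustrative assumptions, not from the slides:

```python
import numpy as np

# One block of n uses of the AWGN channel Y_i = X_i + Z_i.
rng = np.random.default_rng(0)
n, P, N = 10_000, 1.0, 0.25          # blocklength, signal power, noise variance

X = rng.normal(0.0, np.sqrt(P), n)   # Gaussian input with E[X^2] = P
Z = rng.normal(0.0, np.sqrt(N), n)   # AWGN Z_i ~ N(0, N), independent of X
Y = X + Z                            # channel output

print(np.mean(X**2), np.mean(Z**2))  # empirical powers, close to P and N
```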

Channel capacity of AWGN

- intuition: C = log(number of distinguishable inputs)
- if N = 0, then C = ∞
- if there is no power constraint on the input, C = ∞ as well
- to make the problem meaningful, impose an average power constraint: every codeword (x_1, ..., x_n) must satisfy

  (1/n) sum_{i=1}^{n} x_i^2 <= P

Naive way of using the Gaussian channel

- binary phase-shift keying (BPSK): transmit 1 bit per channel use by mapping 1 -> +sqrt(P) and 0 -> -sqrt(P), so

  Y = ±sqrt(P) + Z

- probability of error: P_e = 1 - Φ(sqrt(P/N)), where Φ is the standard normal CDF,

  Φ(x) = ∫_{-∞}^{x} (1/sqrt(2π)) e^{-t²/2} dt

- hard decisions convert the Gaussian channel into a discrete BSC with crossover probability p = P_e; information is lost in the quantization
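A small sketch of this error probability, using the identity 1 - Φ(x) = erfc(x/√2)/2; the values of P and N are illustrative:

```python
import math

def bpsk_error_prob(P: float, N: float) -> float:
    """P_e = 1 - Phi(sqrt(P/N)), written via the complementary error function."""
    return 0.5 * math.erfc(math.sqrt(P / N) / math.sqrt(2.0))

print(bpsk_error_prob(4.0, 1.0))  # P/N = 4: P_e ~ 0.0228
```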

Definition: Gaussian channel capacity

  C = max_{f(x): E[X²] ≤ P} I(X; Y)

From this definition we can calculate

  C = (1/2) log(1 + P/N),

with the maximum attained when X ~ N(0, P).
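To make the quantization loss from the previous slide concrete, here is a sketch comparing the AWGN capacity with the capacity 1 - H(p) of the BSC induced by hard-decision BPSK, at an illustrative SNR of 0 dB:

```python
import math

def h2(p: float) -> float:
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

P, N = 1.0, 1.0                                        # illustrative: P/N = 1 (0 dB)
c_awgn = 0.5 * math.log2(1 + P / N)                    # 0.5 bits per channel use
p = 0.5 * math.erfc(math.sqrt(P / N) / math.sqrt(2))   # BPSK crossover probability
c_bsc = 1 - h2(p)                                      # ~0.37 bits per channel use
print(c_awgn, c_bsc)                                   # hard decisions cost capacity
```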

C as the maximum data rate

We can also show that this C is the supremum of rates achievable over the AWGN channel.

Definition: a rate R is achievable for the Gaussian channel with power constraint P if there exists a sequence of (2^{nR}, n) codes, with codewords satisfying the power constraint, whose maximal probability of error

  λ^(n) = max_{i ∈ {1, ..., 2^{nR}}} λ_i → 0 as n → ∞.

Sphere packing

Why should we be able to construct (2^{nC}, n) codes with a small probability of error?

- fix one codeword of length n; the received vector is distributed N(true codeword, N I)
- with high probability, the received vector is contained in a sphere of radius sqrt(n(N + ε)) around the true codeword
- assign each such ball to one codeword
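A quick Monte Carlo check of the concentration claim: the norm of an n-dimensional N(0, N I) noise vector concentrates near sqrt(nN). The sizes below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n, N, trials = 1000, 1.0, 500
Z = rng.normal(0.0, np.sqrt(N), size=(trials, n))  # independent noise vectors
radii = np.linalg.norm(Z, axis=1)                  # distance from the true codeword
print(radii.mean(), np.sqrt(n * N))                # empirical mean ~ sqrt(nN) ~ 31.6
print(radii.std())                                 # small spread: tight concentration
```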

Considering all codewords under the power constraint, with high probability the received vectors lie inside a sphere of radius sqrt(n(P + N)). The volume of an n-dimensional sphere of radius r is C_n r^n for some constant C_n. How many disjoint noise spheres can we pack into this larger sphere? At most

  C_n (n(P + N))^{n/2} / (C_n (nN)^{n/2}) = (1 + P/N)^{n/2}.

The rate of this codebook is log₂(number of codewords)/n = (1/2) log₂(1 + P/N).
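The counting argument in a few lines of arithmetic (values illustrative): the number of packable spheres grows as (1 + P/N)^{n/2}, so the rate per channel use is constant in n:

```python
import math

P, N, n = 1.0, 1.0, 100                      # illustrative power, noise, blocklength
count = (1 + P / N) ** (n / 2)               # max number of disjoint noise spheres
rate = math.log2(count) / n                  # bits per channel use
print(rate, 0.5 * math.log2(1 + P / N))      # both equal 0.5 for P/N = 1
```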

[Figure: sphere-packing illustration.]

Gaussian channel capacity theorem

Theorem. The capacity of a Gaussian channel with power constraint P and noise variance N is

  C = (1/2) log(1 + P/N) bits per transmission.

Proof: 1) achievability; 2) converse.

Achievability: what is new in the proof

- codeword elements are generated i.i.d. according to X_j(i) ~ N(0, P − ε), so that (1/n) sum_j X_j² ≈ P − ε < P
- the power-outage error event

  E_0 = { (1/n) sum_{j=1}^{n} X_j²(1) > P }

  has small probability by the law of large numbers

Converse: uses the fact that the Gaussian distribution has maximum entropy for a given variance.
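A Monte Carlo sketch of the power-outage probability under the random-coding distribution X_j ~ N(0, P − ε), showing it vanishes as n grows; the parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
P, eps, trials = 1.0, 0.1, 20_000
for n in (10, 100, 1000):
    X = rng.normal(0.0, np.sqrt(P - eps), size=(trials, n))  # random codewords
    outage = np.mean(np.mean(X**2, axis=1) > P)              # fraction over power P
    print(n, outage)                                         # shrinks with n (LLN)
```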

Band-limited channels

A more realistic channel model: band-limited continuous-time AWGN,

  Y(t) = (X(t) + Z(t)) * h(t),

where * denotes convolution with the band-limiting filter h(t). By the Nyquist-Shannon sampling theorem, sampling a band-limited signal every 1/(2W) seconds, i.e., at 2W samples per second (two samples per period of the highest frequency), is sufficient to reconstruct the signal from its samples.

Capacity of continuous-time band-limited AWGN

- noise has power spectral density N_0/2 watts/Hz over bandwidth W Hz, so the noise power is N_0 W
- signal power P watts
- 2W samples per second
- channel capacity

  C = W log(1 + P/(N_0 W)) bits per second

- as W → ∞, C → (P/N_0) log₂ e bits per second
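A sketch of C(W) = W log₂(1 + P/(N_0 W)) and its infinite-bandwidth limit (P/N_0) log₂ e; the signal power and noise density are illustrative:

```python
import math

P, N0 = 1.0, 1e-3                            # illustrative signal power, noise PSD

def capacity_bps(W: float) -> float:
    """Band-limited AWGN capacity in bits per second."""
    return W * math.log2(1 + P / (N0 * W))

for W in (1e2, 1e3, 1e4, 1e6):
    print(W, capacity_bps(W))                # increasing in W ...
print((P / N0) * math.log2(math.e))          # ... toward the limit ~1442.7 b/s
```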

Telephone line

- telephone signals are band-limited to 3300 Hz
- SNR = 33 dB, i.e., P/(N_0 W) ≈ 2000
- capacity ≈ 36 kb/s
- practical voice-band modems achieve rates up to 33.6 kb/s in both directions; 56 kb/s modems reach 56 kb/s only on the downlink (asymmetric data rates)
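Plugging the slide's numbers into the band-limited capacity formula to check the 36 kb/s figure:

```python
import math

W, snr = 3300.0, 2000.0              # bandwidth in Hz, SNR of 33 dB (10**3.3 ~ 2000)
C = W * math.log2(1 + snr)
print(C)                             # ~36,200 bits per second, i.e. ~36 kb/s
```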

Summary

- additive white Gaussian noise (AWGN) channel with noise power N and signal power constraint P:

  C = (1/2) log(1 + P/N)

- band-limited channel with bandwidth W:

  C = W log(1 + P/(N_0 W))