Gaussian channel
Information theory 2013, lecture 6
Jens Sjölund (IMT, LiU)
8 May 2013

Outline
1. Definitions
2. The coding theorem for the Gaussian channel
3. Bandlimited channels
4. Parallel Gaussian channels
5. Channels with colored Gaussian noise
6. Gaussian channels with feedback

Introduction

Definition. A Gaussian channel is a time-discrete channel with input $X_i$, output $Y_i$ and noise $Z_i$ at time $i$ such that

    Y_i = X_i + Z_i, \qquad Z_i \sim \mathcal{N}(0, N)

where the $Z_i$ are i.i.d. and independent of $X_i$.

The Gaussian channel is the most important continuous-alphabet channel, modeling a wide range of communication channels. The Gaussian noise assumption is motivated by the central limit theorem, since the noise is typically the cumulative effect of many independent sources.

The power constraint

If the noise variance is zero or the input is unconstrained, the capacity of the channel is infinite. The most common limitation on the input is a power constraint

    E[X_i^2] \le P
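To make the model concrete, here is a minimal simulation sketch (my addition, not part of the lecture; the values of P and N are arbitrary illustrative choices). It drives the channel with a Gaussian input and checks the power constraint empirically.

import numpy as np

rng = np.random.default_rng(0)

P, N = 2.0, 1.0   # assumed power constraint and noise variance (illustrative)
n = 100_000       # number of channel uses

X = rng.normal(0.0, np.sqrt(P), size=n)   # input X ~ N(0, P), meets E[X^2] <= P
Z = rng.normal(0.0, np.sqrt(N), size=n)   # i.i.d. noise, independent of X
Y = X + Z                                 # channel output

print("empirical E[X^2]:", np.mean(X**2))   # ~ P
print("empirical E[Y^2]:", np.mean(Y**2))   # ~ P + N, as used in the capacity proof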

Information capacity

Theorem. The information capacity of a Gaussian channel with power constraint $P$ and noise variance $N$ is

    C = \max_{f(x):\, E[X^2] \le P} I(X;Y) = \frac{1}{2}\log\left(1 + \frac{P}{N}\right)

Proof. Since $E[Y^2] = P + N$, Theorem 8.6.5 gives $h(Y) \le \frac{1}{2}\log 2\pi e (P+N)$. Hence

    I(X;Y) = h(Y) - h(Y|X) = h(Y) - h(X + Z|X) = h(Y) - h(Z|X) = h(Y) - h(Z)
           \le \frac{1}{2}\log 2\pi e (P+N) - \frac{1}{2}\log 2\pi e N
           = \frac{1}{2}\log\frac{P+N}{N} = \frac{1}{2}\log\left(1 + \frac{P}{N}\right)

which holds with equality iff $X \sim \mathcal{N}(0, P)$.
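As a quick numeric check (my addition, with arbitrary values of P and N), the closed form above agrees with the difference of Gaussian differential entropies $h(Y) - h(Z)$ used in the proof.

import numpy as np

def gaussian_capacity(P, N):
    """Capacity in bits per channel use: C = 1/2 * log2(1 + P/N)."""
    return 0.5 * np.log2(1.0 + P / N)

def gaussian_entropy(var):
    """Differential entropy in bits of a N(0, var) variable: 1/2 * log2(2*pi*e*var)."""
    return 0.5 * np.log2(2.0 * np.pi * np.e * var)

P, N = 2.0, 1.0  # illustrative values
print("C           =", gaussian_capacity(P, N))                        # 0.5*log2(3) ≈ 0.792
print("h(Y) - h(Z) =", gaussian_entropy(P + N) - gaussian_entropy(N))  # same value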


Statement of the theorem

Theorem (9.1.1). The capacity of a Gaussian channel with power constraint $P$ and noise variance $N$ is

    C = \frac{1}{2}\log\left(1 + \frac{P}{N}\right)

Proof. See the book. Based on random codes and joint typicality decoding (similar to the proof for the discrete case).

Handwaving motivation

The volume of an $n$-dimensional sphere is proportional to $r^n$. The radius of the received vectors is

    \sqrt{\sum_{i=1}^{n} Y_i^2} \approx \sqrt{n(P + N)}

Using decoding spheres of radius $\sqrt{nN}$, the maximum number of nonintersecting decoding spheres is therefore no more than

    M = \frac{\left(\sqrt{n(P+N)}\right)^n}{\left(\sqrt{nN}\right)^n} = \left(1 + \frac{P}{N}\right)^{n/2} = 2^{\frac{n}{2}\log\left(1 + \frac{P}{N}\right)}

(note the typo in eq. 9.22 in the book). This corresponds to a rate

    R = \frac{\log M}{n} = \frac{1}{n}\log\left(2^{\frac{n}{2}\log\left(1 + \frac{P}{N}\right)}\right) = \frac{1}{2}\log\left(1 + \frac{P}{N}\right) = C
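A small numeric illustration of the counting argument (my addition; n, P and N are arbitrary): the sphere-packing bound on the number of codewords M grows as $2^{nC}$, so $\frac{1}{n}\log_2 M$ recovers the capacity.

import numpy as np

P, N, n = 2.0, 1.0, 50           # illustrative values
M = (1.0 + P / N) ** (n / 2)     # max number of nonintersecting decoding spheres
R = np.log2(M) / n               # corresponding rate in bits per channel use
print("M ≈ %.3e codewords" % M)
print("R = %.3f bits/use, C = %.3f bits/use" % (R, 0.5 * np.log2(1.0 + P / N)))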


Bandlimited channels

A bandlimited channel with white noise can be described as

    Y(t) = (X(t) + Z(t)) * h(t)

where $X(t)$ is the signal waveform, $Z(t)$ is the waveform of the white Gaussian noise, and $h(t)$ is the impulse response of an ideal bandpass filter, which cuts off all frequencies greater than $W$.

Theorem (Nyquist-Shannon sampling theorem). Suppose that a function $f(t)$ is bandlimited to $W$. Then the function is completely determined by samples of the function spaced $\frac{1}{2W}$ seconds apart.

Bandlimited channels

Sampling at $2W$ hertz, the energy per sample is $P/(2W)$. Assuming a noise spectral density of $N_0/2$, the capacity is

    C = \frac{1}{2}\log\left(1 + \frac{P/(2W)}{N_0/2}\right) = \frac{1}{2}\log\left(1 + \frac{P}{N_0 W}\right) \text{ bits per sample}
      = W\log\left(1 + \frac{P}{N_0 W}\right) \text{ bits per second}

which is arguably one of the most famous formulas of information theory.
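For a feel of the numbers, here is a hedged sketch (my addition; the bandwidth and SNR below are illustrative assumptions of roughly telephone-channel magnitude, not values from the lecture).

import numpy as np

def bandlimited_capacity(P, N0, W):
    """Capacity in bits per second of a bandlimited AWGN channel: W * log2(1 + P/(N0*W))."""
    return W * np.log2(1.0 + P / (N0 * W))

W = 3000.0            # assumed bandwidth in hertz
N0 = 1.0              # assumed noise spectral density, arbitrary units
P = 1000.0 * N0 * W   # signal power chosen so that SNR = P/(N0*W) = 1000 (about 30 dB)

print("C ≈ %.0f bits per second" % bandlimited_capacity(P, N0, W))
# with SNR = 1000: C = 3000 * log2(1001) ≈ 29,900 bits per second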


Parallel Gaussian channels

Consider $k$ independent Gaussian channels in parallel with a common power constraint:

    Y_j = X_j + Z_j, \qquad Z_j \sim \mathcal{N}(0, N_j), \qquad j = 1, 2, \ldots, k

    E\left[\sum_{j=1}^{k} X_j^2\right] \le P

The objective is to distribute the total power among the channels so as to maximize the capacity.

Information capacity of parallel Gaussian channels

Following the same lines of reasoning as in the single-channel case, one may show

    I(X_1,\ldots,X_k; Y_1,\ldots,Y_k) \le \sum_j \left(h(Y_j) - h(Z_j)\right) \le \sum_j \frac{1}{2}\log\left(1 + \frac{P_j}{N_j}\right)

where $P_j = E[X_j^2]$ and $\sum_j P_j = P$. Equality is achieved by

    (X_1,\ldots,X_k) \sim \mathcal{N}\left(0, \mathrm{diag}(P_1, P_2, \ldots, P_k)\right)

Maximizing the information capacity

The objective is thus to maximize the capacity subject to the constraint $\sum_j P_j = P$, which can be done with a Lagrange multiplier $\lambda$:

    J(P_1,\ldots,P_k) = \sum_j \frac{1}{2}\log\left(1 + \frac{P_j}{N_j}\right) + \lambda\left(\sum_j P_j - P\right)

    \frac{\partial J}{\partial P_i} = \frac{1}{2(P_i + N_i)} + \lambda = 0 \quad\Longrightarrow\quad P_i = -\frac{1}{2\lambda} - N_i = \nu - N_i

However, $P_i \ge 0$, so the solution needs to be modified to

    P_i = (\nu - N_i)^+ = \begin{cases} \nu - N_i & \text{if } \nu \ge N_i \\ 0 & \text{if } \nu < N_i \end{cases}

Water-filling

This gives the solution

    C = \sum_{j=1}^{k} \frac{1}{2}\log\left(1 + \frac{(\nu - N_j)^+}{N_j}\right)

where $\nu$ is chosen so that $\sum_{j=1}^{k} (\nu - N_j)^+ = P$ (typo in the summary). Power is allotted first to the channels with the least noise; as the total signal power increases, more channels come into use and the level $N_j + P_j$ is equalized at $\nu$ across the channels in use. By analogy to the way water distributes itself in a vessel, this is referred to as water-filling.
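A minimal water-filling sketch (my own, not from the lecture): bisect on the water level $\nu$ until the total allocated power $\sum_j (\nu - N_j)^+$ equals $P$, then evaluate the capacity sum. The noise levels and total power in the example are arbitrary.

import numpy as np

def water_filling(noise, P, tol=1e-12):
    """Allocate total power P over parallel Gaussian channels with noise levels `noise`.

    Returns (power allocation, capacity in bits per channel use, summed over channels).
    """
    noise = np.asarray(noise, dtype=float)
    lo, hi = noise.min(), noise.max() + P        # the water level nu lies in this interval
    while hi - lo > tol:                         # bisection on nu
        nu = 0.5 * (lo + hi)
        if np.maximum(nu - noise, 0.0).sum() > P:
            hi = nu
        else:
            lo = nu
    alloc = np.maximum(lo - noise, 0.0)          # P_j = (nu - N_j)^+
    C = np.sum(0.5 * np.log2(1.0 + alloc / noise))
    return alloc, C

alloc, C = water_filling([1.0, 2.0, 4.0], P=2.0)
print("power allocation:", alloc)   # [1.5, 0.5, 0.0]: the noisiest channel gets no power
print("capacity:", C, "bits per channel use")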


Channels with colored Gaussian noise

Consider parallel channels with dependent noise,

    Y_i = X_i + Z_i, \qquad Z^n \sim \mathcal{N}(0, K_Z)

This can also represent the case of a single channel with Gaussian noise with memory: for a channel with memory, a block of $n$ consecutive uses can be considered as $n$ channels in parallel with dependent noise.

Maximizing the information capacity (1/3)

Let $K_X$ be the input covariance matrix. The power constraint can then be written

    \frac{1}{n}\sum_i E[X_i^2] = \frac{1}{n}\mathrm{tr}(K_X) \le P

As for the independent channels,

    I(X_1,\ldots,X_k; Y_1,\ldots,Y_k) = h(Y_1,\ldots,Y_k) - h(Z_1,\ldots,Z_k)

The entropy of the output is maximal when $Y$ is normally distributed, and since the input and noise are independent, $K_Y = K_X + K_Z$, so

    h(Y_1,\ldots,Y_k) \le \frac{1}{2}\log\left((2\pi e)^n |K_X + K_Z|\right)

Maximizing the information capacity (2/3)

Thus, maximizing the capacity amounts to maximizing $|K_X + K_Z|$ subject to the constraint $\mathrm{tr}(K_X) \le nP$. The eigenvalue decomposition of $K_Z$ gives $K_Z = Q\Lambda Q^t$, where $QQ^t = I$. Then, with $A = Q^t K_X Q$,

    |K_X + K_Z| = |Q|\,|Q^t K_X Q + \Lambda|\,|Q^t| = |Q^t K_X Q + \Lambda| = |A + \Lambda|

and

    \mathrm{tr}(A) = \mathrm{tr}(Q^t K_X Q) = \mathrm{tr}(Q^t Q K_X) = \mathrm{tr}(K_X) \le nP

Maximizing the information capacity (3/3)

From Hadamard's inequality (Theorem 17.9.2),

    |A + \Lambda| \le \prod_i (A_{ii} + \lambda_i)

with equality iff $A$ is diagonal. Since $\sum_i A_{ii} \le nP$ and $A_{ii} \ge 0$ for all $i$, it can be shown using the KKT conditions that the optimal solution is given by

    A_{ii} = (\nu - \lambda_i)^+, \qquad \sum_i (\nu - \lambda_i)^+ = nP

Resulting information capacity

For a channel with colored Gaussian noise,

    C = \frac{1}{n}\sum_{i=1}^{n} \frac{1}{2}\log\left(1 + \frac{(\nu - \lambda_i)^+}{\lambda_i}\right)

where $\lambda_i$ are the eigenvalues of $K_Z$ and $\nu$ is chosen so that $\sum_{i=1}^{n} (\nu - \lambda_i)^+ = nP$ (typo in the summary).
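The same recipe carries over to colored noise by water-filling over the eigenvalues of $K_Z$. A hedged sketch (my addition; the AR(1)-style covariance matrix below is just an example of correlated noise):

import numpy as np

def colored_noise_capacity(K_Z, P, tol=1e-12):
    """Capacity per channel use with noise covariance K_Z and average power P per use."""
    lam = np.linalg.eigvalsh(K_Z)                 # eigenvalues of the noise covariance
    n = len(lam)
    total = n * P                                 # tr(K_X) <= nP
    lo, hi = lam.min(), lam.max() + total
    while hi - lo > tol:                          # bisection on the water level nu
        nu = 0.5 * (lo + hi)
        if np.maximum(nu - lam, 0.0).sum() > total:
            hi = nu
        else:
            lo = nu
    A = np.maximum(lo - lam, 0.0)                 # optimal diagonal A_ii = (nu - lambda_i)^+
    return np.sum(0.5 * np.log2(1.0 + A / lam)) / n

n, P = 8, 1.0
idx = np.arange(n)
K_Z = 0.8 ** np.abs(np.subtract.outer(idx, idx))  # example covariance K_Z[i,j] = 0.8**|i-j|
print("C ≈ %.3f bits per channel use" % colored_noise_capacity(K_Z, P))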


Gaussian channels with feedback

    Y_i = X_i + Z_i, \qquad Z^n \sim \mathcal{N}(0, K_Z^{(n)})

Because of the feedback, $X^n$ and $Z^n$ are not independent; $X_i$ is causally dependent on the past values of $Z$.

Capacity with and without feedback

For memoryless Gaussian channels, feedback does not increase capacity. However, for channels with memory, it does.

Capacity without feedback:

    C_n = \max_{\mathrm{tr}(K_X) \le nP} \frac{1}{2n}\log\frac{|K_X + K_Z|}{|K_Z|} = \frac{1}{2n}\sum_i \log\left(1 + \frac{(\lambda - \lambda_i^{(n)})^+}{\lambda_i^{(n)}}\right)

Capacity with feedback:

    C_{n,FB} = \max_{\mathrm{tr}(K_X) \le nP} \frac{1}{2n}\log\frac{|K_{X+Z}|}{|K_Z|}

where $\lambda_i^{(n)}$ are the eigenvalues of $K_Z^{(n)}$.

Bounds on the capacity with feedback

Theorem. $C_{n,FB} \le C_n + \frac{1}{2}$

Theorem (Pinsker's statement). $C_{n,FB} \le 2 C_n$

In words, Gaussian channel capacity is not increased by more than half a bit, or by more than a factor of 2, when we have feedback.