Channel capacity

Outline:
1. Source entropy
2. Discrete memoryless channel
3. Mutual information
4. Channel capacity
5. Exercises
1. Source entropy

Let X be a memoryless symbol source. The source alphabet consists of J different symbols $x_0, x_1, \ldots, x_{J-1}$. Each symbol is associated with an emission probability $p(x_0), p(x_1), \ldots, p(x_{J-1})$, with

$$\sum_{j=0}^{J-1} p(x_j) = 1.$$

To each symbol we associate its specific information:

$$i(x_j) = -\log_2 p(x_j).$$

The source entropy is then defined by

$$H(X) = -\sum_{j=0}^{J-1} p(x_j) \log_2 p(x_j),$$

expressed in bit/symbol: it is the average information per symbol.

ENTROPY = UNCERTAINTY = INFORMATION
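As a quick illustration, here is a minimal Python sketch of the entropy formula above (the function name `entropy` and the example distributions are ours):

```python
import math

def entropy(probs):
    """Source entropy H(X) = -sum_j p(x_j) log2 p(x_j), in bit/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform 4-symbol source maximises the entropy at log2(4) = 2 bit/symbol;
# any non-uniform distribution carries less average information.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```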
2. Discrete memoryless channel

[Figure: channel diagram — input symbols $x_0, x_1, \ldots, x_{J-1}$ on the left, output symbols $y_0, y_1, \ldots, y_{K-1}$ on the right, connected by the channel $p(y_k \mid x_j)$.]

Because of the noise on the channel, the source and destination alphabets might be different.

$p(y_k \mid x_j)$: transition probabilities.
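In code, a discrete memoryless channel is conveniently represented by a row-stochastic transition matrix; a small sketch (the matrix values are a hypothetical example):

```python
# P[j][k] = p(y_k | x_j): one row per input symbol, one column per output symbol.
# Hypothetical example with J = 2 inputs and K = 2 outputs.
P = [[0.9, 0.1],
     [0.2, 0.8]]

# Every row must sum to 1: each input symbol is mapped to *some* output symbol.
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)
```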
3. Mutual information

We observe $Y = y_k$. What uncertainty remains on X? We define the entropy of X conditioned on $Y = y_k$:

$$H(X \mid Y = y_k) = -\sum_{j=0}^{J-1} p(x_j \mid y_k) \log_2 p(x_j \mid y_k).$$

Averaging over Y:

$$H(X \mid Y) = \sum_{k=0}^{K-1} p(y_k)\, H(X \mid Y = y_k)
             = -\sum_{k=0}^{K-1} \sum_{j=0}^{J-1} p(y_k)\, p(x_j \mid y_k) \log_2 p(x_j \mid y_k)
             = -\sum_{k=0}^{K-1} \sum_{j=0}^{J-1} p(x_j, y_k) \log_2 p(x_j \mid y_k).$$

The average mutual information is defined by

$$I(X;Y) = H(X) - H(X \mid Y).$$
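These definitions translate directly into code; a minimal sketch, assuming the channel is given as a transition matrix as in the previous section (the function name `mutual_information` is ours):

```python
import math

def mutual_information(px, P):
    """I(X;Y) = H(X) - H(X|Y), for input distribution px and P[j][k] = p(y_k|x_j)."""
    J, K = len(px), len(P[0])
    py = [sum(px[j] * P[j][k] for j in range(J)) for k in range(K)]  # p(y_k)
    H_X = -sum(p * math.log2(p) for p in px if p > 0)
    H_X_given_Y = 0.0
    for j in range(J):
        for k in range(K):
            p_jk = px[j] * P[j][k]              # joint probability p(x_j, y_k)
            if p_jk > 0:
                # p(x_j | y_k) = p(x_j, y_k) / p(y_k)
                H_X_given_Y -= p_jk * math.log2(p_jk / py[k])
    return H_X - H_X_given_Y

print(mutual_information([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]]))  # ~0.531 bit
```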
$$I(X;Y) = H(X) - H(X \mid Y)$$

Two particular cases:
1. Channel without noise: $H(X \mid Y) = 0$, so $I(X;Y) = H(X)$: the channel conveys only useful information.
2. Very noisy channel: $H(X \mid Y) = H(X)$, so $I(X;Y) = 0$: the channel doesn't convey any useful information.

Remark: the mutual information is symmetric, $I(X;Y) = I(Y;X)$.
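Both limiting cases can be checked numerically with the `mutual_information` sketch from the previous section (the identity and uniform matrices are our illustrative choices):

```python
# Channel without noise: Y determines X, so H(X|Y) = 0 and I(X;Y) = H(X) = 1 bit.
print(mutual_information([0.5, 0.5], [[1.0, 0.0], [0.0, 1.0]]))  # 1.0

# Very noisy channel: Y is independent of X, so H(X|Y) = H(X) and I(X;Y) = 0.
print(mutual_information([0.5, 0.5], [[0.5, 0.5], [0.5, 0.5]]))  # 0.0
```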
4. Channel capacity

Definition:

$$C_s = \max_{p(x_j)} I(X;Y),$$

expressed in bit/symbol. If s is the symbol transmission rate (symbol/s), then $C = s\,C_s$ is the channel capacity in bit/s.

Binary symmetric channel case

[Figure: BSC diagram — input $x_0 = 0$ with probability $p(x_0) = 1 - \alpha$, input $x_1 = 1$ with probability $p(x_1) = \alpha$; each bit is received correctly with probability $1 - p_e$ and inverted with probability $p_e$.]

Here $J = K = 2$. The mutual information is given by

$$I(X;Y) = H(Y) - H(Y \mid X).$$
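The maximisation over the input distribution can be approximated by a brute-force sweep; a self-contained sketch for the binary symmetric channel (function and variable names are ours), confirming that the maximum is reached for equiprobable inputs:

```python
import math

def bsc_I(alpha, pe):
    """I(X;Y) = H(Y) - H(Y|X) for a BSC with p(x_1) = alpha and error probability pe."""
    h = lambda p: 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    p_y1 = alpha * (1 - pe) + (1 - alpha) * pe  # probability of receiving a 1
    return h(p_y1) - h(pe)

# Sweep alpha over a grid: I(X;Y) peaks at alpha = 0.5, which gives C_s.
best_I, best_alpha = max((bsc_I(a / 100, 0.1), a / 100) for a in range(1, 100))
print(best_alpha, best_I)  # 0.5, ~0.531 bit/symbol
```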
Computation of $H(Y \mid X)$:

$$H(Y \mid X) = -\sum_{j=0}^{1} \sum_{k=0}^{1} p(x_j)\, p(y_k \mid x_j) \log_2 p(y_k \mid x_j)$$
$$= -(1-\alpha)(1-p_e)\log_2(1-p_e) - (1-\alpha)\,p_e\log_2 p_e - \alpha(1-p_e)\log_2(1-p_e) - \alpha\,p_e\log_2 p_e$$
$$= -(1-p_e)\log_2(1-p_e) - p_e\log_2 p_e,$$

independent of the $p(x_j)$; it may be considered as a channel entropy. Therefore

$$I(X;Y) = H(Y) + (1-p_e)\log_2(1-p_e) + p_e\log_2 p_e$$

and

$$C_s = \max_{p(x_j)} I(X;Y) = 1 + (1-p_e)\log_2(1-p_e) + p_e\log_2 p_e.$$

NRZ baseband transmission case:

$$p_e = \frac{1}{2}\,\mathrm{erfc}\!\left(\sqrt{\frac{E_b}{N_0}}\right).$$
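The closed-form capacity, combined with the NRZ error probability, gives the curve of $C_s$ versus $E_b/N_0$ shown on the next slide; a minimal sketch (function names are ours):

```python
import math

def bsc_capacity(pe):
    """C_s = 1 + (1-pe) log2(1-pe) + pe log2(pe), in bit/symbol."""
    if pe in (0.0, 1.0):
        return 1.0
    return 1 + (1 - pe) * math.log2(1 - pe) + pe * math.log2(pe)

def pe_nrz(eb_n0_db):
    """p_e = 1/2 erfc(sqrt(Eb/N0)) for bipolar NRZ baseband transmission."""
    return 0.5 * math.erfc(math.sqrt(10 ** (eb_n0_db / 10)))

for db in (0, 4, 8):
    pe = pe_nrz(db)
    print(f"Eb/N0 = {db} dB  ->  pe = {pe:.2e},  Cs = {bsc_capacity(pe):.4f} bit/symbol")
```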
[Figure: $C_s$ (bit/symbol) versus $E_b/N_0$ (dB) for channels with 2, 4, 8 and 16 states.]

Shannon theorem

Continuous input and output alphabets. Example:

$$Y = X + N(0, \sigma_N^2).$$

Then

$$C_s = \frac{1}{2}\log_2\left(1 + \frac{\sigma_X^2}{\sigma_N^2}\right) \text{ [bit/symbol]},$$

where $\sigma_X^2$ is the input power. If the channel bandwidth is equal to B, its capacity is given by

$$C = B\log_2\left(1 + \frac{\sigma_X^2}{\sigma_N^2}\right) \text{ [bit/s]}$$

(Shannon–Hartley relation).
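The Shannon–Hartley relation is a one-liner; a minimal sketch (the 3 kHz / 30 dB figures are a hypothetical example, not taken from the exercises):

```python
import math

def shannon_hartley(bandwidth_hz, snr_linear):
    """C = B log2(1 + S/N), in bit/s; snr_linear is a power ratio, not dB."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical example: a 3 kHz channel with a 30 dB signal-to-noise ratio.
print(shannon_hartley(3e3, 10 ** (30 / 10)) / 1e3)  # ~29.9 kb/s
```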
Information rate: $R = s\,H(X)$. If $R < C$, there exists a source and channel encoding scheme that achieves error-free transmission.
5. Exercises

1. Determine the capacity of the discrete channel whose transition probabilities are given by
   $$\begin{pmatrix} 1-p & 0 & p \\ 0 & 1-p & p \end{pmatrix}$$
   (inputs $\{0, 1\}$, outputs $\{0, 1, 2\}$: each input is received correctly with probability $1-p$ and erased to the symbol 2 with probability $p$).

2. Two binary symmetric transmission channels of error probability p are cascaded. Determine the global channel capacity.

3. We consider a channel with additive white Gaussian noise whose bandwidth is 4 kHz and whose noise power spectral density is $N_0/2 = 10^{-12}$ W/Hz. The required signal power at the receiver is 0.1 mW. Compute the channel capacity.

4. An analog signal with a bandwidth of 4 kHz is sampled at 1.25 times the Nyquist frequency, and each sample is quantized into 256 levels of equal probability. We assume that the samples are statistically independent.
   (a) What is the source information rate?
   (b) Is it possible to transmit the signals from this source without errors over a channel subject to additive white Gaussian noise with a bandwidth of 10 kHz and a signal-to-noise ratio of 20 dB?
   (c) Compute the signal-to-noise ratio required to ensure error-free transmission under the conditions stated in (b).
   (d) Compute the bandwidth required to transmit the signals from the same source without errors through a channel with additive white Gaussian noise and a signal-to-noise ratio of 20 dB.
5. We want to design a transmission system for packets of 1500 bytes. We impose the use of a two-state digital phase modulation (PSK-2) and require that 99% of the packets arrive entirely correct at the receiver (i.e. the packet error rate must be less than 1%).
   (a) If the noise power spectral density $N_0/2$ is $10^{-2}$ W/Hz, what is the energy per bit $E_b$?
   (b) Determine the maximum theoretical value of the channel capacity.
   (c) Determine the actual value of the channel capacity under the conditions of this question.

Remark: the error probability for a bipolar NRZ signal is $p_e = \frac{1}{2}\,\mathrm{erfc}\!\left(\sqrt{E_b/N_0}\right)$ (cf. Section 4).
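As a rough numerical check (variable names are ours, and independent bit errors are assumed for exercise 5), the sketch below reproduces the figures of exercises 3, 4(a), 4(b) and the packet-to-bit error conversion underlying exercise 5:

```python
import math

# Exercise 3: AWGN channel, B = 4 kHz, N0/2 = 1e-12 W/Hz, 0.1 mW at the receiver.
B = 4e3
snr = 1e-4 / (2e-12 * B)                 # signal power / (N0 * B)
print(B * math.log2(1 + snr) / 1e3)      # ~54.44 kb/s

# Exercise 4(a): sampling a 4 kHz signal at 1.25 x Nyquist, log2(256) = 8 bits/sample.
R = 1.25 * (2 * 4e3) * math.log2(256)
print(R / 1e3)                           # 80 kb/s

# Exercise 4(b): B = 10 kHz, SNR = 20 dB = 100; C < R, so error-free transmission
# is impossible on this channel.
print(10e3 * math.log2(1 + 100) / 1e3)   # ~66.6 kb/s

# Exercise 5: a 1500-byte packet is entirely correct only if all 12000 bits are,
# so (1 - p_b)**12000 >= 0.99 bounds the tolerable bit error probability p_b.
print(1 - 0.99 ** (1 / (1500 * 8)))      # ~8.4e-7
```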
Answers

1. $1 - p$.
2. $1 + 2p(1-p)\log_2[2p(1-p)] + (1 - 2p + 2p^2)\log_2(1 - 2p + 2p^2)$.
3. 54.44 kb/s.
4. (a) 80 kb/s. (b) $C = 66.6$ kb/s; it is not possible to transmit without errors. (c) 24.1 dB. (d) 12 kHz.
5. (a) $E_b = 0.252$ J. (b) $C_{s,\max} = 1.88$ bit/symbol. (c) $C_s = 0.919$ bit/symbol.