Information Theory and Coding

1. The capacity of a band-limited additive white Gaussian noise (AWGN) channel is given by C = W log2(1 + P/(sigma^2 W)) bits per second (bps), where W is the channel bandwidth, P is the average received power, and sigma^2 is the one-sided power spectral density of the AWGN. For a fixed P/sigma^2 = 1000, the channel capacity (in kbps) with infinite bandwidth (W -> infinity) is approximately
(a) 1.44
(b) 1.08
(c) 0.72
(d) 0.36
[GATE 2014: 1 Mark]

Sol. C = W log2(1 + P/(sigma^2 W))
= (P/sigma^2) * (sigma^2 W / P) log2(1 + P/(sigma^2 W))
As W -> infinity,
C = (P/sigma^2) * lim_{x -> infinity} [x log2(1 + 1/x)], where x = sigma^2 W / P
= (P/sigma^2) log2 e = 1.44 * (P/sigma^2)
= 1.44 * 1000 bps = 1.44 kbps
Option (a)
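As a numerical cross-check on this infinite-bandwidth limit (an illustrative sketch, not part of the exam solution; the function name and sample bandwidths are my own), the capacity expression can be evaluated for growing W:

```python
import math

def awgn_capacity_bps(W, p_over_n0):
    """C = W * log2(1 + (P/sigma^2) / W) in bits per second."""
    return W * math.log2(1 + p_over_n0 / W)

P_OVER_N0 = 1000  # fixed P/sigma^2 = 1000, as in the question

# As W grows, C approaches (P/sigma^2) * log2(e) ~ 1442.7 bps ~ 1.44 kbps
for W in (1e3, 1e5, 1e7):
    print(f"W = {W:.0e} Hz -> C = {awgn_capacity_bps(W, P_OVER_N0):.1f} bps")

limit_bps = P_OVER_N0 * math.log2(math.e)
print(f"limit = {limit_bps:.1f} bps")  # 1442.7 bps, i.e. about 1.44 kbps
```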
2. A fair coin is tossed repeatedly until a Head appears for the first time. Let L be the number of tosses to get this first Head. The entropy H(L) in bits is ______
[GATE 2014: 2 Marks]

Sol. If 1 toss is required to get the first head, the probability P1 = 1/2
If 2 tosses are required, P2 = (1/2)(1/2) = 1/4
If 3 tosses are required, P3 = (1/2)(1/2)(1/2) = 1/8, and in general Pk = (1/2)^k

Entropy H = sum over k of Pk log2(1/Pk)
= (1/2) log2 2 + (1/4) log2 4 + (1/8) log2 8 + (1/16) log2 16 + ...
= 1/2 + 2/2^2 + 3/2^3 + 4/2^4 + ... = 2

H(L) = 2 bits

3. The capacity of a Binary Symmetric Channel (BSC) with cross-over probability 0.5 is ______
[GATE 2014: 1 Mark]

Sol. [Channel diagram: inputs X in {0, 1}, outputs Y in {0, 1}; correct transitions (0 -> 0, 1 -> 1) with probability (1 - p), cross-overs (0 -> 1, 1 -> 0) with probability p]
Given cross-over probability p = 0.5
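The series 1/2 + 2/2^2 + 3/2^3 + ... = 2 used in Q2 above can be checked numerically with a short sketch (variable names are my own):

```python
import math

# P(L = k) = (1/2)**k for the first head appearing on toss k of a fair coin.
# Each term p * log2(1/p) reduces to k / 2**k, whose sum converges to 2.
terms = [(0.5 ** k) * math.log2(1 / 0.5 ** k) for k in range(1, 200)]
H = sum(terms)
print(f"H(L) = {H:.6f} bits")  # -> H(L) = 2.000000 bits
```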
P(x1) = 1/2, P(x2) = 1/2
Channel capacity of the BSC:
C = 1 + p log2 p + (1 - p) log2(1 - p)
= 1 + (1/2) log2(1/2) + (1/2) log2(1/2)
= 1 - 1/2 - 1/2 = 0
C = 0. Capacity = 0

4. In a digital communication system, transmissions of successive bits through a noisy channel are assumed to be independent events with error probability p. The probability of at most one error in the transmission of an 8-bit sequence is
(a) 7(1 - p)/8 + p/8
(b) (1 - p)^8 + 8p(1 - p)^7
(c) (1 - p)^8 + (1 - p)^7
(d) (1 - p)^8 + p(1 - p)^7
[GATE 1988: 2 Marks]

Sol. Let a bit error count as a "success" with probability p, and let X be the number of errors in the 8-bit sequence.
P(at most one error) = P(X = 0) + P(X = 1)
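Returning to Q3, the BSC capacity C(p) = 1 + p log2 p + (1 - p) log2(1 - p) = 1 - H_b(p) can be sketched as follows (illustrative code; function names are my own):

```python
import math

def binary_entropy(p):
    """H_b(p) = -p log2 p - (1 - p) log2(1 - p), with H_b(0) = H_b(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity (bits per channel use) of a BSC with cross-over probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.5))  # 0.0 -- output independent of input, channel useless
print(bsc_capacity(0.0))  # 1.0 -- noiseless binary channel
```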
Note that the probability that an event occurs r times in n independent trials is given by the binomial probability mass function
P(X = r) = nCr p^r (1 - p)^(n - r)
P(X = 0) + P(X = 1) = 8C0 p^0 (1 - p)^8 + 8C1 p^1 (1 - p)^7
= (1 - p)^8 + 8p(1 - p)^7
Option (b)

5. Consider a Binary Symmetric Channel (BSC) with probability of error being p. To transmit a bit, say 1, we transmit a sequence of three 1s. The receiver interprets the received sequence to represent 1 if at least two bits are 1. The probability that the transmitted bit will be received in error is
(a) p^3 + 3p^2(1 - p)
(b) p^3
(c) (1 - p)^3
(d) p^3 + p^2(1 - p)
[GATE 2008: 2 Marks]

Sol. P(1|0) = P(0|1) = p; P(1|1) = P(0|0) = 1 - p
Reception with error means getting at most one 1 among the three received bits.
P(reception with error) = P(X = 0) + P(X = 1), where X is the number of 1s received.
Using the binomial probability mass function P(X = r) = nCr (1 - p)^r p^(n - r) for r = 0, 1, 2, ..., n with n = 3:
= 3C0 (1 - p)^0 p^3 + 3C1 (1 - p)^1 p^2
= p^3 + 3p^2(1 - p)
Option (a)

6. During transmission over a certain binary communication channel, bit errors occur independently with probability p. The probability of at most one bit in error in a block of n bits is given by
(a) p^n
(b) 1 - p^n
(c) np(1 - p)^(n - 1) + (1 - p)^n
(d) 1 - (1 - p)^n
[GATE 2007: 2 Marks]
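Q5's repetition-code result can be cross-checked by enumerating all 2^3 error patterns (a brute-force sketch; names are my own):

```python
from itertools import product

def majority_error_prob(p):
    """P(decoder error) for a 3-bit repetition code over a BSC(p):
    the decoded bit is wrong when at least 2 of the 3 bits are flipped."""
    total = 0.0
    for flips in product([0, 1], repeat=3):  # 1 = that bit was flipped
        if sum(flips) >= 2:
            prob = 1.0
            for f in flips:
                prob *= p if f else (1 - p)
            total += prob
    return total

p = 0.1
closed_form = p**3 + 3 * p**2 * (1 - p)    # option (a)
print(majority_error_prob(p), closed_form)  # both approximately 0.028
```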
Sol. Probability of at most one bit in error = P(no error) + P(one bit in error)
Using the binomial probability mass function P(X = r) = nCr p^r (1 - p)^(n - r):
= nC0 p^0 (1 - p)^n + nC1 p^1 (1 - p)^(n - 1)
= (1 - p)^n + np(1 - p)^(n - 1)
Note: nC0 = 1 and nC1 = n
Option (c)

7. Let U and V be two independent and identically distributed random variables such that P(U = +1) = P(U = -1) = 1/2. The entropy H(U + V) in bits is
(a) 3/4
(b) 1
(c) 3/2
(d) log2 3
[GATE 2013: 2 Marks]

Sol. U and V are two independent and identically distributed random variables with
P(U = +1) = P(U = -1) = 1/2
P(V = +1) = P(V = -1) = 1/2
So U and V each take the values +1 and -1, and
U + V = +2 when U = +1, V = +1
U + V = 0 when U = +1, V = -1 or U = -1, V = +1
U + V = -2 when U = -1, V = -1
P(U + V = +2) = (1/2)(1/2) = 1/4
P(U + V = 0) = (1/2)(1/2) + (1/2)(1/2) = 1/2
P(U + V = -2) = (1/2)(1/2) = 1/4
Entropy of (U + V):
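The n-bit formula of Q6 (which reduces to Q4's answer at n = 8) can be verified against brute-force enumeration of error patterns (illustrative sketch; names are my own):

```python
from itertools import product

def at_most_one_error_closed(n, p):
    """Option (c): (1 - p)^n + n p (1 - p)^(n - 1)."""
    return (1 - p) ** n + n * p * (1 - p) ** (n - 1)

def at_most_one_error_enum(n, p):
    """Sum the probabilities of all n-bit error patterns with <= 1 error."""
    total = 0.0
    for pattern in product([0, 1], repeat=n):  # 1 = bit in error
        if sum(pattern) <= 1:
            prob = 1.0
            for bit in pattern:
                prob *= p if bit else (1 - p)
            total += prob
    return total

for n in (3, 8):
    assert abs(at_most_one_error_closed(n, 0.05) - at_most_one_error_enum(n, 0.05)) < 1e-12
print("closed form matches enumeration for n = 3 and n = 8")
```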
H(U + V) = sum of P(U + V) log2(1 / P(U + V))
= (1/4) log2 4 + (1/2) log2 2 + (1/4) log2 4
= 2/4 + 1/2 + 2/4 = 3/2
Option (c)

8. A source alphabet consists of N symbols with the probability of the first two symbols being the same. A source encoder increases the probability of the first symbol by a small amount e and decreases that of the second by e. After encoding, the entropy of the source
(a) increases
(b) remains the same
(c) increases only if N = 2
(d) decreases
[GATE 2012: 1 Mark]

Sol. Entropy is maximum when the symbols are equally probable. When the probabilities change from equal to unequal, the entropy decreases.
Option (d)

9. A communication channel with AWGN is operating at a signal-to-noise ratio SNR >> 1 and bandwidth B, and has capacity C1. If the SNR is doubled keeping B constant, the resulting capacity C2 is given by
(a) C2 = 2C1 (approx.)
(b) C2 = C1 + B (approx.)
(c) C2 = C1 + 2B (approx.)
(d) C2 = C1 + 0.3B (approx.)
[GATE 2009: 2 Marks]

Sol. When SNR >> 1, the channel capacity is
C1 = B log2(1 + S/N), approximately B log2(S/N)
When the SNR is doubled,
C2 = B log2(2S/N) = B log2 2 + B log2(S/N), approximately
C2 = B log2(S/N) + B = C1 + B, approximately
Option (b)

10. A memoryless source emits n symbols each with a probability p. The entropy of the source as a function of n
(a) increases as log n
(b) decreases as log(1/n)
(c) increases as n
(d) increases as n log n
[GATE 2008: 2 Marks]

Sol. Entropy H(m) of the memoryless source:
H(m) = sum over i of Pi log2(1/Pi) bits
Pi = probability of an individual symbol = 1/n (P1 = P2 = ... = Pn)
H(m) = sum over the n symbols of (1/n) log2 n = log2 n
The entropy H(m) increases as log2 n.
Option (a)

11. A source generates three symbols with probabilities 0.25, 0.25, 0.50 at a rate of 3000 symbols per second. Assuming independent generation of symbols, the most efficient source encoder would have an average bit rate of
(a) 6000 bits/sec
(b) 4500 bits/sec
(c) 3000 bits/sec
(d) 1500 bits/sec
[GATE 2006: 2 Marks]
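Two of the results above can be checked numerically in one short sketch (the values B = 1 MHz and SNR = 1000 are my own illustrative choices): the pmf and entropy of U + V from Q7, and the high-SNR capacity shift of Q9.

```python
import math
from collections import Counter
from itertools import product

# Q7: tabulate the pmf of U + V for U, V uniform on {+1, -1}
pmf = Counter()
for u, v in product([+1, -1], repeat=2):
    pmf[u + v] += 0.25  # each (u, v) pair has probability 1/2 * 1/2
H = sum(p * math.log2(1 / p) for p in pmf.values())
print(dict(pmf))  # {2: 0.25, 0: 0.5, -2: 0.25}
print(H)          # 1.5 bits -- option (c)

# Q9: doubling a large SNR adds about B bits/sec to the capacity
B, snr = 1e6, 1000.0
C1 = B * math.log2(1 + snr)
C2 = B * math.log2(1 + 2 * snr)
print((C2 - C1) / B)  # close to 1.0, so C2 is approximately C1 + B -- option (b)
```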
Sol. Three symbols with probabilities 0.25, 0.25 and 0.50, at a rate r = 3000 symbols per second.
Entropy H = 0.25 log2(1/0.25) + 0.25 log2(1/0.25) + 0.5 log2(1/0.5)
= 0.25 x 2 + 0.25 x 2 + 0.5 x 1 = 1.5 bits/symbol
Rate of information R = r x H = 3000 x 1.5 = 4500 bits/sec
Option (b)

12. An image uses 512 x 512 picture elements. Each of the picture elements can take any of the 8 distinguishable intensity levels. The maximum entropy in the above image will be
(a) 2097152 bits
(b) 786432 bits
(c) 648 bits
(d) 144 bits
[GATE 1990: 2 Marks]

Sol. For 8 distinguishable intensity levels, the maximum entropy per pixel is
H = log2 L = log2 8 = 3 bits
Maximum entropy of the image = 512 x 512 x 3 = 786432 bits
Option (b)

13. A source produces 4 symbols with probabilities 1/2, 1/4, 1/8 and 1/8. For this source, a practical coding scheme has an average codeword length of 2 bits/symbol. The efficiency of the code is
(a) 1
(b) 7/8
(c) 1/2
(d) 1/4
[GATE 1989: 2 Marks]
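Q11's information rate and Q12's maximum image entropy follow from the same entropy function; a minimal sketch (the helper name is my own):

```python
import math

def entropy_bits(probs):
    """H = sum of p * log2(1/p), in bits per symbol."""
    return sum(p * math.log2(1 / p) for p in probs)

# Q11: three symbols at 3000 symbols/sec
H = entropy_bits([0.25, 0.25, 0.5])
print(H)         # 1.5 bits/symbol
print(3000 * H)  # 4500.0 bits/sec -- option (b)

# Q12: 512 x 512 pixels, 8 equally likely levels -> log2(8) = 3 bits/pixel
print(512 * 512 * math.log2(8))  # 786432.0 bits -- option (b)
```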
Sol. Four symbols with probabilities 1/2, 1/4, 1/8 and 1/8.
Entropy H = sum over i of Pi log2(1/Pi)
= (1/2) log2 2 + (1/4) log2 4 + (1/8) log2 8 + (1/8) log2 8
= 1/2 + 2/4 + 3/8 + 3/8 = 1 + 3/4 = 7/4 bits/symbol
Code efficiency = H / L = (7/4) / 2 = 7/8
Option (b)
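As a closing check on Q13 (an illustrative sketch; the optimal codeword lengths quoted are a standard result for this dyadic distribution, not part of the original solution): the source entropy is 7/4 bits/symbol, so the stated 2-bit/symbol scheme has efficiency 7/8, while an optimal code with lengths {1, 2, 3, 3} would reach efficiency 1.

```python
import math

probs = [0.5, 0.25, 0.125, 0.125]
H = sum(p * math.log2(1 / p) for p in probs)
print(H)            # 1.75 bits/symbol (= 7/4)

L_avg = 2.0         # average codeword length of the scheme in the question
print(H / L_avg)    # 0.875 (= 7/8) -- option (b)

# Optimal (Huffman) codeword lengths for this dyadic source are 1, 2, 3, 3,
# e.g. codewords 0, 10, 110, 111; the average length then equals the entropy.
L_opt = sum(p * l for p, l in zip(probs, [1, 2, 3, 3]))
print(L_opt)        # 1.75 -- efficiency H / L_opt = 1.0
```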