ECE 534: Elements of Information Theory
Solutions to Midterm Exam (Spring 2006)

Problem 1 [20 pts.] A discrete memoryless source has an alphabet of three letters, x = 1, 2, 3, with probabilities 0.4, 0.4, and 0.2, respectively.
(a) Find the binary Huffman code for this source and determine the average number of bits needed for each source letter.
(b) Suppose two letters at a time are encoded into a binary sequence. Find the Huffman code and the average number of bits needed per source letter.

Solution:
(a) One possible Huffman code is: C(1) = 0, C(2) = 10, and C(3) = 11. The average number of bits per source letter is 0.4 * 1 + (0.4 + 0.2) * 2 = 1.6.
(b) One possible code construction is: C(11) = 000, C(12) = 001, C(13) = 010, C(21) = 011, C(22) = 100, C(23) = 101, C(31) = 110, C(32) = 1110, and C(33) = 1111. All codewords have length 3 except those for 32 and 33, which have length 4, so the average number of bits per source letter is [3 + (0.08 + 0.04)]/2 = 1.56.

Problem 2 [20 pts.] A source X produces letters from a three-symbol alphabet with the probability assignment P_X(1) = 1/4, P_X(2) = 1/4, and P_X(3) = 1/2. Each source letter is transmitted through two channels simultaneously with outputs y and z and the transition probabilities indicated below:

[Figure: channel transition diagrams P(y|x) and P(z|x). Channel 1: P(y=1|x=1) = P(y=2|x=2) = 1 and P(y=1|x=3) = P(y=2|x=3) = 1/2. Channel 2: P(z=1|x=3) = 1 and P(z=2|x=1) = P(z=2|x=2) = 1.]

Calculate H(X), H(Y), H(Z), H(Y,Z), I(X;Y), I(X;Z), I(X;Y|Z), and I(X;Y,Z).

Solution:

H(X) = (1/4) log 4 + (1/4) log 4 + (1/2) log 2 = 1.5
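The Huffman averages in Problem 1 can be double-checked mechanically. The sketch below is an illustrative heap-based Huffman construction (not from the exam); symbols are indexed 0-2 in place of the letters 1-3, and only codeword lengths are computed, since the average length depends only on the lengths.

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Return optimal binary prefix-code lengths (Huffman's algorithm)."""
    tiebreak = count()  # unique integers so tied probabilities never compare dicts
    heap = [(p, next(tiebreak), {sym: 0}) for sym, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: l + 1 for s, l in {**c1, **c2}.items()}  # one level deeper
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]  # symbol -> codeword length

# (a) single letters
p = [0.4, 0.4, 0.2]
lengths = huffman_lengths(p)
avg_a = sum(p[s] * l for s, l in lengths.items())

# (b) pairs of letters: product distribution over the 9 pairs
pairs = [pi * pj for pi in p for pj in p]
avg_b = sum(pairs[s] * l for s, l in huffman_lengths(pairs).items()) / 2

print(round(avg_a, 2), round(avg_b, 2))  # 1.6 1.56
```

Any tie-breaking rule yields the same optimal averages, 1.6 and 1.56 bits per source letter, matching the solution.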
The probability distribution of Y is P_Y(1) = 1/4 + 1/4 = 1/2. Therefore,

H(Y) = 1

Similarly, P_Z(1) = P_Z(2) = 1/2 and

H(Z) = 1

If Z = 1, then X = 3, and Y = 1, 2 with equal probability; if Z = 2, then X = 1, 2 with equal probability and, as a consequence, Y = 1, 2 with equal probability. Therefore,

H(Y|Z) = P_Z(1) H(Y|Z=1) + P_Z(2) H(Y|Z=2) = 1

and

H(Y,Z) = H(Z) + H(Y|Z) = 1 + 1 = 2

Since H(Y|X=1) = H(Y|X=2) = 0 and H(Y|X=3) = 1, we have

I(X;Y) = H(Y) - H(Y|X) = 1 - 0.5 = 0.5

Since H(Z|X) = 0, we have

I(X;Z) = H(Z) - H(Z|X) = H(Z) = 1

Since Z is completely determined by X, H(Y|X,Z) = H(Y|X) = 0.5. We have

I(X;Y|Z) = H(Y|Z) - H(Y|X,Z) = 1 - 0.5 = 0.5

and

I(X;Y,Z) = I(X;Z) + I(X;Y|Z) = 1 + 0.5 = 1.5

Problem 3 [30 pts.]
a) Prove that the number of elements in the δ-typical set A(n,δ) satisfies |A(n,δ)| <= 2^{nH(X)+nδ}.
b) Prove that |A(n,δ)| >= (1 - δ) 2^{nH(X)-nδ} for sufficiently large n.
c) Prove that the expected length L of a D-ary prefix code for a random variable X satisfies L log D >= H(X). (Hint: use Kraft's inequality Σ_i D^{-l_i} <= 1.)

Solution:
a) A(n,δ) contains the set of sequences (x_1, ..., x_n) such that |-(1/n) Σ_{i=1}^n log P_X(x_i) - H(X)| <= δ. Therefore, every sequence in A(n,δ) satisfies:

2^{-nH(X)-nδ} <= P(x_1, ..., x_n) <= 2^{-nH(X)+nδ}    (1)

Using the lower bound in (1), we have

1 >= P[A(n,δ)] = Σ_{(x_1,...,x_n) in A(n,δ)} P(x_1, ..., x_n) >= |A(n,δ)| 2^{-nH(X)-nδ}

and hence

|A(n,δ)| <= 2^{nH(X)+nδ}
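The eight quantities in Problem 2 can also be verified by brute force from the joint distribution of (X, Y, Z). The sketch below assumes the channel structure implied by the solution (y = x for x in {1, 2}, y uniform on {1, 2} for x = 3, and z = 1 exactly when x = 3); `H` is an illustrative helper that computes the entropy of any marginal.

```python
from math import log2

# Joint pmf of (X, Y, Z) implied by the two channels:
# y = x for x in {1, 2}; y uniform on {1, 2} for x = 3; z = 1 iff x = 3.
p_x = {1: 0.25, 2: 0.25, 3: 0.5}
joint = {}  # (x, y, z) -> probability
for x, px in p_x.items():
    z = 1 if x == 3 else 2
    ys = {1: 0.5, 2: 0.5} if x == 3 else {x: 1.0}
    for y, py in ys.items():
        joint[(x, y, z)] = px * py

def H(coords):
    """Entropy (bits) of the marginal over the given coordinate indices."""
    m = {}
    for xyz, p in joint.items():
        key = tuple(xyz[i] for i in coords)
        m[key] = m.get(key, 0.0) + p
    return -sum(p * log2(p) for p in m.values())

X, Y, Z = 0, 1, 2
print(H([X]))                                         # H(X)     = 1.5
print(H([Y]), H([Z]))                                 # H(Y) = H(Z) = 1.0
print(H([Y, Z]))                                      # H(Y,Z)   = 2.0
print(H([X]) + H([Y]) - H([X, Y]))                    # I(X;Y)   = 0.5
print(H([X]) + H([Z]) - H([X, Z]))                    # I(X;Z)   = 1.0
print(H([X, Z]) + H([Y, Z]) - H([Z]) - H([X, Y, Z]))  # I(X;Y|Z) = 0.5
print(H([X]) + H([Y, Z]) - H([X, Y, Z]))              # I(X;Y,Z) = 1.5
```

The mutual informations are computed from entropies alone, e.g. I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z), so a single marginal-entropy routine covers every case.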
b) We have from the law of large numbers that P[A(n,δ)] >= 1 - δ for all sufficiently large n. Using the upper bound in (1), we have for all sufficiently large n,

1 - δ <= P[A(n,δ)] = Σ_{(x_1,...,x_n) in A(n,δ)} P(x_1, ..., x_n) <= |A(n,δ)| 2^{-nH(X)+nδ}

and hence

|A(n,δ)| >= (1 - δ) 2^{nH(X)-nδ}

c) Denoting the length of the codeword for a in A as l_a, we have

L log D - H(X) = Σ_{a in A} P(a) l_a log D + Σ_{a in A} P(a) log P(a)
              = -Σ_{a in A} P(a) log (D^{-l_a} / P(a))
              >= -(log e) Σ_{a in A} P(a) (D^{-l_a} / P(a) - 1)
              = (log e) (1 - Σ_{a in A} D^{-l_a})
              >= 0

where the first inequality comes from ln x <= x - 1 and the last inequality comes from Kraft's inequality.

Problem 4 [30 pts.] Consider n discrete memoryless channels with capacities C_1, C_2, ..., C_n, respectively. Both the input and the output alphabet sets of different channels are disjoint. We define the sum channel of these n channels as a channel that has all n channels available for use, but only one channel may be used at any given time.
(a) Prove that the capacity of the sum channel is given by

C = log Σ_{i=1}^n 2^{C_i}

(b) Use the above result to find the capacity of the following channel:

[Figure: transition diagram P(y|x). Inputs x = 1, 2 form a binary symmetric channel: P(y=1|x=1) = P(y=2|x=2) = 1 - ε and P(y=2|x=1) = P(y=1|x=2) = ε. Input x = 3 produces output y = 3 with probability 1.]
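Returning to Problem 3: for a Bernoulli(p) source the bounds of parts (a) and (b) can be checked exactly, because the probability of a length-n binary sequence depends only on its number of ones. The sketch below enumerates over that count; the parameter choices (p = 0.3, n = 200, δ = 0.05) are illustrative.

```python
from math import log2, comb

def typical_set(p, n, delta):
    """Size and probability of the δ-typical set for an i.i.d. Bernoulli(p) source."""
    H = -p * log2(p) - (1 - p) * log2(1 - p)  # entropy in bits
    size, prob = 0, 0.0
    for k in range(n + 1):  # k = number of ones in the sequence
        logp = k * log2(p) + (n - k) * log2(1 - p)  # log2 P(x_1,...,x_n)
        if abs(-logp / n - H) <= delta:             # sequence is δ-typical
            size += comb(n, k)
            prob += comb(n, k) * 2 ** logp
    return H, size, prob

p, n, delta = 0.3, 200, 0.05
H, size, prob = typical_set(p, n, delta)
assert size <= 2 ** (n * (H + delta))                  # part (a) upper bound
assert size >= (1 - delta) * 2 ** (n * (H - delta))    # part (b) lower bound
# prob = P[A(n,δ)] -> 1 as n grows, by the law of large numbers
```

Both bounds hold with large slack here: the typical set contains roughly 2^{nH(X)} of the 2^n sequences, exponentially fewer than all sequences yet carrying most of the probability.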
Solution:
(a) Denote the input and output of the i-th channel as X_i and Y_i. Denote the input and output alphabets of the i-th channel as A_i and B_i, respectively. Denote the input and output of the sum channel as X and Y. Suppose the i-th channel is used with probability p_i, and the input distribution for the i-th channel is P_{X_i}. The input distribution of the sum channel is:

P_X(x) = p_i P_{X_i}(x),  x in A_i, i = 1, ..., n,
         0,               otherwise.

The output distribution is therefore

P_Y(y) = p_i Σ_{x in A_i} P_{X_i}(x) P_{Y|X}(y|x),  y in B_i, i = 1, ..., n,
         0,                                         otherwise.

Since P_{Y_i}(y) = Σ_{x in A_i} P_{X_i}(x) P_{Y_i|X_i}(y|x), we have

H(Y) = -Σ_y P_Y(y) log P_Y(y)
     = -Σ_i Σ_{y in B_i} p_i P_{Y_i}(y) log (p_i P_{Y_i}(y))
     = -Σ_i p_i log p_i + Σ_i p_i H(Y_i).

The conditional entropy is

H(Y|X) = Σ_x P_X(x) H(Y|X=x) = Σ_i p_i Σ_{x in A_i} P_{X_i}(x) H(Y|X=x) = Σ_i p_i H(Y_i|X_i).

The mutual information can then be obtained as

I(X;Y) = H(Y) - H(Y|X) = -Σ_i p_i log p_i + Σ_i p_i I(X_i;Y_i).

To maximize the mutual information, we need to choose P_{X_i} such that the capacity of the i-th channel is achieved, leading to

C = sup_{P_X} I(X;Y) = sup_{(p_1,...,p_n)} [ Σ_i p_i log(1/p_i) + Σ_i p_i C_i ]

The Lagrangian of this optimization problem is

L(p_1, ..., p_n, λ) = -Σ_i p_i log p_i + Σ_i p_i C_i + λ(1 - Σ_i p_i)
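The key identity derived above, I(X;Y) = -Σ_i p_i log p_i + Σ_i p_i I(X_i;Y_i), can be confirmed numerically for a concrete sum channel. In the sketch below the two component channels, their input distributions, and the time-sharing probabilities are arbitrary illustrative choices; the sum channel is simply the block-diagonal transition matrix over the union alphabet.

```python
from math import log2

def mutual_info(p_in, W):
    """I(X;Y) in bits for input pmf p_in and channel matrix W[x][y]."""
    p_out = [sum(p_in[x] * W[x][y] for x in range(len(p_in)))
             for y in range(len(W[0]))]
    I = 0.0
    for x in range(len(p_in)):
        for y in range(len(W[0])):
            if p_in[x] * W[x][y] > 0:
                I += p_in[x] * W[x][y] * log2(W[x][y] / p_out[y])
    return I

# Two component channels with disjoint alphabets (illustrative choices):
W1 = [[0.9, 0.1], [0.2, 0.8]]   # asymmetric binary channel
W2 = [[1.0]]                    # single-symbol channel, capacity 0
P1, P2 = [0.5, 0.5], [1.0]      # input pmfs for each component channel
p = [0.7, 0.3]                  # time-sharing probabilities p_i

# Sum channel: block-diagonal transition matrix, mixed input distribution.
W = [[0.9, 0.1, 0.0],
     [0.2, 0.8, 0.0],
     [0.0, 0.0, 1.0]]
P = [p[0] * P1[0], p[0] * P1[1], p[1] * P2[0]]

lhs = mutual_info(P, W)
rhs = (-sum(q * log2(q) for q in p)
       + p[0] * mutual_info(P1, W1) + p[1] * mutual_info(P2, W2))
assert abs(lhs - rhs) < 1e-9   # decomposition holds
```

The extra term -Σ_i p_i log p_i appears because the output alphabets are disjoint, so observing Y also reveals which channel was used.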
Solving the following equations:

dL/dp_i = -log p_i - log e + C_i + λ = 0,  i = 1, ..., n,
dL/dλ = 1 - Σ_i p_i = 0,

we obtain the optimal values of (p_1, ..., p_n) as

p_i = 2^{C_i} / Σ_{j=1}^n 2^{C_j},  i = 1, ..., n.

The channel capacity is thus

C = log Σ_{i=1}^n 2^{C_i}

(b) This channel can be decomposed into the sum of a binary symmetric channel with capacity 1 - h(ε) and a channel with zero capacity (the noiseless single-symbol channel x = 3 -> y = 3 has capacity log 1 = 0). The capacity of the sum channel is thus

C = log (2^{1-h(ε)} + 1)
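The closed-form result of Problem 4 is easy to evaluate directly. The sketch below computes C = log Σ_i 2^{C_i} together with the optimal time-sharing probabilities p_i = 2^{C_i}/Σ_j 2^{C_j}, and applies it to part (b); the value ε = 0.1 is an illustrative choice.

```python
from math import log2

def h(eps):
    """Binary entropy function in bits."""
    return 0.0 if eps in (0.0, 1.0) else -eps * log2(eps) - (1 - eps) * log2(1 - eps)

def sum_capacity(caps):
    """Capacity of the sum channel and the optimal time-sharing probabilities."""
    S = sum(2 ** c for c in caps)
    return log2(S), [2 ** c / S for c in caps]

# Part (b): BSC(eps) with capacity 1 - h(eps), plus a zero-capacity channel.
eps = 0.1
C, p_opt = sum_capacity([1 - h(eps), 0.0])
print(round(C, 2))  # log2(2^{1-h(0.1)} + 1) ~= 1.29
```

Note that C exceeds each individual C_i: the sum channel conveys extra information through the choice of which channel is used, which is exactly the -Σ_i p_i log p_i term in the derivation.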