ÉCOLE POLYTECHNIQUE FÉDÉRALE DE LAUSANNE
School of Computer and Communication Sciences

Principles of Digital Communications                            Handout 0
Solutions to Problem Set 4                                      Mar. 6, 08

Solution 1. If H = 0, we have Y_2 = Z_1 ⊕ Z_2 = Y_1 ⊕ Z_2, and if H = 1, we have Y_2 = 1 ⊕ Z_1 ⊕ Z_2 = Y_1 ⊕ Z_2. Therefore, Y_2 = Y_1 ⊕ Z_2 in all cases. Now since Z_2 is independent of H, we clearly have H → Y_1 → (Y_1, Y_1 ⊕ Z_2). Hence, Y_1 is a sufficient statistic.

Solution 2. (a) The MAP decoder Ĥ(y) is given by

    Ĥ(y) = arg max_i P_{Y|H}(y|i) = { 0  if y = 0 or y = 1
                                      1  if y = 2 or y = 3.

T(Y) takes two values, with the conditional probabilities

    P_{T|H}(t|0) = { 0.7  if t = 0        P_{T|H}(t|1) = { 0.3  if t = 0
                     0.3  if t = 1,                        0.7  if t = 1.

Therefore, the MAP decoder Ĥ(T(y)) is

    Ĥ(T(y)) = arg max_i P_{T(Y)|H}(t|i) = { 0  if t = 0 (y = 0 or y = 1)
                                            1  if t = 1 (y = 2 or y = 3).

Hence, the two decoders are equivalent.

(b) We have

    Pr{Y = 0 | T(Y) = 0, H = 0} = Pr{Y = 0, T(Y) = 0 | H = 0} / Pr{T(Y) = 0 | H = 0} = 0.4/0.7 = 4/7

and

    Pr{Y = 0 | T(Y) = 0, H = 1} = Pr{Y = 0, T(Y) = 0 | H = 1} / Pr{T(Y) = 0 | H = 1} = 0.1/0.3 = 1/3.

Thus Pr{Y = 0 | T(Y) = 0, H = 0} ≠ Pr{Y = 0 | T(Y) = 0, H = 1}, hence H → T(Y) → Y does not hold, although the two MAP decoders are equivalent.

Solution 3. (a) The MAP decision rule can always be written as

    Ĥ(y) = arg max_i f_{Y|H}(y|i) P_H(i) = arg max_i g_i(T(y)) h(y) P_H(i) = arg max_i g_i(T(y)) P_H(i).

The last step is valid because h(y) is a non-negative constant that is independent of i and thus gives no further information for our decision.
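The observation in part (a), that h(y) cancels and so the MAP decision depends on y only through T(y), can be checked numerically. Below is a minimal Python sketch under an assumed illustrative model: two hypotheses with i.i.d. Bernoulli(p_i) observations (the setting of part (c) of this problem), a uniform prior, and arbitrarily chosen parameters p_0, p_1.

```python
from itertools import product

# Assumed illustrative model: under H = i, Y_1..Y_n are i.i.d. Bernoulli(p_i).
# The likelihood factorizes as g_i(t) * h(y) with t = sum(y) and h(y) = 1,
# so the MAP decision can depend on y only through t.
p = {0: 0.3, 1: 0.6}          # arbitrary parameters, uniform prior on H
n = 4

def likelihood(y, i):
    """Full likelihood P_{Y_1..Y_n|H}(y|i)."""
    t = sum(y)
    return p[i]**t * (1 - p[i])**(n - t)

def g(t, i):
    """g_i(t) from the factorization (here h(y) = 1)."""
    return p[i]**t * (1 - p[i])**(n - t)

for y in product((0, 1), repeat=n):
    map_full = max((0, 1), key=lambda i: likelihood(y, i))
    map_stat = max((0, 1), key=lambda i: g(sum(y), i))
    assert map_full == map_stat   # same decision for every observation y
```

Since h(y) = 1 here the check is immediate; replacing h by any positive function of y alone would scale both hypotheses' scores equally and leave every arg max unchanged, which is exactly the argument of part (a).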
(b) Let us define the event B_t = {y : T(y) = t}. Then,

    f_{Y|H,T(Y)}(y|i,t) = f_{Y,T(Y)|H}(y,t|i) P_H(i) / ( f_{T(Y)|H}(t|i) P_H(i) )
                        = Pr{Y = y, Y ∈ B_t | H = i} / Pr{Y ∈ B_t | H = i}
                        = f_{Y|H}(y|i) 1_{B_t}(y) / ∫_{B_t} f_{Y|H}(y'|i) dy'.

If f_{Y|H}(y|i) = g_i(T(y)) h(y), then

    f_{Y|H,T(Y)}(y|i,t) = g_i(T(y)) h(y) 1_{B_t}(y) / ∫_{B_t} g_i(T(y')) h(y') dy'
                        = g_i(t) h(y) 1_{B_t}(y) / ( g_i(t) ∫_{B_t} h(y') dy' )
                        = h(y) 1_{B_t}(y) / ∫_{B_t} h(y') dy'.

Hence, we see that f_{Y|H,T(Y)}(y|i,t) does not depend on i, so H → T(Y) → Y.

(c) Note that P_{Y_k|H}(1|i) = p_i, P_{Y_k|H}(0|i) = 1 − p_i, and

    P_{Y_1,...,Y_n|H}(y_1,...,y_n|i) = P_{Y_1|H}(y_1|i) ··· P_{Y_n|H}(y_n|i).

Thus, we have

    P_{Y_1,...,Y_n|H}(y_1,...,y_n|i) = p_i^t (1 − p_i)^{n−t},

where t = Σ_k y_k. Choosing g_i(t) = p_i^t (1 − p_i)^{n−t} and h(y) = 1, we see that P_{Y_1,...,Y_n|H}(y_1,...,y_n|i) fulfills the condition in the question.

(d) Because Y_1, ..., Y_n are independent,

    f_{Y_1,...,Y_n|H}(y_1,...,y_n|i) = ∏_{k=1}^n (1/√(2π)) e^{−(y_k − m_i)²/2}
                                     = (2π)^{−n/2} e^{−(1/2) Σ_k y_k²} e^{n m_i ( (1/n) Σ_k y_k − m_i/2 )}.

Choosing g_i(t) = e^{n m_i (t − m_i/2)} and h(y_1,...,y_n) = (2π)^{−n/2} e^{−(1/2) Σ_k y_k²}, we see that

    f_{Y_1,...,Y_n|H}(y_1,...,y_n|i) = g_i(T(y_1,...,y_n)) h(y_1,...,y_n)

with T(y_1,...,y_n) = (1/n) Σ_k y_k. Hence the condition in the question is fulfilled.

Solution 4.
(a) With the observation Y being Y_1,

    f_{Y_1|X}(y_1|+1) = (1/√(2π)) e^{−(y_1−1)²/2}   and   f_{Y_1|X}(y_1|−1) = (1/√(2π)) e^{−(y_1+1)²/2}.

Thus the MAP rule is: decide X̂ = +1 if

    (1/√(2π)) e^{−(y_1−1)²/2} p  ≥  (1/√(2π)) e^{−(y_1+1)²/2} (1 − p),

and X̂ = −1 otherwise, which can be further simplified to obtain: decide X̂ = +1 if

    y_1 ≥ (1/2) ln((1 − p)/p).

(b) Observe that

    f_{Y_1 Y_2|X}(y_1, y_2|+1) = (1/2) 1{y_2 ∈ [0, 2]} (1/√(2π)) e^{−(y_1−1)²/2}
    f_{Y_1 Y_2|X}(y_1, y_2|−1) = (1/4) 1{y_2 ∈ [−3, 1]} (1/√(2π)) e^{−(y_1+1)²/2}.

With u = y_2 and

    g_{+1}(u, y_1) = 1{u ≥ 0} (1/(2√(2π))) e^{−(y_1−1)²/2}
    g_{−1}(u, y_1) = 1{u ≤ 1} (1/(4√(2π))) e^{−(y_1+1)²/2}
    h(y_1, y_2) = 1{−3 ≤ y_2 ≤ 2},

we find f_{Y_1 Y_2|X}(y_1, y_2|x) = g_x(u, y_1) h(y_1, y_2), and the Fisher-Neyman theorem lets us conclude that t = (u, y_1) is a sufficient statistic.

(c) The MAP rule minimizes the error probability and is given by the likelihood ratio test: decide X̂ = +1 if

    Λ(y_1, y_2) = log [ f_{Y_1 Y_2|X}(y_1, y_2|+1) / f_{Y_1 Y_2|X}(y_1, y_2|−1) ]  ≥  log((1 − p)/p).

Note that

    Λ(y_1, y_2) = { 2y_1 + log 2   if 0 ≤ y_2 ≤ 1
                    +∞             if 1 < y_2 ≤ 2
                    −∞             if −3 ≤ y_2 < 0.

So the decision regions look as follows (with θ = (1/2) log((1 − p)/(2p))):
[Figure: decision regions in the (y_1, y_2) plane. For y_2 ∈ (1, 2] decide +1; for y_2 ∈ [−3, 0) decide −1; for y_2 ∈ [0, 1] decide +1 if y_1 ≥ θ and −1 if y_1 < θ.]

(d) When −1 is sent, an error will happen either when y_2 > 1 or when 0 ≤ y_2 ≤ 1 and y_1 ≥ θ. The first of these cannot happen (given X = −1, Y_2 ∈ [−3, 1]), and the second happens with probability (1/4) Q(1 + θ).

When +1 is sent, an error will happen either when y_2 < 0 or when 0 ≤ y_2 ≤ 1 and y_1 ≤ θ. The first of these cannot happen, and the second happens with probability (1/2) Q(1 − θ).

So the error probability is given by

    P_e = (1 − p) (1/4) Q(1 + θ) + p (1/2) Q(1 − θ),

with θ = (1/2) log((1 − p)/(2p)).

Solution 5. (a) Inequality (a) follows from the Bhattacharyya bound. Using the definition of a DMC, it is straightforward to see that

    P_{Y|X}(y|c_0) = ∏_{i=1}^n P_{Y|X}(y_i|c_{0,i})   and   P_{Y|X}(y|c_1) = ∏_{i=1}^n P_{Y|X}(y_i|c_{1,i}),

and (b) follows by substituting these values in (a). Equality (c) is obtained by observing that Σ_y is the same as Σ_{y_1} ··· Σ_{y_n} (the first being a vector notation for the sum over all possible y_1, ..., y_n). In (c), we see that we want the sum of all possible products. This is the same as summing over each y_i and taking the product of the resulting sums. This results in equality (d). We obtain (e) by writing (d) in a more concise form. When c_{0,i} = c_{1,i},

    √( P_{Y|X}(y|c_{0,i}) P_{Y|X}(y|c_{1,i}) ) = P_{Y|X}(y|c_{0,i}).

Therefore,

    Σ_y √( P_{Y|X}(y|c_{0,i}) P_{Y|X}(y|c_{1,i}) ) = Σ_y P_{Y|X}(y|c_{0,i}) = 1.
This does not affect the product, so we are only interested in the terms where c_{0,i} ≠ c_{1,i}. We form the product of all such sums. We then collect the terms where c_{0,i} = a and c_{1,i} = b, a ≠ b, and raise the corresponding sum to the appropriate power. (E.g., if we have the product prpqrpqrr, we would write it as p³q²r⁴.) Hence equality (f).

(b) For a binary input channel, we have only two input symbols, X = {a, b}. Thus,

    P_e ≤ z^{n(a,b)} z^{n(b,a)} = z^{n(a,b)+n(b,a)} = z^{d_H(c_0, c_1)}.

(c) The value of z is:

(i) For the binary input Gaussian channel,

    z = ∫ √( f_{Y|X}(y|0) f_{Y|X}(y|1) ) dy = exp(−E/(2σ²)),

where σ² is the noise variance.

(ii) For the binary symmetric channel (BSC),

    z = √( Pr{y = 0|x = 0} Pr{y = 0|x = 1} ) + √( Pr{y = 1|x = 0} Pr{y = 1|x = 1} ) = 2√(δ(1 − δ)).

(iii) For the binary erasure channel (BEC),

    z = √( Pr{y = 0|x = 0} Pr{y = 0|x = 1} ) + √( Pr{y = E|x = 0} Pr{y = E|x = 1} ) + √( Pr{y = 1|x = 0} Pr{y = 1|x = 1} )
      = 0 + δ + 0 = δ.

Solution 6. By symmetry:

    P_00 = Pr{(N_1 ≤ a) ∧ (N_2 ≤ a)} = Pr{N_1 ≤ a} Pr{N_2 ≤ a} = [1 − Q(a/σ)]².

    P_01 = P_03 = Pr{(N_1 ≥ b − a) ∧ (N_2 ≤ a)} = Pr{N_1 ≥ b − a} Pr{N_2 ≤ a} = Q((b − a)/σ) [1 − Q(a/σ)].

    P_02 = Pr{(N_1 ≥ b − a) ∧ (N_2 ≥ b − a)} = Pr{N_1 ≥ b − a} Pr{N_2 ≥ b − a} = [Q((b − a)/σ)]².
Equivalently,

    P_0δ = Pr{(Y ∉ R_0) ∧ (Y ∉ R_1) ∧ (Y ∉ R_2) ∧ (Y ∉ R_3) | c_0 was sent}
         = 1 − P_00 − P_01 − P_02 − P_03
         = 1 − [1 − Q(a/σ)]² − 2 Q((b − a)/σ) [1 − Q(a/σ)] − [Q((b − a)/σ)]²
         = 1 − [1 − Q(a/σ) + Q((b − a)/σ)]².

Alternatively,

    P_0δ = Pr{(N_1 ∈ [a, b − a]) ∨ (N_2 ∈ [a, b − a])}
         = Pr{N_1 ∈ [a, b − a]} + Pr{N_2 ∈ [a, b − a]} − Pr{(N_1 ∈ [a, b − a]) ∧ (N_2 ∈ [a, b − a])}
         = 2 [Q(a/σ) − Q((b − a)/σ)] − [Q(a/σ) − Q((b − a)/σ)]²,

which gives the same result as before.
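As a quick numerical sanity check, the two expressions for P_0δ can be compared directly. The Python sketch below uses assumed illustrative values for a, b, and the noise standard deviation σ (with b > 2a so that the interval [a, b − a] is nonempty).

```python
import math

def Q(x):
    """Gaussian tail function: Q(x) = Pr{N(0,1) > x}."""
    return 0.5 * math.erfc(x / math.sqrt(2))

a, b, sigma = 1.0, 3.0, 1.0      # assumed illustrative geometry (b > 2a)
qa, qb = Q(a / sigma), Q((b - a) / sigma)

P00 = (1 - qa)**2
P01 = P03 = qb * (1 - qa)
P02 = qb**2

# First form: complement of the four decision probabilities.
P0d_1 = 1 - P00 - P01 - P02 - P03
# Second form: inclusion-exclusion on the two noise components.
x = qa - qb                      # Pr{N in [a, b-a]} for one component
P0d_2 = 2 * x - x**2

assert math.isclose(P0d_1, 1 - (1 - qa + qb)**2)
assert math.isclose(P0d_1, P0d_2)
```

The agreement reflects the identity 1 − (1 − x)² = 2x − x² with x = Q(a/σ) − Q((b − a)/σ).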