Chapter 5 Solutions

Problem 5.1
Since X1 X2 X3 takes on the values 000, 011, 101, 110 with equal probability, it follows that
P_Xi(0) = P_Xi(1) = 1/2, i = 1, 2, 3.
Furthermore,
P_X1X2(00) = P_X1X2(01) = P_X1X2(10) = P_X1X2(11) = 1/4.

(a) H(X1) = -P_X1(0) log P_X1(0) - P_X1(1) log P_X1(1)
          = -(1/2) log(1/2) - (1/2) log(1/2)
          = -log(1/2) = log 2 = 1 bit

(b) H(X1X2) = -P_X1X2(00) log P_X1X2(00) - P_X1X2(01) log P_X1X2(01)
              - P_X1X2(10) log P_X1X2(10) - P_X1X2(11) log P_X1X2(11)
            = -(1/4) log(1/4) - (1/4) log(1/4) - (1/4) log(1/4) - (1/4) log(1/4)
            = -log(1/4) = log 4 = 2 bits

(c) H(X2|X1) = -P_X1X2(00) log P_X2|X1(0|0) - P_X1X2(01) log P_X2|X1(1|0)
               - P_X1X2(10) log P_X2|X1(0|1) - P_X1X2(11) log P_X2|X1(1|1)
But
P_X2|X1(0|0) = P_X2|X1(1|0) = P_X2|X1(0|1) = P_X2|X1(1|1) = 1/2.
Hence, we have
H(X2|X1) = -(1/4) log(1/2) - (1/4) log(1/2) - (1/4) log(1/2) - (1/4) log(1/2)
         = -log(1/2) = log 2 = 1 bit
Alternatively, we use the formula
H(X1X2) = H(X1) + H(X2|X1)
or, equivalently,
H(X2|X1) = H(X1X2) - H(X1) = 2 - 1 = 1 bit

(d) H(X1X2X3) = -sum over all x1x2x3 of P_X1X2X3(x1x2x3) log P_X1X2X3(x1x2x3)
Insert
P_X1X2X3(000) = P_X1X2X3(011) = P_X1X2X3(101) = P_X1X2X3(110) = 1/4
and
P_X1X2X3(001) = P_X1X2X3(010) = P_X1X2X3(100) = P_X1X2X3(111) = 0.
Hence,
H(X1X2X3) = -(1/4) log(1/4) - (1/4) log(1/4) - (1/4) log(1/4) - (1/4) log(1/4)
          = -log(1/4) = log 4 = 2 bits
(the four zero-probability outcomes contribute nothing).
Alternatively, we have 4 possible outcomes and they are equiprobable; that is,
H(X1X2X3) = log L, where L = 4. Hence, H(X1X2X3) = log 4 = 2 bits.
Again alternatively,
H(X1X2X3) = H(X1X2) + H(X3|X1X2)
What is H(X3|X1X2)? That is, what is the uncertainty about X3 when we know X1X2? For all four outcomes 000, 011, 101, 110 we see that if we know the first two binary digits, then we also know the third one; in other words, the uncertainty about X3 is 0 if we know X1X2! That is,
H(X3|X1X2) = 0
and, hence,
H(X1X2X3) = H(X1X2) = 2 bits

(e) See (d), the last alternative!

(f) I(X1; X3) = H(X1) - H(X1|X3)
What is H(X1|X3), that is, the uncertainty about X1 when we know X3? If X3 = 0, then we have two possibilities for X1, namely 0 and 1, and they are equiprobable! The same thing holds for X3 = 1. Hence, we conclude that
H(X1|X3) = 1 bit
Thus, we have
I(X1; X3) = H(X1) - H(X1|X3) = 1 - 1 = 0 bits
We get no information about X1 by observing X3! Regardless of whether X3 is 0 or 1, it is still 50-50 for X1 to be 0 or 1.

(g) I(X1X2; X3) = H(X3) - H(X3|X1X2) = 1 - 0 = 1 bit
If we know nothing about X1X2, then our uncertainty about X3 is 1 bit; 50-50 to be 0 or 1! But if we know X1X2, then we also know X3! Hence, our uncertainty about X3 is then 0. Thus, by observing X1X2 we get 1 bit of information about X3.
Alternatively, if we do not know X3, then our uncertainty about X1X2 is 2 bits (cf. (b)). But if we know X3, regardless of whether it is 0 or 1, we have only
two possibilities for X1X2; that is, our uncertainty about X1X2 is 1 bit. By observing X3, the uncertainty about X1X2 is reduced from 2 bits to 1 bit; we get 1 bit of information about X1X2 by observing X3!

Problem 5.2
Let X denote the coin, that is, P_X(Fair) = P_X(Counterfeit) = 1/2. Let Y be the number of Heads. Consider the following scheme:

[Tree diagram: X = Fair (probability 1/2) leads to Y = 0, Y = 1, Y = 2 with conditional probabilities 1/4, 1/2, 1/4; X = Counterfeit (probability 1/2) leads to Y = 2 with conditional probability 1; altogether P_Y(0) = 1/8, P_Y(1) = 1/4, P_Y(2) = 5/8. Figure not reproduced here.]

If X = Counterfeit, then we know for sure that we will get Y = 2 Heads when we flip the coin twice. Hence, we denote the branch between X = Counterfeit and Y = 2 by the conditional probability P_Y|X(2 | Counterfeit) = 1.
If X = Fair, then we get the combinations Tail-Tail, Tail-Head, Head-Tail, Head-Head with equal probability when we flip the fair coin twice. That is, Y = 0 (Tail-Tail) occurs with probability 1/4, Y = 1 (Tail-Head or Head-Tail) occurs with probability 1/2 (half of the four possibilities), and Y = 2 (Head-Head) occurs with probability 1/4.
Next we add the two probabilities for Y = 2: (1/2)(1/4) + (1/2)(1) = 5/8, as shown in the scheme above.
Now we are well prepared to compute I(X; Y) = H(Y) - H(Y|X).

H(Y) = -(1/8) log(1/8) - (1/4) log(1/4) - (5/8) log(5/8)
     = (1/8) log 8 + (1/4) log 4 - (5/8) log 5 + (5/8) log 8
     = 3/8 + 2/4 + 15/8 - (5/8) log 5
     = 11/4 - (5/8) log 5

H(Y|X) = -P_XY(Fair, 0) log P_Y|X(0 | Fair) - P_XY(Fair, 1) log P_Y|X(1 | Fair)
         - P_XY(Fair, 2) log P_Y|X(2 | Fair)
         - P_XY(Counterfeit, 0) log P_Y|X(0 | Counterfeit)
         - P_XY(Counterfeit, 1) log P_Y|X(1 | Counterfeit)
         - P_XY(Counterfeit, 2) log P_Y|X(2 | Counterfeit)

Now we need P_XY:

           X = Fair              X = Counterfeit
  Y = 0    (1/2)(1/4) = 1/8      0
  Y = 1    (1/2)(1/2) = 1/4      0
  Y = 2    (1/2)(1/4) = 1/8      (1/2)(1) = 1/2

Hence, we have (the zero-probability terms contribute nothing)
H(Y|X) = (1/8) log 4 + (1/4) log 2 + (1/8) log 4 + (1/2) log 1 = 2/8 + 1/4 + 2/8 = 3/4
Thus,
I(X; Y) = H(Y) - H(Y|X) = 11/4 - (5/8) log 5 - 3/4 = 2 - (5/8) log 5 ≈ 0.549 bits
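As a quick numerical check of Problems 5.1 and 5.2, the following sketch (not part of the original solution; it assumes the joint distributions as reconstructed above, with X1 X2 X3 uniform over the even-parity triples) recomputes the entropies and mutual informations:

from math import log2
from collections import defaultdict

def H(dist):
    # Entropy in bits of a distribution given as {outcome: probability}.
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, idx):
    # Marginal distribution of the coordinates listed in idx.
    out = defaultdict(float)
    for x, p in joint.items():
        out[tuple(x[i] for i in idx)] += p
    return dict(out)

# Problem 5.1: (X1, X2, X3) uniform over {000, 011, 101, 110}.
joint = {(0, 0, 0): 0.25, (0, 1, 1): 0.25, (1, 0, 1): 0.25, (1, 1, 0): 0.25}
H1 = H(marginal(joint, [0]))
H12 = H(marginal(joint, [0, 1]))
H3 = H(marginal(joint, [2]))
H123 = H(joint)
print(H1, H12, H12 - H1, H123, H123 - H12)        # (a)-(e): 1.0 2.0 1.0 2.0 0.0
print(H1 - (H(marginal(joint, [0, 2])) - H3))     # (f) I(X1;X3) = 0.0
print(H3 - (H123 - H12))                          # (g) I(X1X2;X3) = 1.0

# Problem 5.2: I(X;Y) = H(Y) - H(Y|X) with P_XY from the table above.
PXY = {("Fair", 0): 1/8, ("Fair", 1): 1/4, ("Fair", 2): 1/8, ("Counterfeit", 2): 1/2}
HY = H(marginal(PXY, [1]))
HY_given_X = H(PXY) - H(marginal(PXY, [0]))       # H(Y|X) = H(XY) - H(X) = 3/4
print(HY - HY_given_X)                            # about 0.549 bits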
Problem 5.3
Due to the symmetry of the BEC, the maximizing input distribution is P_X(0) = P_X(1) = 1/2. Hence, we have
P_Y(0) = (1 - δ)/2,   P_Y(Δ) = δ,   P_Y(1) = (1 - δ)/2
where Δ denotes the erasure symbol.
H(Y) = -((1 - δ)/2) log((1 - δ)/2) - δ log δ - ((1 - δ)/2) log((1 - δ)/2)
     = 1 - δ - (1 - δ) log(1 - δ) - δ log δ
     = 1 - δ + h(δ)
and
H(Y|X) = h(δ)
Then we have
C_BEC = H(Y) - H(Y|X) = 1 - δ + h(δ) - h(δ) = 1 - δ
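A small numerical sanity check of C_BEC = 1 - δ (a sketch, not part of the original solution), computing H(Y) and H(Y|X) = h(δ) directly for the uniform input distribution:

from math import log2

def h(p):
    # Binary entropy function in bits.
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bec_capacity(delta):
    # H(Y) - H(Y|X) for the BEC with erasure probability delta and P_X(0) = P_X(1) = 1/2.
    p_y = [(1 - delta) / 2, delta, (1 - delta) / 2]   # P_Y(0), P_Y(erasure), P_Y(1)
    H_Y = -sum(p * log2(p) for p in p_y if p > 0)
    return H_Y - h(delta)

for d in (0.0, 0.1, 0.25, 0.5):
    print(d, round(bec_capacity(d), 6), 1 - d)        # the last two columns agree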
Problem 5.4
[Huffman code tree and codeword table for u1, ..., u6 with probabilities 0.27, 0.23, 0.2, 0.15, 0.1, 0.05; the merged nodes have probabilities 0.15, 0.3, 0.43, 0.57, and 1.0. Figure not reproduced here.]
The average codeword length is obtained by summing the node probabilities of the code tree:
W = 1.0 + 0.57 + 0.43 + 0.3 + 0.15 = 2.45

Problem 5.5
[Huffman code tree and codeword table for u1, ..., u6 with probabilities 0.2, 0.2, 0.2, 0.15, 0.15, 0.1; the merged nodes have probabilities 0.25, 0.35, 0.4, 0.6, and 1.0. Figure not reproduced here.]
W = 1.0 + 0.6 + 0.4 + 0.35 + 0.25 = 2.6

Problem 5.6
[Huffman code tree and codeword table for u1, ..., u7 with probabilities 0.3, 0.2, 0.1, 0.1, 0.1, 0.1, 0.1; the merged nodes have probabilities 0.2, 0.2, 0.3, 0.4, 0.6, and 1.0. Figure not reproduced here.]
W = 1.0 + 0.6 + 0.4 + 0.3 + 0.2 + 0.2 = 2.7
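The three values of W can also be reproduced by building the Huffman codes directly. The sketch below (not part of the original solution) uses the path-length lemma exploited above: the average codeword length equals the sum of the probabilities of all merged nodes of the code tree. The leaf probabilities are the ones listed for the trees above.

import heapq

def huffman_average_length(probs):
    # Average codeword length of a Huffman code for the given leaf probabilities,
    # computed as the sum of the probabilities of all merged (internal) nodes.
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        total += a + b                    # probability of the new internal node
        heapq.heappush(heap, a + b)
    return total

print(round(huffman_average_length([0.27, 0.23, 0.2, 0.15, 0.1, 0.05]), 2))     # 2.45 (Problem 5.4)
print(round(huffman_average_length([0.2, 0.2, 0.2, 0.15, 0.15, 0.1]), 2))       # 2.6  (Problem 5.5)
print(round(huffman_average_length([0.3, 0.2, 0.1, 0.1, 0.1, 0.1, 0.1]), 2))    # 2.7  (Problem 5.6)

Different tie-breaking choices when two node probabilities are equal can give different trees, but the average codeword length is the same.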
Problem 5.7
(a) THE FRIEND IN NEED IS A FRIEND INDEED

Step   Entry   # binary digits
  1    T       8
  2    H       8
  3    E       9
  4    _
  5    F
  6    R
  7    I
  8    EN      3
  9    N
 10    D       2
 11    _I      4
 12    IN      4
 13    N_      4
 14    _N      4
 15    NE      4
 16    EE      4
 17    ED      4
 18    D_      5
 19    _IS     5
 20    S       3
 21    _A      5
 22    A       3
 23    _F      5
 24    FR      5
 25    RI      5
 26    IE      5
 27    END     5
 28    D_I     5
 29    IND     5
 30    DE      5
 31    EED     5
 32    D       5

total bits = 22
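The parsing in this table, and in parts (b)-(f) below, can be reproduced with the following sketch (not part of the original solution). It assumes the dictionary-building variant used above: at every step the longest prefix of the remaining text that already is an entry is found, that prefix extended by the next character is added as a new entry, and only the matched prefix is consumed (a character that matches nothing becomes a one-character entry and is consumed itself). The per-entry bit counts depend on how the pointers and the innovation characters are encoded and are not reproduced by this sketch.

def lz_parse(text):
    # Dictionary entries created when parsing `text` with the variant described above.
    entries = []
    pos = 0
    while pos < len(text):
        match = ""
        for e in entries:                # longest existing entry that matches at pos
            if text.startswith(e, pos) and len(e) > len(match):
                match = e
        nxt = text[pos + len(match):pos + len(match) + 1]   # innovation character ('' at end of text)
        entries.append(match + nxt if nxt else match)
        pos += len(match) if match else 1
    return entries

for step, entry in enumerate(lz_parse("THE FRIEND IN NEED IS A FRIEND INDEED"), 1):
    print(step, entry.replace(" ", "_"))                    # reproduces the 32 entries above

Applying the same function to the sentences in (b)-(f) and in Problem 5.8 reproduces the corresponding entry lists.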
(b) THE CAT IN THE CAR ATE THE RAT

Step   Entry   # binary digits
  1    T       8
  2    H       8
  3    E       9
  4    _
  5    C
  6    A
  7    T_      3
  8    _I      3
  9    I
 10    N       2
 11    _T      4
 12    TH      4
 13    HE      4
 14    E_      4
 15    _C      4
 16    CA      4
 17    AR      4
 18    R       3
 19    _A      5
 20    AT      5
 21    TE      5
 22    E_T     5
 23    THE     5
 24    E_R     5
 25    RA      5
 26    AT      5

total bits = 66

(c) EARLY TO BED AND EARLY TO RISE MAKES A MAN WISE
Step   Entry   # binary digits
  1    E       8
  2    A       8
  3    R       9
  4    L
  5    Y
  6    _
  7    T
  8    O
  9    _B      3
 10    B       2
 11    ED      4
 12    D       2
 13    _A      4
 14    AN      4
 15    N       2
 16    D_      4
 17    _E      4
 18    EA      5
 19    AR      5
 20    RL      5
 21    LY      5
 22    Y_      5
 23    _T      5
 24    TO      5
 25    O_      5
 26    _R      5
 27    RI      5
 28    I       3
 29    S       3
 30    E_      5
 31    _M      5
 32    M       3
 33    AK      5
 34    K       4
 35    ES      6
 36    S_      6
 37    _A_     6
 38    _MA     6
 39    AN_     6
 40    _W      6
 41    W       4
 42    IS      6
 43    SE      6
 44    E       6

total bits = 323
(d) IF WE CANNOT DO AS WE WOULD WE WOULD DO AS WE CAN

Step   Entry   # binary digits
  1    I       8
  2    F       8
  3    _       9
  4    W
  5    E
  6    _C      3
  7    C
  8    A
  9    N
 10    NO      2
 11    O       2
 12    T       2
 13    _D      4
 14    D       2
 15    O_      4
 16    _A      4
 17    AS      4
 18    S       3
 19    _W      5
 20    WE      5
 21    E_      5
 22    _WO     5
 23    OU      5
 24    U       3
 25    L       3
 26    D_      5
 27    _WE     5
 28    E_W     5
 29    WO      5
 30    OUL     5
 31    LD      5
 32    D_D     5
 33    DO      5
 34    O_A     6
 35    AS_     6
 36    _WE_    6
 37    _CA     6
 38    AN      6
 39    N       6

total bits = 285
(e) BETTER LATE THAN NEVER BUT BETTER NEVER LATE

Step   Entry   # binary digits
  1    B       8
  2    E       8
  3    T       9
  4    TE
  5    ER      2
  6    R
  7    _
  8    L
  9    A
 10    TE_     4
 11    _T      4
 12    TH      4
 13    H       2
 14    AN      4
 15    N       2
 16    _N      4
 17    NE      4
 18    EV      5
 19    V       3
 20    ER_     5
 21    _B      5
 22    BU      5
 23    U       3
 24    T_      5
 25    _BE     5
 26    ET      5
 27    TT      5
 28    TER     5
 29    R_      5
 30    _NE     5
 31    EVE     5
 32    ER_L    5
 33    LA      5
 34    AT      6
 35    TE      6

total bits = 237
(f) WHO CHATTERS WITH YOU WILL CHATTER ABOUT YOU

Step   Entry   # binary digits
  1    W       8
  2    H       8
  3    O       9
  4    _
  5    C
  6    HA      3
  7    A
  8    T
  9    TE
 10    E       2
 11    R       2
 12    S       2
 13    _W      4
 14    WI      4
 15    I       2
 16    TH      4
 17    H_      4
 18    _Y      5
 19    Y       3
 20    OU      5
 21    U       3
 22    _WI     5
 23    IL      5
 24    L       3
 25    L_      5
 26    _C      5
 27    CH      5
 28    HAT     5
 29    TT      5
 30    TER     5
 31    R_      5
 32    _A      5
 33    AB      5
 34    B       4
 35    OUT     6
 36    T_      6
 37    _YO     6
 38    OU      6

total bits = 287
Problem 5.8
(a) IF YOU CANNOT BE WITH THE ONE YOU LOVE LOVE THE ONE YOU ARE WITH

Problem 5.9
B = { , , , }
(a) Yes, since the sum of any two codewords is a codeword.
(b) N = 6, K = log M = log 4 = 2, R = 2/6 = 1/3
(c) The given mapping u -> v is a linear encoder.
(d) The given mapping u -> v is a nonlinear encoder (cf. Ch. 2).
(e) d_min = 4
(f) (d_min - 1)/2 = 3/2 = 1.5, so the code can correct 1 error.
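The minimum-distance facts used in (e) and (f) are easy to check mechanically. The sketch below (not part of the original solution) uses a placeholder codebook with the same parameters (N = 6, four codewords, d_min = 4), since the actual codewords of B are not legible in this copy:

from itertools import combinations

def hamming(a, b):
    # Hamming distance between two equal-length binary tuples.
    return sum(x != y for x, y in zip(a, b))

# Placeholder linear code: N = 6, M = 4 codewords, R = 2/6 = 1/3, d_min = 4.
B = [(0,0,0,0,0,0), (1,1,0,1,1,0), (0,1,1,0,1,1), (1,0,1,1,0,1)]

d_min = min(hamming(a, b) for a, b in combinations(B, 2))
t = (d_min - 1) // 2              # guaranteed number of correctable errors
print(d_min, t)                   # 4 1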
Problem 5.10
(a) [Encoder diagram; not reproduced here.]
(b) [Figure; not reproduced here.]
(c) d_free = 3
(d) r = ...
[Viterbi decoding over the trellis; the figure with the accumulated Hamming metrics is not reproduced here.]
v = ...
û = (...)
(e) ê = ...
Three channel errors!
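The received sequence and the trellis figures of this problem are not legible in this copy, so the decoding in (d)-(e) cannot be repeated here, but the procedure itself is easy to sketch. The code below (not part of the original solution) assumes, purely for illustration, the memory-one rate-1/2 encoder G(D) = (1, 1 + D), which has d_free = 3; the encoder in the problem may be a different one.

from itertools import product

TAPS = ((1,), (1, 1))     # assumed example encoder G(D) = (1, 1 + D)

def encode(bits, taps=TAPS):
    # Feedforward convolutional encoder; taps[j][i] says whether the input bit i steps ago feeds output j.
    m = max(len(t) for t in taps) - 1
    mem = [0] * m
    out = []
    for b in bits:
        window = [b] + mem
        for t in taps:
            out.append(sum(window[i] for i, c in enumerate(t) if c) % 2)
        mem = window[:m]
    return out

def viterbi(received, taps=TAPS):
    # Hard-decision Viterbi decoding with the Hamming branch metric, starting in the all-zero state.
    m = max(len(t) for t in taps) - 1
    n = len(taps)
    states = list(product((0, 1), repeat=m))
    surv = {s: (0 if not any(s) else float("inf"), []) for s in states}
    for k in range(0, len(received), n):
        r = received[k:k + n]
        new = {}
        for s in states:
            for b in (0, 1):
                window = (b,) + s
                out = [sum(window[i] for i, c in enumerate(t) if c) % 2 for t in taps]
                metric = surv[s][0] + sum(x != y for x, y in zip(out, r))
                ns = window[:m]
                if ns not in new or metric < new[ns][0]:
                    new[ns] = (metric, surv[s][1] + [b])
        surv = new
    return min(surv.values())[1]

u = [1, 0, 1, 1, 0, 0]
v = encode(u)
r = list(v); r[2] ^= 1            # one channel error
print(viterbi(r) == u)            # True: the single error is corrected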
Problem 5.12
(a) r = ...
[Viterbi decoding over the trellis; the figure with the accumulated metrics is not reproduced here.]
û = (...)
(b) r = ...
v = ...
v̂ = ...
ê = ...
e = ...
We had four channel errors to start with and introduced a new one in the decoding process!

Problem 5.13
(a) Look at the trellis in Fig. 5.24. Then we see that the minimum (squared) Euclidean distance can be obtained as
d_E^2(c)(QPSK) = d_E^2(.,.) + d_E^2(.,.) + d_E^2(.,.) = 4 + 2 + 4 = 10
(b) For BPSK we have
d_E^2(BPSK) = 4
Hence, we have the coding gain for our coded QPSK scheme over uncoded BPSK:
γ = 10 log10( d_E^2(c)(QPSK) / d_E^2(BPSK) ) = 10 log10(10/4) ≈ 3.98 dB
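A quick check of the arithmetic in (a) and (b) (a sketch, not part of the original solution; it assumes unit-energy QPSK signal points, for which antipodal points have squared distance 4 and neighbouring points have squared distance 2):

from math import log10

qpsk = [(1, 0), (0, 1), (-1, 0), (0, -1)]     # unit-energy QPSK signal points

def d2(a, b):
    # Squared Euclidean distance between two signal points.
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

print(d2(qpsk[0], qpsk[2]), d2(qpsk[0], qpsk[1]))       # 4 (antipodal), 2 (neighbours)

d2_coded = 4 + 2 + 4                                    # the error event in (a)
d2_bpsk = 4                                             # uncoded BPSK, same energy
print(round(10 * log10(d2_coded / d2_bpsk), 2))         # 3.98 dB coding gain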
(c) r = ...
[Viterbi decoding over the trellis using accumulated squared Euclidean metrics; figure not reproduced here.]
(Notice that we use the Euclidean distance, not the Hamming distance!)
û = (...)
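For part (c) the branch metric is the squared Euclidean distance between the received point and the candidate signal point, instead of the Hamming distance used in the preceding problems. A minimal illustration (the received value below is hypothetical; the actual sequence in (c) is not legible in this copy):

def euclidean_metric(r, s):
    # Squared Euclidean distance branch metric.
    return sum((ri - si) ** 2 for ri, si in zip(r, s))

r = (0.2, 0.8)                                                      # hypothetical received point
print(euclidean_metric(r, (0, 1)), euclidean_metric(r, (1, 0)))     # about 0.08 vs 1.28: (0, 1) is closer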