Chapter 5 Solutions

Problem 5.1
Since X = X1X2X3 takes on the values 000, 011, 101, 110 with equal probability, it follows that P_Xi(0) = P_Xi(1) = 1/2, i = 1, 2, 3. Furthermore, P_X1X2(00) = P_X1X2(01) = P_X1X2(10) = P_X1X2(11) = 1/4.

(a) H(X1) = -P_X1(0) log P_X1(0) - P_X1(1) log P_X1(1)
          = -(1/2) log(1/2) - (1/2) log(1/2) = log 2 = 1 bit

(b) H(X1X2) = -P_X1X2(00) log P_X1X2(00) - P_X1X2(01) log P_X1X2(01)
              - P_X1X2(10) log P_X1X2(10) - P_X1X2(11) log P_X1X2(11)
            = -(1/4) log(1/4) - (1/4) log(1/4) - (1/4) log(1/4) - (1/4) log(1/4)
            = log 4 = 2 bits

(c) H(X2 | X1) = -P_X1X2(00) log P_X2|X1(0 | 0) - P_X1X2(01) log P_X2|X1(1 | 0)
                 - P_X1X2(10) log P_X2|X1(0 | 1) - P_X1X2(11) log P_X2|X1(1 | 1)
But P_X2|X1(0 | 0) = P_X2|X1(1 | 0) = P_X2|X1(0 | 1) = P_X2|X1(1 | 1) = 1/2. Hence, we have
H(X2 | X1) = -(1/4) log(1/2) - (1/4) log(1/2) - (1/4) log(1/2) - (1/4) log(1/2)
           = log 2 = 1 bit
Alternatively, we use the formula H(X1X2) = H(X1) + H(X2 | X1) or, equivalently,
H(X2 | X1) = H(X1X2) - H(X1) = 2 - 1 = 1 bit

(d) H(X1X2X3) = -Σ P_X1X2X3(x1x2x3) log P_X1X2X3(x1x2x3), where the sum runs over all eight binary triples x1x2x3.
Insert P_X1X2X3(000) = P_X1X2X3(011) = P_X1X2X3(101) = P_X1X2X3(110) = 1/4 and
P_X1X2X3(001) = P_X1X2X3(010) = P_X1X2X3(100) = P_X1X2X3(111) = 0. Hence,
H(X1X2X3) = -(1/4) log(1/4) - (1/4) log(1/4) - (1/4) log(1/4) - (1/4) log(1/4)
          = log 4 = 2 bits
Alternatively, we have 4 possible outcomes and they are equiprobable; that is,
H(X1X2X3) = log L, where L = 4. Hence, H(X1X2X3) = log 4 = 2 bits.
Again alternatively, H(X1X2X3) = H(X1X2) + H(X3 | X1X2). What is H(X3 | X1X2)? That is, what is the uncertainty about X3 when we know X1X2? For all four outcomes 000, 011, 101, 110 we see that if we know the first two binary digits, then we also know the third one; in other words, the uncertainty about X3 is 0 if we know X1X2! That is, H(X3 | X1X2) = 0 and, hence,
H(X1X2X3) = H(X1X2) = 2 bits

(e) See (d), the last alternative!

(f) I(X1; X3) = H(X1) - H(X1 | X3)
What is H(X1 | X3), that is, the uncertainty about X1 when we know X3? If X3 = 0, then we have two possibilities for X1, namely 0 and 1, and they are equiprobable! The same thing holds for X3 = 1. Hence, we conclude that H(X1 | X3) = 1 bit. Thus, we have
I(X1; X3) = H(X1) - H(X1 | X3) = 1 - 1 = 0 bits
We get no information about X1 by observing X3! Regardless of whether X3 is 0 or 1, it is still 50-50 for X1 to be 0 or 1.

(g) I(X1X2; X3) = H(X3) - H(X3 | X1X2) = 1 - 0 = 1 bit
If we know nothing about X1X2, then our uncertainty about X3 is 1 bit; 50-50 to be 0 or 1! But if we know X1X2, then we also know X3! Hence, our uncertainty about X3 is then 0. Thus, by observing X1X2 we get 1 bit of information about X3.
Alternatively, if we do not know X3, then our uncertainty about X1X2 is 2 bits (cf. (b)). But if we know X3, regardless of whether it is 0 or 1, we have only two possibilities for X1X2; that is, our uncertainty about X1X2 is 1 bit. By observing X3, the uncertainty about X1X2 is reduced from 2 bits to 1 bit; we get 1 bit of information about X1X2 by observing X3!
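These values are easy to verify numerically. The Python sketch below (the helper name and variable names are ours) recomputes the entropies and mutual informations from the joint distribution of the four equiprobable triples assumed above:

from math import log2

# Joint distribution of (X1, X2, X3): the four even-parity triples, equiprobable.
p = {(0, 0, 0): 0.25, (0, 1, 1): 0.25, (1, 0, 1): 0.25, (1, 1, 0): 0.25}

def H(idx):
    """Entropy (in bits) of the marginal over the coordinates listed in idx."""
    marg = {}
    for x, px in p.items():
        key = tuple(x[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + px
    return -sum(q * log2(q) for q in marg.values() if q > 0)

print(H([0]))                             # H(X1)       = 1.0
print(H([0, 1]))                          # H(X1X2)     = 2.0
print(H([0, 1]) - H([0]))                 # H(X2|X1)    = 1.0
print(H([0, 1, 2]))                       # H(X1X2X3)   = 2.0
print(H([0]) + H([2]) - H([0, 2]))        # I(X1; X3)   = 0.0
print(H([0, 1]) + H([2]) - H([0, 1, 2]))  # I(X1X2; X3) = 1.0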

Problem 5.2
Let X denote the coin, that is, P_X(Fair) = P_X(Counterfeit) = 1/2. Let Y be the number of Heads when the chosen coin is flipped twice. Consider the following scheme:

X = Fair (prob. 1/2):        Y = 0 with prob. 1/4,  Y = 1 with prob. 1/2,  Y = 2 with prob. 1/4
X = Counterfeit (prob. 1/2): Y = 2 with prob. 1

which gives P_Y(0) = 1/8, P_Y(1) = 1/4, P_Y(2) = 5/8.

If X = Counterfeit, then we know for sure that we will get Y = 2 Heads when we flip the coin twice. Hence, we label the branch between X = Counterfeit and Y = 2 with the conditional probability P_Y|X(2 | Counterfeit) = 1. If X = Fair, then we get the combinations Tail-Tail, Tail-Head, Head-Tail, Head-Head with equal probability when we flip the fair coin twice. That is, Y = 0 (Tail-Tail) occurs with probability 1/4, Y = 1 (Tail-Head or Head-Tail) occurs with probability 1/2 (half of the four possibilities), and Y = 2 (Head-Head) occurs with probability 1/4. Next we add the two probabilities for Y = 2: (1/2)(1/4) + (1/2)(1) = 5/8, as shown in the scheme above.

Now we are well prepared to compute I(X; Y) = H(Y) - H(Y | X).

H(Y) = -(1/8) log(1/8) - (1/4) log(1/4) - (5/8) log(5/8)
     = (1/8) log 8 + (1/4) log 4 - (5/8) log 5 + (5/8) log 8
     = 3/8 + 2/4 - (5/8) log 5 + 15/8
     = 11/4 - (5/8) log 5

H(Y | X) = -P_XY(Fair, 0) log P_Y|X(0 | Fair) - P_XY(Fair, 1) log P_Y|X(1 | Fair)
           - P_XY(Fair, 2) log P_Y|X(2 | Fair) - P_XY(Counterfeit, 0) log P_Y|X(0 | Counterfeit)
           - P_XY(Counterfeit, 1) log P_Y|X(1 | Counterfeit) - P_XY(Counterfeit, 2) log P_Y|X(2 | Counterfeit)

Now we need P_XY:

        X = Fair             X = Counterfeit
Y = 0   1/2 * 1/4 = 1/8      0
Y = 1   1/2 * 1/2 = 1/4      0
Y = 2   1/2 * 1/4 = 1/8      1/2 * 1 = 1/2

Hence, we have

H(Y | X) = -(1/8) log(1/4) - (1/4) log(1/2) - (1/8) log(1/4) - 0 - 0 - (1/2) log 1
         = (1/8) log 4 + (1/4) log 2 + (1/8) log 4 = 3/4

Thus,

I(X; Y) = H(Y) - H(Y | X) = 11/4 - (5/8) log 5 - 3/4 = 2 - (5/8) log 5 ≈ 0.549
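A quick numerical cross-check of I(X; Y) (the dictionary p and the variable names below are ours):

from math import log2

# Joint distribution P(X, Y): X is the chosen coin, Y the number of Heads in two flips.
p = {('Fair', 0): 1/8, ('Fair', 1): 1/4, ('Fair', 2): 1/8, ('Counterfeit', 2): 1/2}

p_y = {}
for (x, y), q in p.items():
    p_y[y] = p_y.get(y, 0.0) + q

H_Y  = -sum(q * log2(q) for q in p_y.values())        # H(Y)
H_YX = -sum(q * log2(q / 0.5) for q in p.values())    # H(Y|X), since P_X(x) = 1/2
print(H_Y - H_YX)                                     # 2 - (5/8) log2(5) ≈ 0.549 bits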

Problem 5.3
Due to the symmetry of the BEC, the maximizing input distribution is P_X(0) = P_X(1) = 1/2. Hence, we have

P_Y(0) = (1 - δ)/2,   P_Y(Δ) = δ,   P_Y(1) = (1 - δ)/2,

where Δ denotes the erasure symbol, and

H(Y) = -((1 - δ)/2) log((1 - δ)/2) - δ log δ - ((1 - δ)/2) log((1 - δ)/2)
     = 1 - δ - (1 - δ) log(1 - δ) - δ log δ
     = 1 - δ + h(δ)

H(Y | X) = h(δ)

Then we have

C_BEC = H(Y) - H(Y | X) = 1 - δ + h(δ) - h(δ) = 1 - δ
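The result C_BEC = 1 - δ can be checked numerically for a few erasure probabilities; in the sketch below the erasure symbol is written as the string 'D' (an arbitrary choice of ours):

from math import log2

def bec_mutual_information(delta):
    """I(X; Y) for a BEC with erasure probability delta and uniform input;
    it should equal 1 - delta."""
    p_y = {0: (1 - delta) / 2, 'D': delta, 1: (1 - delta) / 2}   # output distribution
    h = 0.0 if delta in (0.0, 1.0) else -delta*log2(delta) - (1-delta)*log2(1-delta)
    H_Y  = -sum(q * log2(q) for q in p_y.values() if q > 0)      # H(Y)
    H_YX = h                                                     # H(Y|X) = h(delta)
    return H_Y - H_YX

for d in (0.0, 0.1, 0.25, 0.5):
    print(d, round(bec_mutual_information(d), 6))                # prints d, 1 - d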

Problem 5.4
The Huffman code tree (figure) for the source with probabilities P(u1) = 0.27, P(u2) = 0.23, P(u3) = 0.2, P(u4) = 0.15, P(u5) = 0.1, P(u6) = 0.05 has intermediate nodes with probabilities 0.15, 0.3, 0.43, 0.57, and 1.0; the table u -> x lists the resulting codewords. The average codeword length equals the sum of the intermediate-node probabilities:

W = 1.0 + 0.57 + 0.43 + 0.3 + 0.15 = 2.45

Problem 5.5
The Huffman code tree (figure) for the source with probabilities P(u1) = 0.2, P(u2) = 0.2, P(u3) = 0.2, P(u4) = 0.15, P(u5) = 0.15, P(u6) = 0.1 has intermediate nodes with probabilities 0.25, 0.35, 0.4, 0.6, and 1.0; the table u -> x lists the resulting codewords. The average codeword length is again the sum of the intermediate-node probabilities:

W = 1.0 + 0.6 + 0.4 + 0.35 + 0.25 = 2.6

Problem 5.6
The Huffman code tree (figure) for the source with probabilities P(u1) = 0.3, P(u2) = 0.2, P(u3) = P(u4) = P(u5) = P(u6) = P(u7) = 0.1 has intermediate nodes with probabilities 0.2, 0.2, 0.3, 0.4, 0.6, and 1.0; the table u -> x lists the resulting codewords. Hence,

W = 1.0 + 0.6 + 0.4 + 0.3 + 0.2 + 0.2 = 2.7
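All three averages can be verified with a short Huffman routine that exploits the fact used above, namely that the average codeword length equals the sum of the probabilities of the merged (intermediate) nodes; the function name is ours:

import heapq

def huffman_avg_length(probs):
    """Average codeword length of a binary Huffman code for the given probabilities."""
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a = heapq.heappop(heap)            # the two least probable nodes ...
        b = heapq.heappop(heap)
        total += a + b                     # ... are merged; the new node's probability
        heapq.heappush(heap, a + b)        #     is added to the running sum
    return total

print(round(huffman_avg_length([0.27, 0.23, 0.2, 0.15, 0.1, 0.05]), 2))     # 2.45 (Problem 5.4)
print(round(huffman_avg_length([0.2, 0.2, 0.2, 0.15, 0.15, 0.1]), 2))       # 2.6  (Problem 5.5)
print(round(huffman_avg_length([0.3, 0.2, 0.1, 0.1, 0.1, 0.1, 0.1]), 2))    # 2.7  (Problem 5.6)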

Problem 5.7
(a) THE FRIEND IN NEED IS A FRIEND INDEED

Dictionary entries (Steps 1-32):
T, H, E, _, F, R, I, EN, N, D, _I, IN, N_, _N, NE, EE, ED, D_, _IS, S, _A, A, _F, FR, RI, IE, END, D_I, IND, DE, EED, D

(b) THE CAT IN THE CAR ATE THE RAT

Dictionary entries (Steps 1-26):
T, H, E, _, C, A, T_, _I, I, N, _T, TH, HE, E_, _C, CA, AR, R, _A, AT, TE, E_T, THE, E_R, RA, AT

(c) EARLY TO BED AND EARLY TO RISE MAKES A MAN WISE

Dictionary entries (Steps 1-44):
E, A, R, L, Y, _, T, O, _B, B, ED, D, _A, AN, N, D_, _E, EA, AR, RL, LY, Y_, _T, TO, O_, _R, RI, I, S, E_, _M, M, AK, K, ES, S_, _A_, _MA, AN_, _W, W, IS, SE, E

(d) IF WE CANNOT DO AS WE WOULD WE WOULD DO AS WE CAN

Dictionary entries (Steps 1-39):
I, F, _, W, E, _C, C, A, N, NO, O, T, _D, D, O_, _A, AS, S, _W, WE, E_, _WO, OU, U, L, D_, _WE, E_W, WO, OUL, LD, D_D, DO, O_A, AS_, _WE_, _CA, AN, N

(e) BETTER LATE THAN NEVER BUT BETTER NEVER LATE

Dictionary entries (Steps 1-35):
B, E, T, TE, ER, R, _, L, A, TE_, _T, TH, H, AN, N, _N, NE, EV, V, ER_, _B, BU, U, T_, _BE, ET, TT, TER, R_, _NE, EVE, ER_L, LA, AT, TE

(f) WHO CHATTERS WITH YOU WILL CHATTER ABOUT YOU

Dictionary entries (Steps 1-38):
W, H, O, _, C, HA, A, T, TE, E, R, S, _W, WI, I, TH, H_, _Y, Y, OU, U, _WI, IL, L, L_, _C, CH, HAT, TT, TER, R_, _A, AB, B, OUT, T_, _YO, OU
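The Entry lists above can be reproduced with the following sketch of the parsing rule (longest existing dictionary match, then store the match extended by one character; spaces are written as '_' as in the lists, and the per-step codeword-length bookkeeping is not computed here):

def lz_entries(text):
    """Dictionary entries produced by the parsing: at each step, find the longest
    dictionary entry matching the remaining text, store that match extended by the
    following character as a new entry, and advance past the match (past one
    character if nothing matches)."""
    dictionary, entries, i = [], [], 0
    while i < len(text):
        match = ''
        for d in dictionary:                    # longest existing match
            if text.startswith(d, i) and len(d) > len(match):
                match = d
        new = text[i:i + len(match) + 1]        # match + next character
        entries.append(new)
        dictionary.append(new)
        i += max(len(match), 1)
    return entries

print(lz_entries('THE_FRIEND_IN_NEED_IS_A_FRIEND_INDEED'))    # reproduces the list in (a)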

Problem 5.8
(a) IF YOU CANNOT BE WITH THE ONE YOU LOVE LOVE THE ONE YOU ARE WITH

Problem 5.9
Let B be the given code; it consists of M = 4 codewords, each a binary 6-tuple.

(a) Yes, since the sum of any two codewords is a codeword.
(b) N = 6, K = log M = log 4 = 2, R = K/N = 2/6 = 1/3.
(c) The encoder u -> v defined by the first table (figure) is a linear encoder.
(d) The encoder u -> v defined by the second table (figure) is a nonlinear encoder (cf. Ch. 2).
(e) d_min = 4.
(f) The code corrects floor((d_min - 1)/2) = floor((4 - 1)/2) = floor(1.5) = 1 error.
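The checks in (a) and (e) are easy to automate. Since the four codewords of B are not reproduced above, the sketch below uses a hypothetical (6, 2) code with the same parameters (N = 6, M = 4, d_min = 4) purely for illustration:

from itertools import combinations

def is_linear(code):
    """True if the binary code is closed under componentwise XOR."""
    return all(tuple(a ^ b for a, b in zip(c1, c2)) in set(code)
               for c1, c2 in combinations(code, 2))

def d_min(code):
    """Minimum Hamming distance between distinct codewords."""
    return min(sum(a != b for a, b in zip(c1, c2))
               for c1, c2 in combinations(code, 2))

# Hypothetical stand-in for B (not the problem's actual codewords).
B_example = [(0,0,0,0,0,0), (1,1,1,1,0,0), (0,0,1,1,1,1), (1,1,0,0,1,1)]
print(is_linear(B_example), d_min(B_example))    # True 4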

Problem 5.10
(a) (Encoder block diagram; figure.)
(b) (Figure.)
(c) d_free = 3.
(d) Viterbi decoding of the received sequence r (trellis with cumulative Hamming metrics; figure) gives the estimated code sequence and the information estimate û.
(e) The error estimate ê has Hamming weight 3: three channel errors!
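For illustration, here is a minimal hard-decision Viterbi decoder, assuming the rate-1/2, memory-1 encoder G(D) = (1, 1 + D), which has d_free = 3 as in (c); the encoder and received sequence of the actual problem may differ.

def encode(u, s=0):
    """Encode information bits u with G(D) = (1, 1 + D); s is the previous bit."""
    v = []
    for b in u:
        v += [b, b ^ s]                  # outputs: b and b + previous bit (mod 2)
        s = b
    return v

def viterbi(r):
    """Minimum-Hamming-distance decoding of r (a flat list of output pairs)."""
    INF = float('inf')
    metric = {0: 0, 1: INF}              # start in the all-zero state
    paths = {0: [], 1: []}
    for t in range(0, len(r), 2):
        new_metric, new_paths = {0: INF, 1: INF}, {}
        for s in (0, 1):
            for b in (0, 1):             # hypothesised information bit; next state is b
                out = (b, b ^ s)
                m = metric[s] + (out[0] != r[t]) + (out[1] != r[t + 1])
                if m < new_metric[b]:
                    new_metric[b] = m
                    new_paths[b] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = min((0, 1), key=lambda s: metric[s])
    return paths[best]

u = [1, 0, 1, 1, 0]
r = encode(u)
r[2] ^= 1                                # introduce a single channel error
print(viterbi(r) == u)                   # True: the error is corrected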

Problem 5.12
(a) Viterbi decoding of the received sequence r (trellis with cumulative metrics; figure) gives the information estimate û.
(b) Viterbi decoding of the second received sequence r gives the estimated code sequence; comparing it with the transmitted v yields the error estimate ê, and comparing ê with the actual error sequence e shows that we had four channel errors to start with and introduced a new one in the decoding process!

Problem 5.13
(a) Look at the trellis in Fig. 5.24. Then we see that the minimum squared Euclidean distance can be obtained as a sum of three branchwise squared Euclidean distances between QPSK symbols:

d_E^2(coded QPSK) = 4 + 2 + 4 = 10

(b) For BPSK we have d_E^2(BPSK) = 4. Hence, we have the coding gain for our coded QPSK scheme over uncoded BPSK:

γ = 10 log10( d_E^2(coded QPSK) / d_E^2(BPSK) ) = 10 log10(10/4) ≈ 3.98 dB

(c) Viterbi decoding of the received sequence r, now using cumulative squared Euclidean metrics in the trellis (notice that we use the Euclidean distance, not the Hamming distance!), gives the information estimate û.
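A quick check of the coding-gain computation in (b), assuming unit-energy constellations (so that antipodal signal points are at squared Euclidean distance 4 and adjacent QPSK points at squared Euclidean distance 2):

from math import log10

d2_adjacent, d2_antipodal = 2.0, 4.0                          # unit-energy QPSK distances
d2_coded_qpsk = d2_antipodal + d2_adjacent + d2_antipodal     # 4 + 2 + 4 = 10
d2_bpsk = 4.0                                                 # uncoded BPSK reference

gain_db = 10 * log10(d2_coded_qpsk / d2_bpsk)
print(round(gain_db, 2))                                      # 3.98 dB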