Chapter 5 Solutions


Problem 5.1

Since X = X_1 X_2 X_3 takes on the values 000, 011, 101, 110 (that is, X_3 = X_1 ⊕ X_2) with equal probability, it follows that P_{X_i}(0) = P_{X_i}(1) = 1/2, i = 1, 2, 3. Furthermore,

P_{X_1X_2}(00) = P_{X_1X_2}(01) = P_{X_1X_2}(10) = P_{X_1X_2}(11) = 1/4.

(a) H(X_1) = -P_{X_1}(0) log P_{X_1}(0) - P_{X_1}(1) log P_{X_1}(1)
           = -(1/2) log(1/2) - (1/2) log(1/2) = log 2 = 1 bit

(b) H(X_1 X_2) = -P_{X_1X_2}(00) log P_{X_1X_2}(00) - P_{X_1X_2}(01) log P_{X_1X_2}(01)
               - P_{X_1X_2}(10) log P_{X_1X_2}(10) - P_{X_1X_2}(11) log P_{X_1X_2}(11)
               = -4 · (1/4) log(1/4) = log 4 = 2 bits

(c) H(X_2 | X_1) = -P_{X_1X_2}(00) log P_{X_2|X_1}(0|0) - P_{X_1X_2}(01) log P_{X_2|X_1}(1|0)
                 - P_{X_1X_2}(10) log P_{X_2|X_1}(0|1) - P_{X_1X_2}(11) log P_{X_2|X_1}(1|1)

But P_{X_2|X_1}(0|0) = P_{X_2|X_1}(1|0) = P_{X_2|X_1}(0|1) = P_{X_2|X_1}(1|1) = 1/2. Hence, we have

H(X_2 | X_1) = -4 · (1/4) log(1/2) = log 2 = 1 bit

Alternatively, we use the formula H(X_1 X_2) = H(X_1) + H(X_2 | X_1) or, equivalently,

H(X_2 | X_1) = H(X_1 X_2) - H(X_1) = 2 - 1 = 1 bit

(d) H(X_1 X_2 X_3) = -Σ_{x_1 x_2 x_3} P_{X_1X_2X_3}(x_1 x_2 x_3) log P_{X_1X_2X_3}(x_1 x_2 x_3),

where the sum runs over all eight binary triples. Insert P_{X_1X_2X_3}(000) = P_{X_1X_2X_3}(011) = P_{X_1X_2X_3}(101) = P_{X_1X_2X_3}(110) = 1/4 and P_{X_1X_2X_3}(001) = P_{X_1X_2X_3}(010) = P_{X_1X_2X_3}(100) = P_{X_1X_2X_3}(111) = 0. Hence,

H(X_1 X_2 X_3) = -4 · (1/4) log(1/4) - 4 · 0 · log 0 = log 4 = 2 bits

(using the convention 0 log 0 = 0). Alternatively, we have 4 possible outcomes and they are equiprobable; that is, H(X_1 X_2 X_3) = log L, where L = 4. Hence,

H(X_1 X_2 X_3) = log 4 = 2 bits

Again alternatively,

H(X_1 X_2 X_3) = H(X_1 X_2) + H(X_3 | X_1 X_2)

What is H(X_3 | X_1 X_2)? That is, what is the uncertainty about X_3 when we know X_1 X_2? For all four outcomes 000, 011, 101, 110 we see that if we know the first two binary digits, then we also know the third one; in other words, the uncertainty about X_3 is 0 if we know X_1 X_2! That is,

H(X_3 | X_1 X_2) = 0

and, hence,

H(X_1 X_2 X_3) = H(X_1 X_2) = 2 bits

(e) See (d), the last alternative!

(f) I(X_1; X_3) = H(X_1) - H(X_1 | X_3)

What is H(X_1 | X_3), that is, the uncertainty about X_1 when we know X_3? If X_3 = 0, then we have two possibilities for X_1, namely 0 and 1, and they are equiprobable! The same thing holds for X_3 = 1. Hence, we conclude that

H(X_1 | X_3) = 1 bit

Thus, we have

I(X_1; X_3) = H(X_1) - H(X_1 | X_3) = 1 - 1 = 0 bits

We get no information about X_1 by observing X_3! Regardless of whether X_3 is 0 or 1, it is still 50-50 for X_1 to be 0 or 1.

(g) I(X_1 X_2; X_3) = H(X_3) - H(X_3 | X_1 X_2) = 1 - 0 = 1 bit

If we know nothing about X_1 X_2, then our uncertainty about X_3 is 1 bit; it is 50-50 whether it is 0 or 1! But if we know X_1 X_2, then we also know X_3, so our uncertainty about X_3 is 0. Thus, by observing X_1 X_2 we get 1 bit of information about X_3.

Alternatively, if we do not know X_3, then our uncertainty about X_1 X_2 is 2 bits (cf. (b)). But if we know X_3, regardless of whether it is 0 or 1, we have only two possibilities left for X_1 X_2; that is, our uncertainty about X_1 X_2 is 1 bit. By observing X_3, the uncertainty about X_1 X_2 is reduced from 2 bits to 1 bit; we get 1 bit of information about X_1 X_2 by observing X_3!
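As a sanity check (not part of the original solution), all of the quantities in Problem 5.1 can be computed directly from the joint distribution of the four equiprobable outcomes. A minimal Python sketch:

```python
from collections import Counter
from math import log2

# The four equiprobable outcomes of (X1, X2, X3), with X3 = X1 xor X2.
outcomes = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
p = 1 / len(outcomes)

def H(coords):
    """Entropy (in bits) of the marginal distribution of the given coordinates."""
    marginal = Counter(tuple(x[i] for i in coords) for x in outcomes)
    return -sum(c * p * log2(c * p) for c in marginal.values())

print("H(X1)        =", H([0]))                            # 1 bit
print("H(X1 X2)     =", H([0, 1]))                         # 2 bits
print("H(X2 | X1)   =", H([0, 1]) - H([0]))                # 1 bit
print("H(X1 X2 X3)  =", H([0, 1, 2]))                      # 2 bits
print("I(X1; X3)    =", H([0]) + H([2]) - H([0, 2]))       # 0 bits
print("I(X1 X2; X3) =", H([0, 1]) + H([2]) - H([0, 1, 2])) # 1 bit
```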

Problem 5.2

Let X denote the coin, that is, P_X(Fair) = P_X(Counterfeit) = 1/2, and let Y be the number of Heads when the chosen coin is flipped twice. Consider the following scheme:

X = Fair (probability 1/2):        P_{Y|X}(0|Fair) = 1/4,  P_{Y|X}(1|Fair) = 1/2,  P_{Y|X}(2|Fair) = 1/4
X = Counterfeit (probability 1/2): P_{Y|X}(2|Counterfeit) = 1

which gives P_Y(0) = 1/8, P_Y(1) = 1/4, P_Y(2) = 5/8.

If X = Counterfeit, then we know for sure that we will get Y = 2 Heads when we flip the coin twice. Hence, we label the branch between X = Counterfeit and Y = 2 with the conditional probability P_{Y|X}(2|Counterfeit) = 1. If X = Fair, then we get the combinations Tail-Tail, Tail-Head, Head-Tail, Head-Head with equal probability when we flip the fair coin twice. That is, Y = 0 (Tail-Tail) occurs with probability 1/4, Y = 1 (Tail-Head or Head-Tail) occurs with probability 1/2 (half of the four possibilities), and Y = 2 (Head-Head) occurs with probability 1/4. Next we add the two contributions to P_Y(2): (1/2)(1/4) + (1/2)(1) = 1/8 + 1/2 = 5/8, as shown in the scheme above.

Now we are well prepared to compute I(X; Y) = H(Y) - H(Y|X):

H(Y) = -(1/8) log(1/8) - (1/4) log(1/4) - (5/8) log(5/8)
     = (1/8) log 8 + (1/4) log 4 + (5/8) log 8 - (5/8) log 5
     = 3/8 + 1/2 + 15/8 - (5/8) log 5
     = 11/4 - (5/8) log 5

H(Y|X) = -P_{XY}(Fair, 0) log P_{Y|X}(0|Fair) - P_{XY}(Fair, 1) log P_{Y|X}(1|Fair)
       - P_{XY}(Fair, 2) log P_{Y|X}(2|Fair) - P_{XY}(Counterfeit, 0) log P_{Y|X}(0|Counterfeit)
       - P_{XY}(Counterfeit, 1) log P_{Y|X}(1|Counterfeit) - P_{XY}(Counterfeit, 2) log P_{Y|X}(2|Counterfeit)

Now we need P_XY:

                    Y = 0    Y = 1    Y = 2
X = Fair             1/8      1/4      1/8
X = Counterfeit       0        0       1/2

Hence, we have

H(Y|X) = -(1/8) log(1/4) - (1/4) log(1/2) - (1/8) log(1/4) - (1/2) log 1
       = (1/8) log 4 + (1/4) log 2 + (1/8) log 4
       = 1/4 + 1/4 + 1/4 = 3/4

Thus,

I(X; Y) = H(Y) - H(Y|X) = 11/4 - (5/8) log 5 - 3/4 = 2 - (5/8) log 5 ≈ 0.55 bits
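These numbers are easy to verify numerically; a minimal Python sketch (an illustration, not part of the original solution) that computes H(Y), H(Y|X) and I(X;Y) from the joint distribution P_XY:

```python
from math import log2

# Joint distribution P_XY: the fair and the counterfeit (two-headed) coin are
# equally likely, Y is the number of Heads in two flips of the chosen coin.
P_XY = {("Fair", 0): 1/8, ("Fair", 1): 1/4, ("Fair", 2): 1/8,
        ("Counterfeit", 2): 1/2}
P_X = {"Fair": 1/2, "Counterfeit": 1/2}

P_Y = {}
for (x, y), pxy in P_XY.items():
    P_Y[y] = P_Y.get(y, 0) + pxy

H_Y = -sum(py * log2(py) for py in P_Y.values())
H_Y_given_X = -sum(pxy * log2(pxy / P_X[x]) for (x, y), pxy in P_XY.items())

print("H(Y)   =", H_Y)                # about 1.30 bits
print("H(Y|X) =", H_Y_given_X)        # 0.75 bits
print("I(X;Y) =", H_Y - H_Y_given_X)  # about 0.55 bits
```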

Problem 5.3

Due to the symmetry of the BEC, the maximizing input distribution is P_X(0) = P_X(1) = 1/2. Hence, we have

P_Y(0) = (1 - δ)/2,   P_Y(Δ) = δ,   P_Y(1) = (1 - δ)/2,

where Δ denotes the erasure symbol, and

H(Y) = -((1 - δ)/2) log((1 - δ)/2) - δ log δ - ((1 - δ)/2) log((1 - δ)/2)
     = (1 - δ) - (1 - δ) log(1 - δ) - δ log δ
     = 1 - δ + h(δ)

where h(δ) = -δ log δ - (1 - δ) log(1 - δ) is the binary entropy function. Furthermore,

H(Y|X) = h(δ)

Then we have

C_BEC = H(Y) - H(Y|X) = 1 - δ + h(δ) - h(δ) = 1 - δ
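A quick numerical check of C_BEC = 1 - δ (an illustration only): for a uniform input, I(X;Y) computed from the BEC output distribution indeed equals 1 - δ.

```python
from math import log2

def h(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bec_mutual_information(delta):
    """I(X;Y) of a BEC with erasure probability delta and uniform input."""
    outputs = [(1 - delta) / 2, (1 - delta) / 2, delta]  # P_Y(0), P_Y(1), P_Y(erasure)
    H_Y = -sum(p * log2(p) for p in outputs if p > 0)
    return H_Y - h(delta)                                # H(Y) - H(Y|X)

for delta in (0.0, 0.1, 0.25, 0.5):
    print(f"delta = {delta:.2f}:  I(X;Y) = {bec_mutual_information(delta):.4f},  1 - delta = {1 - delta:.2f}")
```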

Problem 5.4

The binary Huffman code is constructed in a code tree for the six source symbols u_1, ..., u_6 and their given probabilities; the average codeword length W is obtained by weighting each codeword length by the probability of its symbol. [code tree and numerical value of W are given as a figure and are not reproduced here]

Problem 5.5

Huffman code tree for the six source symbols u_1, ..., u_6 [figure not reproduced here]; the resulting average codeword length is W = 2.6.

Problem 5.6

Huffman code tree for the seven source symbols u_1, ..., u_7 [figure not reproduced here]; the resulting average codeword length is W = 2.7.
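The source probabilities in Problems 5.4-5.6 are only given in the textbook figures, so the sketch below uses an assumed, purely hypothetical six-symbol distribution. It illustrates the Huffman construction and how the average codeword length W is obtained:

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Binary Huffman code: return the codeword length of each source symbol."""
    tie = count()                       # tie-breaker so the heap never compares lists
    heap = [(p, next(tie), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)     # the two least probable subtrees
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:                   # each merge adds one bit below it
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tie), s1 + s2))
    return lengths

# Hypothetical probabilities for u1,...,u6 (NOT the ones used in Problems 5.4-5.6).
probs = [0.3, 0.25, 0.2, 0.1, 0.1, 0.05]
lengths = huffman_lengths(probs)
W = sum(p * l for p, l in zip(probs, lengths))
print("codeword lengths:", lengths)
print("average codeword length W =", W)   # 2.4 binary digits per symbol for this example
```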

Problem 5.7

In each part the text string is parsed with the Lempel-Ziv algorithm: the longest phrase already stored in the dictionary is found, extended by one new character, and stored as the next dictionary entry (the very last item is simply the final match at the end of the string). The original solution tabulates, for every step, the new dictionary entry and the number of binary digits spent on it; below only the dictionary entries are listed in order, with spaces written as _.

(a) THE FRIEND IN NEED IS A FRIEND INDEED

T, H, E, _, F, R, I, EN, N, D, _I, IN, N_, _N, NE, EE, ED, D_, _IS, S, _A, A, _F, FR, RI, IE, END, D_I, IND, DE, EED, D

(32 dictionary entries)

(b) THE CAT IN THE CAR ATE THE RAT

T, H, E, _, C, A, T_, _I, I, N, _T, TH, HE, E_, _C, CA, AR, R, _A, AT, TE, E_T, THE, E_R, RA, AT

(26 dictionary entries)

(c) EARLY TO BED AND EARLY TO RISE MAKES A MAN WISE

E, A, R, L, Y, _, T, O, _B, B, ED, D, _A, AN, N, D_, _E, EA, AR, RL, LY, Y_, _T, TO, O_, _R, RI, I, S, E_, _M, M, AK, K, ES, S_, _A_, _MA, AN_, _W, W, IS, SE, E

(44 dictionary entries; 323 binary digits in total)

(d) IF WE CANNOT DO AS WE WOULD WE WOULD DO AS WE CAN

I, F, _, W, E, _C, C, A, N, NO, O, T, _D, D, O_, _A, AS, S, _W, WE, E_, _WO, OU, U, L, D_, _WE, E_W, WO, OUL, LD, D_D, DO, O_A, AS_, _WE_, _CA, AN, N

(39 dictionary entries; 285 binary digits in total)

(e) BETTER LATE THAN NEVER BUT BETTER NEVER LATE

B, E, T, TE, ER, R, _, L, A, TE_, _T, TH, H, AN, N, _N, NE, EV, V, ER_, _B, BU, U, T_, _BE, ET, TT, TER, R_, _NE, EVE, ER_L, LA, AT, TE

(35 dictionary entries; 237 binary digits in total)

(f) WHO CHATTERS WITH YOU WILL CHATTER ABOUT YOU

W, H, O, _, C, HA, A, T, TE, E, R, S, _W, WI, I, TH, H_, _Y, Y, OU, U, _WI, IL, L, L_, _C, CH, HAT, TT, TER, R_, _A, AB, B, OUT, T_, _YO, OU

(38 dictionary entries; 287 binary digits in total)
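The entry lists above can be reproduced mechanically. The Python sketch below implements the greedy parsing rule that generates them (longest already-stored phrase, extended by the next character, with parsing resuming at that character); the per-step bit counts of the original tables are not modelled:

```python
def lz_parse(text):
    """Parse text into a Lempel-Ziv style dictionary of phrases."""
    entries = []
    pos = 0
    while pos < len(text):
        # Longest phrase already in the dictionary that matches at pos.
        match = max((e for e in entries if text.startswith(e, pos)),
                    key=len, default="")
        if not match:
            entries.append(text[pos])                       # unseen character
            pos += 1
        elif pos + len(match) < len(text):
            entries.append(match + text[pos + len(match)])  # match + next character
            pos += len(match)                               # resume at that character
        else:
            entries.append(match)                           # final match at end of text
            pos += len(match)
    return entries

text = "THE FRIEND IN NEED IS A FRIEND INDEED".replace(" ", "_")
print(len(lz_parse(text)), lz_parse(text))   # 32 entries: T, H, E, _, F, R, I, EN, ...
```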

Problem 5.8

(a) IF YOU CANNOT BE WITH THE ONE YOU LOVE LOVE THE ONE YOU ARE WITH

Problem 5.9

B = {…, …, …, …}  [the four length-6 binary codewords are not reproduced here]

(a) Yes, since the (componentwise modulo-2) sum of any two codewords is again a codeword.

(b) N = 6, K = log M = log 4 = 2, R = K/N = 2/6 = 1/3

(c) [the encoding rule is given in the original solution but is not reproduced here]

(d) The first mapping u → v is a linear encoder; the second mapping u → v is a nonlinear encoder (cf. Ch. 2). [the explicit mappings are not reproduced here]

(e) d_min = 4

(f) The code corrects ⌊(d_min - 1)/2⌋ = ⌊1.5⌋ = 1 error.
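Since the codewords of B did not survive the transcription, the sketch below uses a hypothetical (6,2) binary linear code with the same parameters (N = 6, M = 4, d_min = 4) to illustrate how the rate, the minimum distance and the error-correcting capability are obtained:

```python
from itertools import combinations
from math import log2

# Hypothetical codewords, NOT the set B of Problem 5.9 (which is given in the textbook).
B = ["000000", "111100", "001111", "110011"]

def mod2_sum(a, b):
    return "".join(str(int(x) ^ int(y)) for x, y in zip(a, b))

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

linear = all(mod2_sum(a, b) in B for a, b in combinations(B, 2))
N = len(B[0])
K = log2(len(B))
d_min = min(hamming(a, b) for a, b in combinations(B, 2))
t = int((d_min - 1) // 2)

print("linear:", linear)                             # True
print("N =", N, " K =", K, " R = K/N =", K / N)      # 6  2.0  1/3
print("d_min =", d_min, " corrects", t, "error(s)")  # 4  1
```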

Problem 5.11

(a) [the encoder circuit is given as a figure and is not reproduced here]

(b) [the state/trellis diagram is given as a figure and is not reproduced here]

(c) d_free = 3

(d) Viterbi decoding of the received sequence r gives the codeword estimate v̂ and the information estimate û. [the binary sequences are not reproduced here]

(e) ê = r + v̂; it has Hamming weight 3, that is, three channel errors!
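Viterbi decoding as in (d) and (e) can be sketched in a few lines of Python. The code below is a generic hard-decision Viterbi decoder for a rate-1/2 feedforward convolutional encoder; the generator polynomials used in the example, G(D) = (1 + D, 1) with memory 1 and d_free = 3, are an assumption for illustration and not necessarily the encoder of this problem.

```python
def conv_encode(u, gens, m):
    """Rate-1/len(gens) feedforward convolutional encoder (zero-terminated)."""
    state = [0] * m
    v = []
    for bit in u + [0] * m:                       # m zero tail bits terminate the trellis
        reg = [bit] + state
        v += [sum(g * x for g, x in zip(gen, reg)) % 2 for gen in gens]
        state = reg[:m]
    return v

def viterbi(r, gens, m):
    """Hard-decision Viterbi decoding (minimum Hamming distance to r)."""
    n, num_states = len(gens), 2 ** m
    INF = float("inf")
    metric = [0.0] + [INF] * (num_states - 1)     # start in the all-zero state
    paths = [[] for _ in range(num_states)]
    for t in range(len(r) // n):
        rt = r[t * n:(t + 1) * n]
        new_metric = [INF] * num_states
        new_paths = [None] * num_states
        for s in range(num_states):
            if metric[s] == INF:
                continue
            state_bits = [(s >> i) & 1 for i in range(m)]
            for bit in (0, 1):
                reg = [bit] + state_bits
                branch = [sum(g * x for g, x in zip(gen, reg)) % 2 for gen in gens]
                cand = metric[s] + sum(a != b for a, b in zip(branch, rt))
                ns = sum(b << i for i, b in enumerate(reg[:m]))   # next state
                if cand < new_metric[ns]:
                    new_metric[ns], new_paths[ns] = cand, paths[s] + [bit]
        metric, paths = new_metric, new_paths
    return paths[0][:-m]                          # survivor ending in the all-zero state

gens, m = [(1, 1), (1, 0)], 1                     # assumed example: G(D) = (1 + D, 1), d_free = 3
u = [1, 0, 1, 1, 0]
r = conv_encode(u, gens, m)
r[2] ^= 1                                         # introduce one channel error
print(viterbi(r, gens, m))                        # recovers [1, 0, 1, 1, 0]
```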

Problem 5.12

(a) Viterbi decoding of the received sequence r gives the information estimate û. [the binary sequences are not reproduced here]

(b) Comparing the decoder output v̂ with the transmitted codeword v gives the estimated error pattern ê and the actual error pattern e: we had four channel errors to start with and introduced a new one in the decoding process!

Problem 5.13

(a) Look at the trellis in the corresponding figure. Then we see that the minimum squared Euclidean distance of the coded QPSK scheme can be obtained as the sum of the squared Euclidean distances along the branches of the minimum-distance error event:

d_E^2(c)(QPSK) = 4 + 2 + 4 = 10

(b) For BPSK we have d_E^2(BPSK) = 4. Hence, we have the coding gain for our coded QPSK scheme over uncoded BPSK:

γ = 10 log_10 ( d_E^2(c)(QPSK) / d_E^2(BPSK) ) = 10 log_10 (10/4) = 3.98 dB
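A one-line numerical check of the coding gain (illustration only):

```python
from math import log10

d2_coded_qpsk = 4 + 2 + 4   # squared Euclidean distance of the minimum-distance error event
d2_uncoded_bpsk = 4
print(f"coding gain = {10 * log10(d2_coded_qpsk / d2_uncoded_bpsk):.2f} dB")   # 3.98 dB
```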

(c) Viterbi decoding of the received sequence r gives the information estimate û. (Notice that here we use the Euclidean distance, not the Hamming distance!) [the sequences are not reproduced here]
