Solutions to Homework Set #3 Channel and Source coding


1. Rates

(a) Channel coding rate: Assume you are sending 4 different messages using 2 usages of a channel. What is the rate (in bits per channel use) at which you send?

(b) Source coding rate: Assume you have a file with 10^6 ASCII characters, where the alphabet of ASCII characters is of size 256. Assume that each ASCII character is represented by 8 bits (binary alphabet before compression). After compressing the file we get 4·10^6 bits. What is the compression rate?

Solution: Rates.

(a) R = log 4 / 2 = 1 bit per channel use.

(b) R = 4·10^6 / (10^6 · log 256) = 1/2 compressed bit per source bit.
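
A quick numeric check of both rates (a minimal sketch in Python, not part of the original solution; the message count, channel uses, file size, and compressed size are the values assumed above):

import math

# (a) Channel coding rate: log2(#messages) bits delivered over the channel uses.
num_messages = 4
channel_uses = 2
print(math.log2(num_messages) / channel_uses)                  # 1.0 bit per channel use

# (b) Compression rate: compressed bits per original (uncompressed) bit.
source_symbols = 10**6                 # ASCII characters in the file
bits_per_symbol = math.log2(256)       # 8 bits per ASCII character
compressed_bits = 4 * 10**6
print(compressed_bits / (source_symbols * bits_per_symbol))    # 0.5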

2. Preprocessing the output. One is given a communication channel with transition probabilities p(y|x) and channel capacity C = max_{p(x)} I(X;Y). A helpful statistician preprocesses the output by forming Ỹ = g(Y), yielding a channel p(ỹ|x). He claims that this will strictly improve the capacity.

(a) Show that he is wrong.

(b) Under what conditions does he not strictly decrease the capacity?

Solution: Preprocessing the output.

(a) The statistician calculates Ỹ = g(Y). Since X → Y → Ỹ forms a Markov chain, we can apply the data processing inequality. Hence, for every distribution on X, I(X;Y) ≥ I(X;Ỹ). Let p̃(x) be the distribution on X that maximizes I(X;Ỹ). Then

C = max_{p(x)} I(X;Y) ≥ I(X;Y)|_{p̃(x)} ≥ I(X;Ỹ)|_{p̃(x)} = max_{p(x)} I(X;Ỹ) = C̃.

Thus, the helpful suggestion is wrong and processing the output does not increase capacity.

(b) We have equality (no decrease in capacity) in the above sequence of inequalities only if we have equality in the data processing inequality, i.e., if for the distribution that maximizes I(X;Ỹ), X → Ỹ → Y also forms a Markov chain. Thus, Ỹ should be a sufficient statistic.

3. The Z channel. The Z-channel has binary input and output alphabets and transition probabilities p(y|x) given by the following matrix:

Q = [ 1    0
      1/2  1/2 ],    x, y ∈ {0,1}.

Find the capacity of the Z-channel and the maximizing input probability distribution.

Solution: The Z channel. First we express I(X;Y), the mutual information between the input and the output of the Z-channel, as a function of α = Pr(X = 1) (here H_b(·) denotes the binary entropy function):

H(Y|X) = Pr(X = 0)·0 + Pr(X = 1)·1 = α,
H(Y) = H_b(Pr(Y = 1)) = H_b(α/2),
I(X;Y) = H(Y) − H(Y|X) = H_b(α/2) − α.

Since I(X;Y) is strictly concave in α (why?) and I(X;Y) = 0 when α = 0 and α = 1, the maximum mutual information is obtained for some value of α such that 0 < α < 1. Using elementary calculus, we determine that

d/dα I(X;Y) = (1/2) log((1 − α/2)/(α/2)) − 1,

which is equal to zero for α = 2/5. (It is reasonable that Pr(X = 1) < 1/2, since X = 1 is the noisy input to the channel.) So the capacity of the Z-channel in bits is

C = H_b(1/5) − 2/5 ≈ 0.722 − 0.4 = 0.322.
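
A short numeric sanity check of this optimization (a sketch in Python, not part of the original solution): it maximizes H_b(α/2) − α over a grid of α and reproduces α* ≈ 2/5 and C ≈ 0.322.

import numpy as np

def h_b(p):
    """Binary entropy in bits, with the convention 0*log(0) = 0."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p*np.log2(p) - (1-p)*np.log2(1-p)

alpha = np.linspace(0.0, 1.0, 100001)
mutual_info = h_b(alpha/2) - alpha     # I(X;Y) for the Z-channel with crossover 1/2
best = np.argmax(mutual_info)
print(alpha[best], mutual_info[best])  # ≈ 0.4, ≈ 0.3219 (= log2(5) - 2)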

4. Using two channels at once. Consider two discrete memoryless channels (X_1, p(y_1|x_1), Y_1) and (X_2, p(y_2|x_2), Y_2) with capacities C_1 and C_2 respectively. A new channel (X_1 × X_2, p(y_1|x_1)·p(y_2|x_2), Y_1 × Y_2) is formed in which x_1 ∈ X_1 and x_2 ∈ X_2 are sent simultaneously, resulting in y_1, y_2. Find the capacity of this channel.

Solution: Using two channels at once. To find the capacity of the product channel (X_1 × X_2, p(y_1,y_2|x_1,x_2), Y_1 × Y_2), we have to find the distribution p(x_1,x_2) on the input alphabet X_1 × X_2 that maximizes I(X_1,X_2; Y_1,Y_2). Since the transition probabilities are given as p(y_1,y_2|x_1,x_2) = p(y_1|x_1) p(y_2|x_2),

p(x_1,x_2,y_1,y_2) = p(x_1,x_2) p(y_1,y_2|x_1,x_2) = p(x_1,x_2) p(y_1|x_1) p(y_2|x_2).

Therefore, Y_1 → X_1 → X_2 → Y_2 forms a Markov chain and

I(X_1,X_2; Y_1,Y_2) = H(Y_1,Y_2) − H(Y_1,Y_2|X_1,X_2)
                    = H(Y_1,Y_2) − H(Y_1|X_1,X_2) − H(Y_2|X_1,X_2)     (1)
                    = H(Y_1,Y_2) − H(Y_1|X_1) − H(Y_2|X_2)             (2)
                    ≤ H(Y_1) + H(Y_2) − H(Y_1|X_1) − H(Y_2|X_2)        (3)
                    = I(X_1;Y_1) + I(X_2;Y_2),

where Eqs. (1) and (2) follow from Markovity, and Eq. (3) is met with equality if X_1 and X_2 are independent and hence Y_1 and Y_2 are independent. Therefore,

C = max_{p(x_1,x_2)} I(X_1,X_2; Y_1,Y_2)
  ≤ max_{p(x_1,x_2)} I(X_1;Y_1) + max_{p(x_1,x_2)} I(X_2;Y_2)
  = max_{p(x_1)} I(X_1;Y_1) + max_{p(x_2)} I(X_2;Y_2)
  = C_1 + C_2,

with equality iff p(x_1,x_2) = p*(x_1) p*(x_2), where p*(x_1) and p*(x_2) are the distributions that achieve C_1 and C_2 respectively.
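
As a concrete check of the additivity C = C_1 + C_2 (a sketch, not part of the original solution; it assumes the two component channels are BSCs with crossover probabilities 0.1 and 0.2), one can evaluate I(X_1,X_2;Y_1,Y_2) under independent uniform inputs and compare it with (1 − H_b(0.1)) + (1 − H_b(0.2)):

import numpy as np
from itertools import product

def h_b(p):
    return -p*np.log2(p) - (1-p)*np.log2(1-p)

def mutual_information(p_xy):
    """I(X;Y) in bits from a joint probability table p_xy[x, y]."""
    px = p_xy.sum(axis=1, keepdims=True)
    py = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])).sum())

def bsc(eps):
    return np.array([[1-eps, eps], [eps, 1-eps]])

eps1, eps2 = 0.1, 0.2
W1, W2 = bsc(eps1), bsc(eps2)

# Joint distribution of ((x1,x2),(y1,y2)) under independent uniform inputs,
# which achieve capacity on each BSC separately.
p_joint = np.zeros((4, 4))
for (x1, x2), (y1, y2) in product(product(range(2), repeat=2), repeat=2):
    p_joint[2*x1 + x2, 2*y1 + y2] = 0.25 * W1[x1, y1] * W2[x2, y2]

print(mutual_information(p_joint))           # ≈ 0.8091
print((1 - h_b(eps1)) + (1 - h_b(eps2)))     # ≈ 0.8091, i.e. C_1 + C_2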

5. A channel with two independent looks at Y. Let Y_1 and Y_2 be conditionally independent and conditionally identically distributed given X. Thus p(y_1,y_2|x) = p(y_1|x) p(y_2|x).

(a) Show that I(X;Y_1,Y_2) = 2 I(X;Y_1) − I(Y_1;Y_2).

(b) Conclude that the capacity of the channel X → (Y_1,Y_2) is less than twice the capacity of the channel X → Y_1.

Solution: A channel with two independent looks at Y.

(a) I(X;Y_1,Y_2) = H(Y_1,Y_2) − H(Y_1,Y_2|X)                               (4)
                 = H(Y_1) + H(Y_2) − I(Y_1;Y_2) − H(Y_1|X) − H(Y_2|X)       (5)
                   (since Y_1 and Y_2 are conditionally independent given X) (6)
                 = I(X;Y_1) + I(X;Y_2) − I(Y_1;Y_2)                          (7)
                 = 2 I(X;Y_1) − I(Y_1;Y_2)                                   (8)
                   (since Y_1 and Y_2 are conditionally identically distributed).

(b) The capacity of the single-look channel X → Y_1 is

C_1 = max_{p(x)} I(X;Y_1).                                                   (9)

The capacity of the channel X → (Y_1,Y_2) is

C_2 = max_{p(x)} I(X;Y_1,Y_2)                                               (10)
    = max_{p(x)} [2 I(X;Y_1) − I(Y_1;Y_2)]                                  (11)
    ≤ max_{p(x)} 2 I(X;Y_1)                                                 (12)
    = 2 C_1.                                                                (13)

Hence, two independent looks cannot be more than twice as good as one look.
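
A numeric illustration of identity (8) and the bound in part (b) (a sketch, not from the original solution; it assumes X ~ Bernoulli(1/2) observed through two independent looks of a BSC(0.1)):

import numpy as np

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def mutual_information(p_xy):
    return entropy(p_xy.sum(1)) + entropy(p_xy.sum(0)) - entropy(p_xy.ravel())

eps = 0.1
W = np.array([[1-eps, eps], [eps, 1-eps]])   # BSC(0.1), used for both looks
px = np.array([0.5, 0.5])

# Joint distributions p(x, y1), p(y1, y2) and p(x, (y1, y2)).
p_x_y1 = px[:, None] * W
p_y1_y2 = np.einsum('x,xi,xj->ij', px, W, W)
p_x_y12 = np.einsum('x,xi,xj->xij', px, W, W).reshape(2, 4)

lhs = mutual_information(p_x_y12)                               # I(X; Y1,Y2)
rhs = 2*mutual_information(p_x_y1) - mutual_information(p_y1_y2)
print(lhs, rhs)   # both ≈ 0.7421, and less than twice the single-look value 2*0.531 ≈ 1.062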

6. Choice of channels. Find the capacity C of the union of two channels (X_1, p_1(y_1|x_1), Y_1) and (X_2, p_2(y_2|x_2), Y_2), where, at each time, one can send a symbol over channel 1 or over channel 2 but not both. Assume the output alphabets are distinct and do not intersect.

(a) Show that 2^C = 2^{C_1} + 2^{C_2}.

(b) What is the capacity of the channel shown below?

[Figure: the union of a BSC with crossover probability p (inputs/outputs 1 and 2) and a noiseless channel carrying the single symbol 3.]

Solution: Choice of channels.

(a) Let

θ = { 1, if the signal is sent over channel 1,
      2, if the signal is sent over channel 2.

Consider the following communication scheme: the sender chooses between the two channels according to a Bern(α) coin flip. Then the channel input is X = (θ, X_θ). Since the output alphabets Y_1 and Y_2 are disjoint, θ is a function of Y, i.e., X → Y → θ. Therefore,

I(X;Y) = I(X;Y,θ)
       = I(X_θ, θ; Y, θ)
       = I(θ; Y, θ) + I(X_θ; Y, θ | θ)
       = I(θ; Y, θ) + I(X_θ; Y | θ)
       = H(θ) + α I(X_θ; Y | θ = 1) + (1 − α) I(X_θ; Y | θ = 2)
       = H_b(α) + α I(X_1;Y_1) + (1 − α) I(X_2;Y_2).

Thus, it follows that

C = sup_α { H_b(α) + α C_1 + (1 − α) C_2 },

where the objective is a strictly concave function of α. Hence the maximum exists, and by elementary calculus one can easily show that

C = log_2 (2^{C_1} + 2^{C_2}),

which is attained with α = 2^{C_1}/(2^{C_1} + 2^{C_2}).

If one interprets M = 2^C as the effective number of noise-free symbols, then the above result follows in a rather intuitive manner: we have M_1 = 2^{C_1} noise-free symbols from channel 1 and M_2 = 2^{C_2} noise-free symbols from channel 2. Since at each step we get to choose which channel to use, we essentially have M_1 + M_2 = 2^{C_1} + 2^{C_2} noise-free symbols for the new channel. Therefore, the capacity of this channel is C = log_2 (2^{C_1} + 2^{C_2}). This argument is very similar to the effective alphabet argument given in Problem 9 of the text.

(b) From part (a), the capacity of the channel in the figure is C = log_2 (2^{1−H_b(p)} + 1), since the BSC has capacity 1 − H_b(p) and the noiseless single-symbol channel has capacity 0.
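
A quick numeric confirmation of the closed form (a sketch, not part of the original solution; C_1 and C_2 below are arbitrary example values): maximizing H_b(α) + αC_1 + (1−α)C_2 over a grid of α matches log_2(2^{C_1} + 2^{C_2}), with the maximizer at α = 2^{C_1}/(2^{C_1} + 2^{C_2}).

import numpy as np

def h_b(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p*np.log2(p) - (1-p)*np.log2(1-p)

C1, C2 = 0.7, 0.3                          # example component capacities
alpha = np.linspace(0, 1, 200001)
objective = h_b(alpha) + alpha*C1 + (1 - alpha)*C2

i = np.argmax(objective)
print(objective[i], np.log2(2**C1 + 2**C2))   # both ≈ 1.5140
print(alpha[i], 2**C1 / (2**C1 + 2**C2))      # both ≈ 0.5689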

7. Cascaded BSCs. Consider the two discrete memoryless channels (X, p_1(y|x), Y) and (Y, p_2(z|y), Z). Let p_1(y|x) and p_2(z|y) be binary symmetric channels with crossover probabilities λ_1 and λ_2 respectively.

[Figure: two cascaded BSCs, X → Y with crossover probability λ_1, followed by Y → Z with crossover probability λ_2.]

(a) What is the capacity C_1 of p_1(y|x)?

(b) What is the capacity C_2 of p_2(z|y)?

(c) We now cascade these channels. Thus p_3(z|x) = Σ_y p_1(y|x) p_2(z|y). What is the capacity C_3 of p_3(z|x)? Show that C_3 ≤ min{C_1, C_2}.

(d) Now let us actively intervene between channels 1 and 2, rather than passively transmitting y^n. What is the capacity of channel 1 followed by channel 2 if you are allowed to decode the output y^n of channel 1 and then re-encode it as ỹ^n for transmission over channel 2? (Think W → x^n(W) → y^n → ỹ^n(y^n) → z^n → Ŵ.)

(e) What is the capacity of the cascade in part (c) if the receiver can view both Y and Z?

Solution: Cascaded BSCs.

(a) This is a simple BSC with capacity C_1 = 1 − H_b(λ_1).

(b) Similarly, C_2 = 1 − H_b(λ_2).

(c) The cascade is also a BSC, with transition probability λ_1 * λ_2 = λ_1(1 − λ_2) + (1 − λ_1)λ_2, and thus C_3 = 1 − H_b(λ_1 * λ_2). From the Markov chain X → Y → Z, evaluating the mutual informations at the capacity-achieving (uniform) input, we can see that

C_3 = I(X;Z) ≤ I(X;Y,Z) = I(X;Y) = C_1,

and additionally

C_3 = I(X;Z) ≤ I(X,Y;Z) = I(Y;Z) = C_2,

and thus C_3 ≤ min{C_1, C_2}.

(d) If one can decode and re-encode Y, then we obtain C_3 = min{C_1, C_2}, since only one of the channels will be the bottleneck between X and Z.

(e) From the Markov chain X → Y → Z we can see that C_3 = I(X;Y,Z) = I(X;Y) = C_1.
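
A small numeric check of part (c) (a sketch, not part of the original solution; λ_1 = 0.1 and λ_2 = 0.2 are example values): the cascade behaves as a BSC with crossover λ_1 * λ_2, and its capacity is below both C_1 and C_2.

import numpy as np

def h_b(p):
    return -p*np.log2(p) - (1-p)*np.log2(1-p)

def bsc(eps):
    return np.array([[1-eps, eps], [eps, 1-eps]])

lam1, lam2 = 0.1, 0.2
P3 = bsc(lam1) @ bsc(lam2)               # p3(z|x) = sum_y p1(y|x) p2(z|y)
lam3 = lam1*(1 - lam2) + (1 - lam1)*lam2 # = 0.26
print(np.allclose(P3, bsc(lam3)))        # True: the cascade is BSC(lam1 * lam2)

C1, C2, C3 = 1 - h_b(lam1), 1 - h_b(lam2), 1 - h_b(lam3)
print(C1, C2, C3, C3 <= min(C1, C2))     # ≈ 0.531, 0.278, 0.173, True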

8 and thus C 3 min{c,c } (d) If one can reencode Y then we obtain that C 3 = min{c,c } since only one of the channels will be the bottleneck between X and Z. (e) From the Markov chain X Y Z we can see that 8. Channel capacity C 3 = I(X;Y,Z) = I(X;Y) = C (a) What is the capacity of the following channel (Input) X Y (Output) (b) Provide a simple scheme that can transmit at rate R = log 3 bits through this channel. Solution for Channel capacity (a) We can use the solution of previous home question: C = log ( C + C + C 3 ) Now we need to calculate the capacity of each channel: C = maxi(x;y) = H(Y) H(Y X) = = 8

and similarly for the second noiseless single-symbol sub-channel,

C_2 = max I(X;Y) = H(Y) − H(Y|X) = 0 − 0 = 0.

For the Z-type sub-channel on inputs 2 and 3, write p_2 and p_3 for the input probabilities (p_2 + p_3 = 1). Then

C_3 = max I(X;Y) = max {H(Y) − H(Y|X)}
    = max { −(p_3/2) log(p_3/2) − (p_2 + p_3/2) log(p_2 + p_3/2) − p_3 }.

Assigning p_3 = 1 − p_2 and setting the derivative with respect to p_2 to zero gives p_2 = 3/5 (and so p_3 = 2/5), and as a result

C_3 ≈ 0.32.

And, finally,

C = log_2 (2^0 + 2^0 + 2^{0.32}) ≈ 1.7.

(b) Here is a simple code that achieves rate log 3. Encoding: use the ternary representation of the message and send it using inputs 0, 1, 2 but not 3 (or inputs 0, 1, 3 but not 2) of the channel. Decoding: map the ternary output back into the message.
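
A numeric check of this answer (a sketch, not part of the original solution; it assumes the channel structure described above, i.e. two noiseless symbols plus a Z-type sub-channel with crossover 1/2):

import numpy as np

def h_b(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p*np.log2(p) - (1-p)*np.log2(1-p)

# Capacity of the Z-type sub-channel (inputs 2,3; the noisy input errs with prob 1/2),
# maximized over p3 = Pr(noisy input) on a grid.
p3 = np.linspace(0, 1, 100001)
C3 = np.max(h_b(p3/2) - p3)
print(C3)                             # ≈ 0.3219, attained at p3 = 2/5

# Union-of-channels formula with the two zero-capacity noiseless symbols.
C = np.log2(2**0 + 2**0 + 2**C3)
print(C, np.log2(3))                  # ≈ 1.700 >= log2(3) ≈ 1.585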

9. A channel with a switch. Consider the channel depicted in Fig. 1: there are two channels with conditional probabilities p(y_1|x) and p(y_2|x). These two channels have a common input alphabet X and two disjoint output alphabets Y_1, Y_2 (a symbol that appears in Y_1 cannot appear in Y_2). The position of the switch is determined by a random variable Z which is independent of X, where Pr(Z = 1) = λ.

[Figure 1: The channel. The input X feeds p(y_1|x), observed when Z = 1, and p(y_2|x), observed when Z = 2; the switch selects which output becomes Y.]

(a) Show that

I(X;Y) = λ I(X;Y_1) + (1 − λ) I(X;Y_2).                                     (14)

(b) The capacity of this system is given by C = max_{p(x)} I(X;Y). Show that

C ≤ λ C_1 + (1 − λ) C_2,                                                    (15)

where C_i = max_{p(x)} I(X;Y_i). When is equality achieved?

(c) The sub-channels defined by p(y_1|x) and p(y_2|x) are now given in Figure 2, where p = 1/2. Find the input probability that maximizes I(X;Y). For this case, does the equality C = λ C_1 + (1 − λ) C_2 hold? Explain!

[Figure 2: (a) Channel 1 is a BSC with crossover probability p. (b) Channel 2 is a Z channel with transition probability p.]

Solution: A channel with a switch.

(a) Since the output alphabets Y_1 and Y_2 are disjoint, Z is a deterministic function of Y, so the Markov chain X → Y → Z holds. Additionally, X and Z are independent, and thus we have that

I(X;Y) = I(X;Y,Z)  (since Z is a function of Y)
       = I(X;Z) + I(X;Y|Z) = I(X;Y|Z)  (since X and Z are independent)
       = Pr(Z = 1) I(X;Y|Z = 1) + Pr(Z = 2) I(X;Y|Z = 2)
       = λ I(X;Y|Z = 1) + (1 − λ) I(X;Y|Z = 2)
       = λ I(X;Y_1) + (1 − λ) I(X;Y_2).

(b) Consider the following:

C = max_{p(x)} I(X;Y)
  = max_{p(x)} { λ I(X;Y_1) + (1 − λ) I(X;Y_2) }
  ≤ max_{p(x)} { λ I(X;Y_1) } + max_{p(x)} { (1 − λ) I(X;Y_2) }
  = λ C_1 + (1 − λ) C_2.

Equality is achieved when a single input distribution p(x) simultaneously achieves C_1 and C_2.

(c) Since p = 1/2, the capacity of the first channel (the BSC) is 0; in fact I(X;Y_1) = 0 for every input distribution, so we only need to maximize over the second channel. Thus, we would like to find the α that maximizes

(1 − λ) I(X;Y_2) = (1 − λ)(H_b(α/2) − α H_b(1/2)),

where Pr(X = 1) = α ∈ [0,1]. In this case the solution is α = 0.4, exactly as in Problem 3. We can see that the equality in (15) holds, since the first mutual information term is zero for every input.
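
A numeric check of part (c) (a sketch, not part of the original solution; it assumes channel 1 is BSC(1/2), channel 2 is a Z channel with crossover 1/2, and, for concreteness, λ = 0.3): sweeping α = Pr(X = 1) shows the switch channel's mutual information is maximized at α = 0.4 and equals λ·C_1 + (1 − λ)·C_2.

import numpy as np

def h_b(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p*np.log2(p) - (1-p)*np.log2(1-p)

lam = 0.3                              # Pr(Z = 1), example value
alpha = np.linspace(0, 1, 100001)

I1 = np.zeros_like(alpha)              # BSC(1/2): I(X;Y1) = 0 for every input
I2 = h_b(alpha/2) - alpha              # Z channel with crossover 1/2 (Problem 3)
I_total = lam*I1 + (1 - lam)*I2        # identity (14)

i = np.argmax(I_total)
C1, C2 = 0.0, np.max(I2)
print(alpha[i], I_total[i])            # ≈ 0.4, ≈ 0.2253
print(lam*C1 + (1 - lam)*C2)           # ≈ 0.2253: the bound (15) is met with equality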

10. Channel with state. A discrete memoryless (DM) state-dependent channel with state space S is defined by an input alphabet X, an output alphabet Y and a set of channel transition matrices {p(y|x,s)}_{s∈S}. Namely, for each s ∈ S the transmitter sees a different channel. The capacity of such a channel, where the state is known causally to both encoder and decoder, is given by

C = max_{p(x|s)} I(X;Y|S).                                                  (16)

Let |S| = 3, and let the three different channels (one for each state s ∈ S) be as illustrated in Figure 3. The state process is i.i.d. according to the distribution p(s).

[Figure 3: The three state-dependent channels: for S = 1 an S-channel with parameter ε, for S = 2 a BSC with crossover probability δ, and for S = 3 a Z-channel with parameter ε.]

(a) Find an expression for the capacity of the S-channel (the channel the transmitter sees given S = 1) as a function of ε.

(b) Find an expression for the capacity of the BSC (the channel the transmitter sees given S = 2) as a function of δ.

(c) Find an expression for the capacity of the Z-channel (the channel the transmitter sees given S = 3) as a function of ε.

(d) Find an expression for the capacity of the DM state-dependent channel (using formula (16)) for p(s) = [1/2  1/3  1/6] as a function of ε and δ.

(e) Let us define a conditional probability matrix P_{X|S} for two random variables X and S, with X = {0,1} and S = {1,2,3}, by

[P_{X|S}]_{i,j} = p(X = j | S = i),    i = 1,2,3,  j = 0,1.                 (17)

What is the input conditional probability matrix P_{X|S} that achieves the capacity you have found in (d)?

Solution: Channel with state.

(a) Denote the capacity of the S-channel by

C_S = max_{p(x|s=1)} I(X;Y|S = 1) = max_{p(x|s=1)} [H(Y|S = 1) − H(Y|X,S = 1)].

Assume that, for s = 1, the input X is distributed according to X ~ Bernoulli(α), and let us calculate the entropy terms:

H(Y|X,S = 1) = Σ_{x∈X} p(x) H(Y|X = x, S = 1)
             = α H(Y|X = 1, S = 1) + (1 − α) H(Y|X = 0, S = 1)
             = α H_b(ε) + (1 − α)·0
             = α H_b(ε).

Consider

p(y = 1|s = 1) = p(y = 1, x = 1|s = 1) + p(y = 1, x = 0|s = 1) = α(1 − ε).

Then

H(Y|S = 1) = H_b(α(1 − ε)).

We want to maximize

I(X;Y|S = 1) = H_b(α(1 − ε)) − α H_b(ε).

Taking the derivative with respect to α and finding the root of the derivative gives the capacity.

(b) For the binary symmetric channel (BSC) the capacity is given by C_BSC = 1 − H_b(δ).

(c) The capacities of the Z-channel and the S-channel are equal, because the two channels are equivalent up to switching 0 with 1 and vice versa.

(d) The capacity of the state-dependent channel is

C = max_{p(x|s)} I(X;Y|S)
  = max_{p(x|s)} [ p(s = 1) I(X;Y|S = 1) + p(s = 2) I(X;Y|S = 2) + p(s = 3) I(X;Y|S = 3) ]
  = (1/2) C_S + (1/3) C_BSC + (1/6) C_Z.

(e) The rows of the matrix P_{X|S} are the conditional probability distributions that achieve capacity for each of the sub-channels:

P_{X|S} = [ p(x = 0|s = 1)   p(x = 1|s = 1)
            p(x = 0|s = 2)   p(x = 1|s = 2)
            p(x = 0|s = 3)   p(x = 1|s = 3) ].
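
A numeric evaluation of part (d) (a sketch, not part of the original solution; ε = 0.2 and δ = 0.1 are example values): it maximizes H_b(α(1−ε)) − α·H_b(ε) over α to get C_S = C_Z, and then combines the three capacities with the weights p(s) = [1/2, 1/3, 1/6].

import numpy as np

def h_b(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p*np.log2(p) - (1-p)*np.log2(1-p)

eps, delta = 0.2, 0.1                  # example channel parameters
alpha = np.linspace(0, 1, 100001)

C_S = np.max(h_b(alpha*(1 - eps)) - alpha*h_b(eps))   # part (a), maximized numerically
C_BSC = 1 - h_b(delta)                                # part (b)
C_Z = C_S                                             # part (c): equal by relabeling
C = 0.5*C_S + (1/3)*C_BSC + (1/6)*C_Z                 # part (d)
print(C_S, C_BSC, C)                                  # ≈ 0.618, 0.531, 0.589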

11. Modulo channel

(a) Consider the DMC defined as follows: the output is Y = X ⊕_2 Z, where X, taking values in {0,1}, is the channel input, ⊕_2 is the modulo-2 summation operation, and Z is binary channel noise, uniform over {0,1} and independent of X. What is the capacity of this channel?

(b) Consider the channel of the previous part, but suppose that instead of modulo-2 addition Y = X ⊕_2 Z we perform modulo-3 addition, Y = X ⊕_3 Z. Now what is the capacity?

Solution: Modulo channel.

(a) This is a simple case of a BSC with transition probability p = 1/2, and thus the capacity in this case is C_BSC = 1 − H_b(1/2) = 0.

(b) In this case we can model the channel as a BEC, as studied in class. The input is binary and the output is ternary: Y = 0 reveals X = 0, Y = 2 reveals X = 1, while Y = 1 can arise from either input and acts as an erasure. The erasure probability is p = 1/2, and thus the capacity of this channel is C_BEC = 1 − p = 1/2.
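
A short check of part (b) (a sketch, not part of the original solution): building the transition matrix of Y = X ⊕_3 Z with Z uniform on {0,1} and evaluating I(X;Y) at the uniform input gives 1/2 bit, the BEC(1/2) capacity.

import numpy as np

def mutual_information(p_xy):
    px = p_xy.sum(axis=1, keepdims=True)
    py = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])).sum())

# p(y | x) for Y = (X + Z) mod 3, with X in {0,1} and Z uniform on {0,1}.
W = np.zeros((2, 3))
for x in range(2):
    for z in range(2):
        W[x, (x + z) % 3] += 0.5

px = np.array([0.5, 0.5])                    # uniform input achieves BEC capacity
print(W)                                     # [[0.5, 0.5, 0. ], [0. , 0.5, 0.5]]
print(mutual_information(px[:, None] * W))   # 0.5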

12. Cascaded BSCs. Given is a cascade of k identical and independent binary symmetric channels, each with crossover probability α.

(a) In the case where no encoding or decoding is allowed at the intermediate terminals, what is the capacity of this cascaded channel as a function of k and α?

(b) Now assume that encoding and decoding are allowed at the intermediate points. What is the capacity as a function of k and α?

(c) What is the capacity of each of the above settings in the case where the number of cascaded channels, k, goes to infinity?

Solution: Cascaded BSCs.

(a) Cascading the BSCs results in a new BSC with a new parameter β. Therefore, the capacity is C_a = 1 − H_b(β), and the parameter β can be found as follows:

β = Σ_{1 ≤ i ≤ k, i odd} (k choose i) α^i (1 − α)^{k−i}
  = (1/2) [ Σ_{i=0}^{k} (k choose i) α^i (1 − α)^{k−i} − Σ_{i=0}^{k} (k choose i) (−α)^i (1 − α)^{k−i} ]
  = (1/2) (1 − (1 − 2α)^k).

Answers leaving β as the initial sum over odd indices, or as the binary convolution of k identical parameters α, have been accepted as well.

(b) We have seen in a previous homework question that, in the case of encoding and decoding at the intermediate points, the capacity of the cascaded channel equals C_b = min_i {C_i}. Since all channels are identical with capacity 1 − H_b(α), we have that C_b = 1 − H_b(α).

(c) In (a), β → 1/2 as k → ∞ (for 0 < α < 1), so C_a → 0. For (b), the number of cascaded channels does not change the capacity, which remains C_b = 1 − H_b(α).
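
A numeric check of part (a) (a sketch, not part of the original solution; α = 0.1 is an example value): the k-fold matrix product of BSC(α) matches BSC(β) with β = (1 − (1 − 2α)^k)/2, and the capacity 1 − H_b(β) decays to zero as k grows.

import numpy as np

def h_b(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p*np.log2(p) - (1-p)*np.log2(1-p)

def bsc(eps):
    return np.array([[1-eps, eps], [eps, 1-eps]])

alpha = 0.1
for k in (1, 2, 5, 20, 100):
    cascade = np.linalg.matrix_power(bsc(alpha), k)   # k BSCs in series
    beta = (1 - (1 - 2*alpha)**k) / 2                 # closed form from part (a)
    assert np.allclose(cascade, bsc(beta))
    print(k, beta, 1 - h_b(beta))                     # capacity C_a shrinks toward 0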
