
1 Lecture 5: Channel Capacity Copyright G. Caire (Sample Lectures) 122

2 Definitions and Problem Setup

Block diagram: Message $M$ → Encoder → $X^n$ → Channel $P_{Y|X}$ → $Y^n$ → Decoder → Estimate $\hat{M}$.

Definition 11. Discrete Memoryless Channel (DMC): a (stationary) DMC $(\mathcal{X}, P_{Y|X}, \mathcal{Y})$ consists of an input alphabet $\mathcal{X}$, an output alphabet $\mathcal{Y}$ and a transition pmf $P_{Y|X}$ such that
$$P(Y^n = y \mid X^n = x) = \prod_{i=1}^{n} P_{Y|X}(y_i \mid x_i)$$
The memoryless property implies that
$$P\big(Y_i = y_i \mid M = m,\; X^i = (x_1(m),\dots,x_i(m)),\; Y^{i-1} = (y_1,\dots,y_{i-1})\big) = P_{Y|X}(y_i \mid x_i(m))$$

Copyright G. Caire (Sample Lectures) 123

3 Channel Coding

Definition 12. A block code $\mathcal{C}$ with rate $R$ and block length $n$ (an $(R, n)$-code) consists of
1. A message set $\mathcal{M} = [1 : 2^{nR}] = \{1, \dots, 2^{nR}\}$.
2. A codebook $\{x(1), \dots, x(2^{nR})\}$, i.e., an array of dimension $2^{nR} \times n$ over $\mathcal{X}$, each row of which is a codeword.
3. An encoding function $f : \mathcal{M} \to \mathcal{X}^n$, such that $f(m) = x(m)$ for $m \in \mathcal{M}$.
4. A decoding function $g : \mathcal{Y}^n \to \mathcal{M}$, such that $\hat{m} = g(y)$ is the decoded message.

Copyright G. Caire (Sample Lectures) 124

4 Probability of Error

Definition 13. Individual message probability of error: the conditional probability of error given that message $m$ is transmitted is
$$P_{e,m}(\mathcal{C}) = P(g(Y^n) \neq m \mid X^n = x(m))$$
Definition 14. Maximal probability of error:
$$P_{e,\max}(\mathcal{C}) = \max_{m \in \mathcal{M}} P_{e,m}(\mathcal{C})$$
Definition 15. Average probability of error:
$$P_e(\mathcal{C}) = 2^{-nR} \sum_{m=1}^{2^{nR}} P_{e,m}(\mathcal{C})$$

Copyright G. Caire (Sample Lectures) 125

5 Achievable Rates and Capacity

Definition 16. Achievable rate: A rate $R$ is said to be achievable if there exists a sequence of $(R, n)$-codes $\{\mathcal{C}_n\}$ with probability of error $P_{e,\max}(\mathcal{C}_n) \to 0$ as $n \to \infty$.

Definition 17. Channel capacity: The channel capacity $C$ is the supremum of all achievable rates.

The above is an operational definition of capacity. A coding theorem (in information theory) consists of finding a formula, i.e., an explicit expression, for $C$ in terms of the characteristics of the problem, i.e., in terms of $P_{Y|X}$.

Copyright G. Caire (Sample Lectures) 126

6 Role of Mutual Information

When the input $X^n$ is i.i.d. $\sim P_X$, then $(X^n, Y^n)$ are i.i.d. with $(X_i, Y_i) \sim P_X P_{Y|X}$, and $Y^n$ has the induced marginal distribution $Y_i \sim P_Y$.

There are $\approx 2^{nH(Y)}$ typical output sequences. If the input is typical, the probability of a non-typical output is negligible.

For $\epsilon > \epsilon' > 0$ and $x \in T_{\epsilon'}^{(n)}(X)$, there are $\approx 2^{nH(Y|X)}$ typical outputs in $T_{\epsilon}^{(n)}(Y \mid x)$.

How many non-overlapping typical output sets can we pack in $T_{\epsilon}^{(n)}(Y)$? Roughly
$$\frac{2^{nH(Y)}}{2^{nH(Y|X)}} = 2^{n(H(Y) - H(Y|X))} = 2^{nI(X;Y)}$$

Copyright G. Caire (Sample Lectures) 127
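
To make the counting argument concrete, here is a minimal numerical sketch (the helper function and the BSC example are my own illustration, not from the slides) that computes $I(X;Y)$ from an input pmf and a transition matrix and reports the resulting packing exponent $nI(X;Y)$:

```python
import numpy as np

def mutual_information(p_x, P):
    """I(X;Y) in bits for input pmf p_x and transition matrix P[x, y] = P(y|x)."""
    p_xy = p_x[:, None] * P                 # joint pmf
    p_y = p_xy.sum(axis=0)                  # output marginal
    mask = p_xy > 0
    denom = p_x[:, None] * p_y[None, :]     # product of marginals
    return float((p_xy[mask] * np.log2(p_xy[mask] / denom[mask])).sum())

# Illustrative example: BSC with crossover probability 0.1 and uniform input.
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])
p_x = np.array([0.5, 0.5])
I = mutual_information(p_x, P)
n = 1000
print(f"I(X;Y) = {I:.4f} bits per use")
print(f"Packing exponent for n = {n}: about 2^{n*I:.0f} distinguishable codewords")
```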

7 The Channel Coding Theorem

Theorem 11. Channel Coding Theorem: The capacity of the DMC $(\mathcal{X}, P_{Y|X}, \mathcal{Y})$ is given by
$$C = \max_{P_X} I(X; Y)$$
Example: Capacity of the BSC: A BSC is defined by $Y = X \oplus Z$, where $\mathcal{X} = \mathcal{Y} = \{0, 1\}$, addition is modulo-2, and $Z \sim$ Bernoulli-$p$. We have
$$C = \max_{P_X} I(X; Y) = \max_{P_X} \{H(Y) - H(Y|X)\} = \max_{P_X} \{H(Y) - H(X \oplus Z \mid X)\} = \max_{P_X} H(Y) - H_2(p) = 1 - H_2(p)$$

Copyright G. Caire (Sample Lectures) 128
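
A quick numerical check of the closed form $C = 1 - H_2(p)$; the helper `h2` and the crossover values below are illustrative choices of mine:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.0, 0.05, 0.11, 0.5):
    print(f"p = {p:.2f}  ->  C = 1 - H2(p) = {1 - h2(p):.4f} bits per channel use")
```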

8 Capacity of the BEC

Example: Capacity of the BEC: a BEC with erasure probability $e$ has inputs $\{0, 1\}$ and outputs $\{0, \mathrm{e}, 1\}$; each input is received correctly with probability $1 - e$ and is replaced by the erasure symbol with probability $e$. We have
$$C = \max_{P_X} I(X; Y) = \max_{P_X} \{H(X) - H(X|Y)\} = \max_{P_X} \{H(X) - eH(X)\} = \max_{P_X} (1 - e) H(X) = 1 - e$$
where the last equality holds by choosing $X$ to be Bernoulli-$1/2$.

Copyright G. Caire (Sample Lectures) 129

9 Symmetric channels

Strongly symmetric channels: the transition matrix $P$ with elements $P_{r,s} = P_{Y|X}(y = s \mid x = r)$ has the property that every row is a permutation of the first row, and each column is a permutation of the first column.

Weakly symmetric channels: every row of $P$ is a permutation of the first row.

For strongly symmetric channels:
$$C = \log |\mathcal{Y}| - H(P_{1,1}, \dots, P_{1,|\mathcal{Y}|})$$
achieved by $X$ Uniform on $\mathcal{X}$.

For weakly symmetric channels:
$$C = \max_{P_X} H(Y) - H(P_{1,1}, \dots, P_{1,|\mathcal{Y}|})$$

Copyright G. Caire (Sample Lectures) 130
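
For a strongly symmetric channel the capacity can be read directly off the transition matrix. A small sketch, using a hypothetical 3-ary symmetric channel of my own as the example:

```python
import numpy as np

def entropy(p):
    """Entropy in bits of a pmf given as a 1-D array."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Hypothetical 3-ary symmetric channel: correct symbol with prob. 0.8,
# each of the two wrong symbols with prob. 0.1.
P = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])
C = np.log2(P.shape[1]) - entropy(P[0])   # log|Y| - H(first row)
print(f"C = {C:.4f} bits per channel use (achieved by a uniform input)")
```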

10 Additive-noise channels

A discrete memoryless additive-noise channel is defined by $\mathcal{X} = \mathcal{Y} = \mathbb{F}_q$ (or, more generally, some additive group). $P_{Y|X}$ is induced by the random mapping
$$Y_i = x_i + Z_i$$
It follows that $P_{Y|X}(y \mid x) = P_Z(y - x)$.

Hence, additive-noise channels over $\mathbb{F}_q$ are always strongly symmetric, and have capacity $C = \log q - H(Z)$.

Copyright G. Caire (Sample Lectures) 131
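
As a sanity check (my own illustration, with an arbitrary noise pmf on $\mathbb{F}_5$), one can build $P_{r,s} = P_Z(s - r \bmod q)$, verify the strong symmetry, and evaluate $C = \log q - H(Z)$:

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

q = 5
P_Z = np.array([0.6, 0.2, 0.1, 0.05, 0.05])     # hypothetical noise pmf on F_5
P = np.array([[P_Z[(s - r) % q] for s in range(q)] for r in range(q)])

# Every row and every column is a permutation of the noise pmf (strong symmetry).
assert all(sorted(P[r]) == sorted(P[0]) for r in range(q))
assert all(sorted(P[:, s]) == sorted(P[:, 0]) for s in range(q))

print(f"C = log q - H(Z) = {np.log2(q) - entropy(P_Z):.4f} bits per channel use")
```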

11 Computing capacity: convex maximization

Maximization of the mutual information is a convex optimization problem:
$$\text{maximize} \quad I(p, P) = \sum_r \sum_s p_r P_{r,s} \log \frac{P_{r,s}}{\sum_{r'} p_{r'} P_{r',s}}$$
$$\text{subject to} \quad \sum_r p_r = 1, \qquad 0 \le p_r \le 1 \;\; \forall\, r$$
Recall that we have proved that the mutual information $I(p, P)$, seen as a function of the input probability vector $p$ for fixed transition matrix $P$, is a concave function.

Copyright G. Caire (Sample Lectures) 132
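
The slide poses capacity computation as a concave maximization but does not name a specific algorithm; a standard choice for solving it numerically is the Blahut-Arimoto alternating maximization, sketched below under that assumption (the function name, iteration count, and tolerance are mine):

```python
import numpy as np

def blahut_arimoto(P, iters=2000, tol=1e-12):
    """Approximate C = max_p I(p, P) for a transition matrix P[x, y] = P(y|x).
    Returns (capacity in bits, capacity-achieving input pmf)."""
    m = P.shape[0]
    p = np.full(m, 1.0 / m)                      # start from the uniform input
    for _ in range(iters):
        q = p[:, None] * P                       # posterior q(x|y), unnormalized
        q /= q.sum(axis=0, keepdims=True)
        with np.errstate(divide="ignore", invalid="ignore"):
            logq = np.where(q > 0, np.log(q), 0.0)
        p_new = np.exp((P * logq).sum(axis=1))   # p(x) ∝ exp(sum_y P(y|x) log q(x|y))
        p_new /= p_new.sum()
        if np.abs(p_new - p).max() < tol:
            p = p_new
            break
        p = p_new
    # Evaluate I(p, P) at the final input distribution.
    p_y = p @ P
    ratio = np.where(P > 0, P / p_y[None, :], 1.0)
    C = float((p[:, None] * P * np.log2(ratio)).sum())
    return C, p

# Sanity check on a BSC(0.1): the result should be close to 1 - H2(0.1) ≈ 0.531.
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])
C, p_opt = blahut_arimoto(P)
print(f"C ≈ {C:.4f} bits per channel use, optimal input ≈ {np.round(p_opt, 4)}")
```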

12 Proof of the Channel Coding Theorem (1)

Direct Part (achievability): we wish to prove that for any $R < C$ there exists a sequence of $(R, n)$-codes with vanishing error probability.

Random coding: instead of building a specific family of codes (very difficult), we average over a random ensemble of codes. Fix $P_X$ and generate a $2^{nR} \times n$ codebook at random with i.i.d. entries $\sim P_X$. The codebook (with the natural-ordering encoding function) is revealed to transmitter and receiver before the communication takes place.

Encoding: $x(m)$ is the $m$-th row of the generated codebook.

Decoding: Joint Typicality Decoding. Let $y$ denote the observed channel output. Then $g(y) = \hat{m} \in \mathcal{M}$ if this is the unique index such that $(x(\hat{m}), y) \in T_{\epsilon}^{(n)}(X, Y)$; declare an error otherwise.

Copyright G. Caire (Sample Lectures) 133

13 Proof of the Channel Coding Theorem (2)

Analysis of the probability of error, averaged over the random ensemble of codebooks (by the symmetry of the random codebook construction, $\sum_{\mathcal{C}} P(\mathcal{C}) P_{e,m}(\mathcal{C})$ does not depend on $m$, so it can be replaced by the term for $m = 1$):
$$P_e^{(n)} = \sum_{\mathcal{C}} P(\mathcal{C}) P_e^{(n)}(\mathcal{C}) = \sum_{\mathcal{C}} P(\mathcal{C})\, 2^{-nR} \sum_{m=1}^{2^{nR}} P_{e,m}(\mathcal{C}) = 2^{-nR} \sum_{m=1}^{2^{nR}} \sum_{\mathcal{C}} P(\mathcal{C}) P_{e,m}(\mathcal{C}) = 2^{-nR} \sum_{m=1}^{2^{nR}} \sum_{\mathcal{C}} P(\mathcal{C}) P_{e,1}(\mathcal{C}) = \sum_{\mathcal{C}} P(\mathcal{C}) P_{e,1}(\mathcal{C}) = P(g(Y^n) \neq 1 \mid M = 1)$$

Copyright G. Caire (Sample Lectures) 134

14 Proof of the Channel Coding Theorem (3)

We let $E = \{g(Y^n) \neq 1\}$ denote the conditional error event, and notice that $E \subseteq E_1 \cup E_2$, where
$$E_1 = \{(X^n(1), Y^n) \notin T_{\epsilon}^{(n)}(X, Y)\}$$
and
$$E_2 = \{(X^n(m), Y^n) \in T_{\epsilon}^{(n)}(X, Y) \text{ for some } m \neq 1\}$$
By the Union Bound we have
$$P(g(Y^n) \neq 1 \mid M = 1) = P(E \mid M = 1) \le P(E_1 \cup E_2 \mid M = 1) \le P(E_1 \mid M = 1) + P(E_2 \mid M = 1)$$

Copyright G. Caire (Sample Lectures) 135

15 Proof of the Channel Coding Theorem (4)

For the first term, notice that since $(X^n(1), Y^n)$ is jointly distributed according to $\prod_{i=1}^{n} P_X(x_i) P_{Y|X}(y_i \mid x_i)$, by the LLN
$$P(E_1 \mid M = 1) \to 0, \quad \text{as } n \to \infty$$
For the second term, for $m \neq 1$ we have that $X^n(m)$ and $Y^n$ are distributed as the product of marginals $\prod_{i=1}^{n} P_X(x_i) P_Y(y_i)$, therefore
$$P(E_2 \mid M = 1) \le 2^{nR}\, 2^{-n(I(X;Y) - \delta(\epsilon))}$$
by the Packing Lemma.

It follows that for any $\epsilon > 0$ and sufficiently large $n$,
$$P_e^{(n)} \le \epsilon + 2^{-n(I(X;Y) - R - \delta(\epsilon))} \le 2\epsilon, \quad \text{for } R < I(X;Y) - \delta(\epsilon)$$

Copyright G. Caire (Sample Lectures) 136
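
To see how quickly the second term of this bound dies out, here is a small numeric illustration; the crossover probability, rate, and $\delta(\epsilon)$ value are arbitrary choices of mine:

```python
import math

def h2(p):
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p, R, delta = 0.1, 0.40, 0.02          # BSC(0.1): I(X;Y) at uniform input ≈ 0.531
I = 1 - h2(p)
for n in (100, 500, 1000, 5000):
    print(f"n = {n:5d}: 2^(-n(I - R - delta)) = {2 ** (-n * (I - R - delta)):.3e}")
```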

16 Proof of the Channel Coding Theorem (5)

Consequences:
1. For any $n$ there exists at least one code that performs no worse than the ensemble average;
2. We can choose $P_X$ in order to maximize $I(X; Y)$;
3. $\delta(\epsilon)$ vanishes by considering smaller and smaller $\epsilon$.

From average to maximal error probability (expurgation). Fix $\epsilon > 0$, and let $\mathcal{C}_n^{\star}$ be a code with $P_e(\mathcal{C}_n^{\star}) \le \epsilon$ and rate $R > C - \delta(\epsilon)$. Sort the codewords such that
$$P_{e,1}(\mathcal{C}_n^{\star}) \le P_{e,2}(\mathcal{C}_n^{\star}) \le \dots \le P_{e,2^{nR}}(\mathcal{C}_n^{\star})$$

Copyright G. Caire (Sample Lectures) 137

17 Proof of the Channel Coding Theorem (6)

Define the expurgated code $\tilde{\mathcal{C}}_n^{\star} = \{x(1), \dots, x(2^{nR-1})\}$ (the best half of the codewords). It follows that
$$P_{e,\max}(\tilde{\mathcal{C}}_n^{\star}) = P_{e,2^{nR-1}}(\mathcal{C}_n^{\star}) \le 2\epsilon$$
(otherwise the discarded half alone would make the average error probability of $\mathcal{C}_n^{\star}$ exceed $\epsilon$).

END OF THE DIRECT PART

Copyright G. Caire (Sample Lectures) 138

18 Proof of the Channel Coding Theorem (7)

Proof of the converse part: there exist no codes with rate $R > C$ and arbitrarily small error probability. It is more convenient to prove the following equivalent statement: suppose that a sequence of $(R, n)$-codes $\{\mathcal{C}_n\}$ exists, such that $P_e(\mathcal{C}_n) = P_e^{(n)} \to 0$. Then, it must be $R \le C$.

We start from Fano's Inequality: consider the joint $n$-letter distribution induced by the message $M$ Uniform on $\mathcal{M}$, by the encoding function $f$ and by the channel $P_{Y|X}$. Then...

Copyright G. Caire (Sample Lectures) 139

19 Proof of the Channel Coding Theorem (8)

$$\begin{aligned}
nR &= H(M) \\
&= H(M) - H(M \mid Y^n) + H(M \mid Y^n) \\
&\le I(M; Y^n) + 1 + n P_e^{(n)} R \\
&= \sum_{i=1}^{n} I(M; Y_i \mid Y^{i-1}) + n \epsilon_n \\
&\le \sum_{i=1}^{n} I(M, Y^{i-1}; Y_i) + n \epsilon_n \\
&= \sum_{i=1}^{n} I(M, Y^{i-1}, X_i; Y_i) + n \epsilon_n \\
&= \sum_{i=1}^{n} I(X_i; Y_i) + n \epsilon_n \\
&\le nC + n \epsilon_n
\end{aligned}$$
where $n \epsilon_n = 1 + n P_e^{(n)} R$, so that $\epsilon_n \to 0$ as $P_e^{(n)} \to 0$.

END OF THE CONVERSE PART

Copyright G. Caire (Sample Lectures) 140

20 Feedback Capacity (1)

With feedback, at each time $i$ the encoder observes the past channel outputs $Y^{i-1}$. Block diagram: Message $M$ (together with $Y^{i-1}$) → Encoder → $X_i$ → Channel $P_{Y|X}$ → $Y_i$ → Decoder → Estimate $\hat{M}$.

An $(R, n)$-code with feedback is defined by a sequence of encoding functions
$$f_i : \mathcal{M} \times \mathcal{Y}^{i-1} \to \mathcal{X}, \qquad x_i = f_i(m, y_1, \dots, y_{i-1}), \qquad i = 1, \dots, n$$

Copyright G. Caire (Sample Lectures) 141

21 Feedback Capacity (2)

This model, referred to as the Shannon feedback channel, can be seen as an idealization of several protocols implemented today (e.g., ARQ, incremental redundancy, power control, rate allocation in wireless channels).

Theorem 12. The feedback capacity of a discrete memoryless channel is given by
$$C_{\mathrm{fb}} = C = \max_{P_X} I(X; Y)$$
Proof: The converse for the channel without feedback holds verbatim for the case with feedback (check!). The achievability also holds, since any code without feedback is in particular a code with feedback.

Copyright G. Caire (Sample Lectures) 142

22 Feedback Capacity (3)

Memoryless channels: feedback may greatly simplify operations and achieve a much better behavior of the error probability versus $n$, at fixed rate $R < C$. Example: the BEC with Automatic Repeat reQuest (ARQ).

Channels with memory: feedback may achieve a higher capacity.

Multiuser networks: feedback may achieve a (much) larger capacity.

Copyright G. Caire (Sample Lectures) 143

23 Source-Channel Separation Theorem (1)

We wish to transmit a stationary ergodic information source $\{V_i\}$ over the finite alphabet $\mathcal{V}$ through a discrete memoryless channel $(\mathcal{X}, P_{Y|X}, \mathcal{Y})$.

We fix the compression ratio $\rho = n/k$, in terms of channel uses per source symbol.

A joint source-channel code for this setup is defined by an encoding function
$$\phi : \mathcal{V}^k \to \mathcal{X}^n$$
and by a decoding function
$$\psi : \mathcal{Y}^n \to \mathcal{V}^k$$

Copyright G. Caire (Sample Lectures) 144

24 Source-Channel Separation Theorem (2)

The error probability is defined by
$$P_e^{(k,n)} = P\big(V^k \neq \psi(Y^n)\big), \qquad \text{where } X^n = \phi(V^k)$$
We say that the source is transmissible over the channel with compression ratio $\rho$ if there exists a sequence of codes for $k \to \infty$ and $n = \rho k$ such that $P_e^{(k,n)} \to 0$.

Copyright G. Caire (Sample Lectures) 145

25 Source-Channel Separation Theorem (3)

Theorem 13. Source-channel coding: A discrete memoryless source $\{V_i\}$ with $V_i \in \mathcal{V}$ is transmissible over the discrete memoryless channel $(\mathcal{X}, P_{Y|X}, \mathcal{Y})$ with compression ratio $\rho$ if
$$H(V) < \rho C$$
Conversely, if $H(V) > \rho C$, the source is not transmissible over the channel.

Copyright G. Caire (Sample Lectures) 146
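
A worked number for the condition $H(V) < \rho C$ (purely illustrative values of my own): a Bernoulli-0.2 source has $H(V) \approx 0.72$ bits/symbol, a BSC(0.1) has $C \approx 0.53$ bits/use, so reliable transmission needs roughly $\rho > H(V)/C \approx 1.36$ channel uses per source symbol:

```python
import math

def h2(p):
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

H_V = h2(0.2)            # entropy of a Bernoulli(0.2) source
C = 1 - h2(0.1)          # capacity of a BSC(0.1)
print(f"H(V) = {H_V:.3f} bits/symbol, C = {C:.3f} bits/use, "
      f"minimum compression ratio rho = {H_V / C:.3f} channel uses per source symbol")
```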

26 Source-Channel Separation Theorem (4)

Proof of Achievability: Separation approach: we concatenate an almost lossless source code with a channel code.

Almost lossless source code: if $V^k \in T_{\epsilon}^{(k)}(V)$, encode it with $k(H(V) + \delta)$ bits; otherwise, declare an error.

Choose a sequence of capacity-achieving codes $\mathcal{C}_n^{\star}$ of rate $R > C - \delta$ such that $nR \ge k(H(V) + \delta)$.

It follows that an error probability not larger than $2\epsilon$ can be achieved if
$$\rho (C - \delta) \ge H(V) + \delta$$
If $H(V) < \rho C$, we can find $\delta$ small enough and $k$ large enough such that the above conditions can be satisfied.

Copyright G. Caire (Sample Lectures) 147

27 Source-Channel Separation Theorem (5)

Proof of Converse: We let $\hat{V}^k = \psi(Y^n)$ denote the decoder output; then Fano's inequality yields
$$H(V^k \mid \hat{V}^k) \le 1 + P_e^{(k,n)}\, k \log |\mathcal{V}|$$
Assume that there exists a sequence of source-channel codes for $k \to \infty$ and $n = \rho k$ such that $P_e^{(k,n)} \to 0$. Then...

Copyright G. Caire (Sample Lectures) 148

28 Source-Channel Separation Theorem (6)

$$\begin{aligned}
H(V) &= \frac{1}{k} H(V^k) \\
&= \frac{1}{k} I(V^k; \hat{V}^k) + \frac{1}{k} H(V^k \mid \hat{V}^k) \\
&\le \frac{1}{k} I(V^k; \hat{V}^k) + \frac{1}{k} + P_e^{(k,n)} \log |\mathcal{V}| \\
&\le \frac{1}{k} I(X^n; Y^n) + \epsilon_k \\
&\le \frac{n}{k}\, C + \epsilon_k = \rho C + \epsilon_k
\end{aligned}$$
where $\epsilon_k = \frac{1}{k} + P_e^{(k,n)} \log |\mathcal{V}| \to 0$, and the step from $I(V^k; \hat{V}^k)$ to $I(X^n; Y^n)$ uses the data processing inequality for the chain $V^k \to X^n \to Y^n \to \hat{V}^k$. From this we conclude that if such a sequence of codes exists, then $H(V) \le \rho C$.

Copyright G. Caire (Sample Lectures) 149

29 Capacity-Cost Function (1)

In certain problems it is meaningful to associate a cost function with the channel input. Let $b : \mathcal{X} \to \mathbb{R}_+$; the per-letter cost of an input sequence $x$ is defined as
$$b(x) = \frac{1}{n} \sum_{i=1}^{n} b(x_i)$$
Example: Hamming weight cost: for $\mathcal{X} = \mathbb{F}_q$, $b(x) = 1\{x \neq 0\}$.

Example: Quadratic cost (related to transmit power): for $\mathcal{X} \subset \mathbb{R}$, $b(x) = x^2$.

Copyright G. Caire (Sample Lectures) 150

30 Capacity-Cost Function (2)

Theorem 14. Capacity-Cost Function: The capacity-cost function of the DMC $(\mathcal{X}, P_{Y|X}, \mathcal{Y})$ with input cost function $b : \mathcal{X} \to \mathbb{R}_+$ is given by
$$C(B) = \max_{P_X : E[b(X)] \le B} I(X; Y)$$
Proof of Achievability (Sketch): For $\delta' > 0$, choose an input distribution $P_X$ such that $E[b(X)] \le B - \delta'$. Use $P_X$ through the random coding argument. Define an additional encoding error as follows: if the selected codeword $x(m)$ violates the input cost, i.e., if $\frac{1}{n} \sum_{i=1}^{n} b(x_i(m)) > B$, then declare an error.

Copyright G. Caire (Sample Lectures) 151

31 Capacity-Cost Function (3)

Include this error event in the union bound, and use the typical average lemma: if $x(m) \in T_{\epsilon}^{(n)}(X)$, then
$$(1 - \epsilon)(B - \delta') \le \frac{1}{n} \sum_{i=1}^{n} b(x_i(m)) \le (1 + \epsilon)(B - \delta')$$
Choose $\epsilon$ and $\delta'$ such that $(1 + \epsilon)(B - \delta') < B$, and conclude that the probability of encoding error can be made smaller than $\epsilon$ for sufficiently large $n$.

Copyright G. Caire (Sample Lectures) 152

32 Capacity-Cost Function (4)

Proof of Converse (Sketch): Assume that there exists a sequence of codes $\{\mathcal{C}_n\}$ with rate $R$, such that $P_e^{(n)} \to 0$, and such that
$$2^{-nR} \sum_{m=1}^{2^{nR}} \frac{1}{n} \sum_{i=1}^{n} b(x_i(m)) \le B$$
(notice that here we consider a relaxed version of the input constraint, which holds on average over all codewords and not for each individual codeword).

Define the function
$$C(\beta) = \max_{P_X : E[b(X)] \le \beta} I(X; Y)$$
Notice that $C(\beta)$ is non-decreasing and concave in $\beta$; in fact,
$$\lambda C(\beta_1) + (1 - \lambda) C(\beta_2) \le C(\lambda \beta_1 + (1 - \lambda) \beta_2)$$

Copyright G. Caire (Sample Lectures) 153

33 Capacity-Cost Function (5)

For the $n$-letter distribution induced by using the codewords of $\mathcal{C}_n$ with uniform probability over the message $M$, using Fano's inequality as before, we obtain
$$\begin{aligned}
nR &\le I(M; Y^n) + 1 + n P_e^{(n)} R \\
&\le \sum_{i=1}^{n} I(X_i; Y_i) + n \epsilon_n \\
&\le \sum_{i=1}^{n} C(E[b(X_i)]) + n \epsilon_n \\
&\le n\, C\!\left(\frac{1}{n} \sum_{i=1}^{n} E[b(X_i)]\right) + n \epsilon_n \\
&\le n\, C(B) + n \epsilon_n
\end{aligned}$$
where the last two inequalities use the concavity and the monotonicity of $C(\cdot)$, together with the average cost constraint.

Copyright G. Caire (Sample Lectures) 154

34 Capacity-Cost Function (6)

Example: Capacity of the BSC with a Hamming weight input constraint: we can write
$$I(X; Y) = H(Y) - H(Y|X) = H(Y) - H_2(p)$$
Hence, we have to maximize $H(Y)$ subject to $E[1\{X = 1\}] \le B$. Assume $P_X(1) = \alpha \in [0, 1]$; then
$$P_Y(0) = (1 - \alpha)(1 - p) + \alpha p, \qquad P_Y(1) = (1 - \alpha) p + \alpha (1 - p)$$
We use the compact notation $\alpha * p$ to indicate the cyclic convolution of the probability vectors. Then
$$H(Y) = H(\alpha * p) = H_2\big((1 - \alpha) p + \alpha (1 - p)\big)$$
It can be checked that this is monotonically increasing for $\alpha \in [0, 1/2]$ and decreasing for $\alpha \in [1/2, 1]$. Hence
$$C(B) = \begin{cases} H_2(B * p) - H_2(p), & \text{for } 0 \le B \le \tfrac{1}{2} \\ 1 - H_2(p), & \text{for } \tfrac{1}{2} < B \le 1 \end{cases}$$

Copyright G. Caire (Sample Lectures) 155
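
The closed form for $C(B)$ is easy to tabulate; a short sketch with illustrative values (the helper functions are mine):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity_cost(B, p):
    """C(B) for a BSC(p) with Hamming-weight constraint E[1{X=1}] <= B."""
    a = min(B, 0.5)                              # optimal input probability P_X(1)
    return h2((1 - a) * p + a * (1 - p)) - h2(p)

p = 0.1
for B in (0.05, 0.1, 0.2, 0.5, 0.8):
    print(f"B = {B:.2f}: C(B) = {bsc_capacity_cost(B, p):.4f} bits per channel use")
```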

35 End of Lecture 5 Copyright G. Caire (Sample Lectures) 156
