ELEC546 Review of Information Theory


1 ELEC546 Review of Information Theory (Vincent Lau)

2 Review of Information Theory

Entropy: a measure of the uncertainty of a random variable X.
- If X is a discrete random variable with alphabet X and probability mass function p(x), its entropy is H(X) = -Σ_x p(x) log p(x) = -E[log p(X)].
- If X is a continuous random variable with p.d.f. f(x), its (differential) entropy is h(X) = -∫ f(x) log f(x) dx.
- Joint entropy: H(X_1, X_2) = -Σ_{x_1,x_2} p(x_1, x_2) log p(x_1, x_2).
- Conditional entropy: H(X_2 | X_1) = -Σ_{x_1,x_2} p(x_1, x_2) log p(x_2 | x_1) = -E_{X_1,X_2}[log p(X_2 | X_1)].
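
A minimal numerical sketch of these definitions (not part of the original slides), assuming a small hypothetical joint pmf: it computes H(X_1), H(X_1, X_2) and, via the chain rule introduced on the next slide, H(X_2 | X_1) in bits.

```python
import numpy as np

def entropy(p):
    """Entropy in bits of a pmf given as an array; 0*log(0) is treated as 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

p_joint = np.array([[0.30, 0.20],    # hypothetical joint pmf p(x1, x2)
                    [0.10, 0.40]])

H_joint = entropy(p_joint)                 # H(X1, X2)
H_x1    = entropy(p_joint.sum(axis=1))     # H(X1) from the marginal of X1
H_cond  = H_joint - H_x1                   # H(X2 | X1) via the chain rule
print(H_x1, H_joint, H_cond)
```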

3 Entropy

Properties of entropy:
- Lower bound: H(X) ≥ 0.
- Upper bound, discrete X: H(X) ≤ log|X|, with equality iff p(x) = 1/|X|.
- Upper bound, continuous X: h(X) ≤ (1/2) log(2πe σ_X²), with equality iff X ~ N(μ, σ_X²).
- Chain rule: H(X_1, X_2) = H(X_1) + H(X_2 | X_1).
- Conditioning reduces entropy: H(X | Y) ≤ H(X), with equality iff X and Y are independent.
- Concavity of entropy: H(X) is a concave function of p(x).
- If X and Y are independent, then H(X + Y) ≥ H(X).
- Fano's inequality: given two random variables X and Y, let X̂ = g(Y) be an estimate of X given Y, and define the probability of error P_e = Pr{X̂ ≠ X}. Then H(P_e) + P_e log(|X| - 1) ≥ H(X | Y).

4 Mutual Information

Consider two random variables X and Y with joint pmf p(x, y). The mutual information between X and Y is
I(X; Y) = Σ_{x,y} p(x, y) log [ p(x, y) / (p(x) p(y)) ] = E_{X,Y}[ log ( p(X, Y) / (p(X) p(Y)) ) ],
and equivalently I(X; Y) = H(X) - H(X | Y).
Hence, mutual information is the reduction in uncertainty of X due to knowledge of Y; for a channel X → Y, mutual information represents the amount of information communicated to the receiver.
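
As a quick sanity check (an illustrative sketch, not from the slides, using a hypothetical joint pmf), the following computes I(X;Y) directly from the definition and verifies the identity I(X;Y) = H(X) - H(X|Y):

```python
import numpy as np

p_xy = np.array([[0.30, 0.20],      # hypothetical joint pmf p(x, y)
                 [0.10, 0.40]])
p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

mask = p_xy > 0
I = float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / np.outer(p_x, p_y)[mask])))

H_x  = -np.sum(p_x[p_x > 0] * np.log2(p_x[p_x > 0]))
H_y  = -np.sum(p_y[p_y > 0] * np.log2(p_y[p_y > 0]))
H_xy = -np.sum(p_xy[mask] * np.log2(p_xy[mask]))
H_x_given_y = H_xy - H_y            # chain rule: H(X|Y) = H(X,Y) - H(Y)

print(I, H_x - H_x_given_y)         # the two values agree
```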

5 Mutual Information

Conditional mutual information: I(X; Y | Z) = H(X | Z) - H(X | Y, Z).
Properties of mutual information:
- Symmetry: I(X; Y) = I(Y; X).
- Self-information: I(X; X) = H(X).
- I(X; Y) = H(X) + H(Y) - H(X, Y).
- Lower bound: I(X; Y) ≥ 0, with equality iff X and Y are independent.

6 Mutual Information

Properties of mutual information:
- Chain rule of information: I(X_1, X_2, ..., X_N; Y) = Σ_{n=1}^{N} I(X_n; Y | X_1, ..., X_{n-1}).
- Let (X, Y) ~ p(x, y) = p(x) p(y|x). The mutual information I(X;Y) is a concave function of p(x) for a given p(y|x), and a convex function of p(y|x) for a given p(x).

7 Mutual Information

Properties of mutual information:
- Data processing inequality: X, Y, Z are said to form a Markov chain {X → Y → Z} if p(x, y, z) = p(x) p(y|x) p(z|y). If {X → Y → Z}, then I(X; Y) ≥ I(X; Z), with equality iff X → Z → Y.
- Corollary: a function of the data Y cannot increase the information about X. If Z = g(Y), then X → Y → g(Y) and I(X; Y) ≥ I(X; g(Y)).
- If X → Y → Z, then I(X; Y | Z) ≤ I(X; Y): the dependency between X and Y is decreased by the observation of a downstream random variable Z.

8 Asymptotic Equipartition

Law of large numbers: if X_1, ..., X_N are i.i.d. random variables, then (1/N) Σ_{n=1}^{N} X_n → E[X] in probability.
AEP theorem: if X_1, ..., X_N are i.i.d. ~ p(x), then -(1/N) log p(X_1, ..., X_N) → H(X) in probability.
Typical set: A_ε^(N) = { (x_1, ..., x_N) ∈ X^N : 2^{-N(H(X)+ε)} ≤ p(x_1, ..., x_N) ≤ 2^{-N(H(X)-ε)} }.

9 Asymptotic Equipartition

Properties of the AEP:
- If (x_1, ..., x_N) ∈ A_ε^(N), then H(X) - ε ≤ -(1/N) log p(x_1, ..., x_N) ≤ H(X) + ε.
- Pr{ (X_1, ..., X_N) ∈ A_ε^(N) } > 1 - ε for sufficiently large N.
- |A_ε^(N)| ≤ 2^{N(H(X)+ε)}.
- |A_ε^(N)| ≥ (1 - ε) 2^{N(H(X)-ε)} for sufficiently large N.
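
The AEP is easy to observe numerically. The sketch below (illustrative only, assuming a Bernoulli(0.2) source) draws one length-N realization and shows -(1/N) log2 p(X_1, ..., X_N) concentrating around H(X) ≈ 0.722 bits as N grows:

```python
import numpy as np

rng = np.random.default_rng(0)

p1 = 0.2                                   # assumed P(X = 1) for the example
H = -(p1 * np.log2(p1) + (1 - p1) * np.log2(1 - p1))   # source entropy H(X)

for N in (100, 1_000, 10_000):
    x = rng.random(N) < p1                 # one i.i.d. realization of length N
    log_p = np.where(x, np.log2(p1), np.log2(1 - p1)).sum()   # log2 p(x1,...,xN)
    print(N, -log_p / N, H)                # sample value vs. H(X)
```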

10 Jointly Typical Sequences

Definition:
A_ε^(N) = { (x^N, y^N) ∈ X^N × Y^N : | -(1/N) log p(x^N) - H(X) | < ε, | -(1/N) log p(y^N) - H(Y) | < ε, | -(1/N) log p(x^N, y^N) - H(X, Y) | < ε }.
Theorem (Joint AEP): let (X^N, Y^N) be sequences of length N drawn i.i.d. according to p(x^N, y^N) = Π_{n=1}^{N} p(x_n, y_n). Then:
- Pr{ (X^N, Y^N) ∈ A_ε^(N) } > 1 - ε for sufficiently large N.
- |A_ε^(N)| ≤ 2^{N(H(X,Y)+ε)}.
- If (X̃^N, Ỹ^N) ~ p(x^N) p(y^N), i.e., independent sequences with the same marginals, then Pr{ (X̃^N, Ỹ^N) ∈ A_ε^(N) } ≤ 2^{-N(I(X;Y)-3ε)}, and ≥ (1 - ε) 2^{-N(I(X;Y)+3ε)} for sufficiently large N.

11 Jointly Typical Sequences

Properties:
- There are about 2^{NH(X)} typical X sequences and about 2^{NH(Y)} typical Y sequences.
- However, since there are only about 2^{NH(X,Y)} jointly typical sequences, not all pairs of a typical X and a typical Y are also jointly typical.
- The probability that a randomly chosen pair is jointly typical is about 2^{-N I(X;Y)}.

12 Channel Coding Theorem

- Channel encoder: a mapping from the message set to a transmitted sequence. The set of codewords {X^N(1), ..., X^N(M)} is called a codebook. Code rate R = (log_2 M) / N.
- Generic channel: a probability mapping from X^N to Y^N, given by p(y^N | x^N).
- Channel decoder: a deterministic decoding function from Y^N to a message index.
- Error probability: Pr{ g(Y^N) ≠ m | m is transmitted }.

13 Channel Coding Theorem

Discrete memoryless channel: p(Y^N | X^N) = Π_{n=1}^{N} p(Y_n | X_n).
Definition of channel capacity: C = max_{p(X)} I(X; Y).
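
The maximization over input distributions can be carried out numerically with the Blahut-Arimoto algorithm (a standard technique, not covered on the slide). A minimal sketch, assuming a DMC described by a hypothetical transition matrix W[x, y] = p(y|x):

```python
import numpy as np

def blahut_arimoto(W, tol=1e-9, max_iter=10_000):
    """Approximate C = max_{p(x)} I(X;Y) for a DMC with W[x, y] = p(y|x).
    Returns (capacity in bits per channel use, optimizing input pmf)."""
    nx, _ = W.shape
    p = np.full(nx, 1.0 / nx)                  # start from the uniform input pmf
    for _ in range(max_iter):
        q = p @ W                              # induced output distribution q(y)
        ratio = np.divide(W, q[None, :], out=np.ones_like(W), where=W > 0)
        D = np.sum(W * np.log2(ratio), axis=1) # D[x] = KL(p(y|x) || q(y)) in bits
        p_new = p * np.exp2(D)                 # multiplicative Blahut-Arimoto update
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    q = p @ W
    ratio = np.divide(W, q[None, :], out=np.ones_like(W), where=W > 0)
    D = np.sum(W * np.log2(ratio), axis=1)
    return float(p @ D), p

# Sanity check: a BSC with crossover 0.1 should give C = 1 - H(0.1) ~ 0.531 bits.
W_bsc = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
print(blahut_arimoto(W_bsc))
```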

14 Examples of Channel Capacity

Binary symmetric channel:
- X ~ binary input {0, 1}; Y ~ binary output {0, 1}.
- Transition probabilities: p(y|x) = [ 1-p, p ; p, 1-p ], so the error probability p is independent of the transmitted input bit.
- Channel capacity: C = max_{p(x)} I(X; Y) = 1 + p log_2 p + (1-p) log_2(1-p), achieved at p(0) = p(1) = 0.5.
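
A small numerical check of this formula (a sketch, not part of the slides), evaluating C = 1 - H(p) for a few crossover probabilities:

```python
import numpy as np

def bsc_capacity(p):
    """BSC capacity in bits per channel use: C = 1 + p*log2(p) + (1-p)*log2(1-p)."""
    if p in (0.0, 1.0):
        return 1.0
    return 1.0 + p * np.log2(p) + (1 - p) * np.log2(1 - p)

for p in (0.0, 0.1, 0.5):
    print(p, bsc_capacity(p))   # 1.0, ~0.531, 0.0
```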

15 Examples of Channel Capacity

Discrete input, continuous output channels:
- Input X ~ discrete alphabet; channel output Y ~ continuous (unquantized).
- The channel is characterized by the transition density f(y^N | x^N).
- Example (binary input, continuous output, AWGN): y_n = x_n + z_n, where z_n is zero-mean white Gaussian noise with variance σ_z², and f(y^N | x^N) = Π_n f(y_n | x_n) with f(y_n | 0) ~ N(A, σ_z²) and f(y_n | 1) ~ N(-A, σ_z²).
- The mutual information is maximized when p(0) = p(1) = 0.5, and the channel capacity is
C = 0.5 ∫ f(y | A) log_2 [ f(y | A) / p(y) ] dy + 0.5 ∫ f(y | -A) log_2 [ f(y | -A) / p(y) ] dy.
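
The capacity integral above has no closed form, but it can be evaluated numerically. A sketch (assuming SciPy is available and using illustrative values A = 1, σ_z = 1 that are not from the slides):

```python
import numpy as np
from scipy.integrate import quad

def bi_awgn_capacity(A, sigma_z):
    """Capacity (bits/channel use) of the binary-input AWGN channel with
    x in {+A, -A}, p(0) = p(1) = 0.5, evaluated by numerical integration."""
    def f(y, mean):                        # conditional density f(y | x = mean)
        return np.exp(-(y - mean) ** 2 / (2 * sigma_z ** 2)) / np.sqrt(2 * np.pi * sigma_z ** 2)

    def integrand(y):
        p_y = 0.5 * f(y, A) + 0.5 * f(y, -A)
        out = 0.0
        for mean in (A, -A):               # the two equally likely inputs
            fy = f(y, mean)
            if fy > 0:
                out += 0.5 * fy * np.log2(fy / p_y)
        return out

    val, _ = quad(integrand, -10 * sigma_z - A, 10 * sigma_z + A)
    return val

print(bi_awgn_capacity(A=1.0, sigma_z=1.0))   # about 0.49 bits per channel use at 0 dB
```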

16-17 Examples of Channel Capacity: Discrete Input, Continuous Output Channels (figure slides)

18 Examples of Channel Capacity

Continuous input, continuous output channels:
- Both the input symbol and the output symbol are continuous random variables (infinitely dense constellation).
- Example (AWGN channel): y = x + z, z ~ N(0, σ_z²), so f(y | x) ~ N(x, σ_z²).
- I(X; Y) = H(Y) - H(Y | X) = H(Y) - H(Z). Since H(Z) is independent of p(x), the mutual information is maximized when H(Y) is maximized, i.e., Y is Gaussian, i.e., X is Gaussian.
- Channel capacity: C = max_{p(x)} I(X; Y) = 0.5 log_2(1 + σ_x² / σ_z²), where E[X²] = σ_x².
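
A quick worked evaluation of this formula (illustrative only, over a few assumed SNR values):

```python
import numpy as np

# C = 0.5 * log2(1 + sigma_x^2 / sigma_z^2) for the real discrete-time AWGN channel.
for snr_db in (0, 10, 20, 30):
    snr = 10 ** (snr_db / 10)              # sigma_x^2 / sigma_z^2
    C = 0.5 * np.log2(1 + snr)
    print(snr_db, "dB ->", round(C, 3), "bits per channel use")
```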

19 Channel Capacity for Continuous Time AWGN Channel (figure slide)

20-23 Bandwidth Efficiency (figure slides)

24 System Performance

Spectral efficiency vs. power efficiency:
- The various code performance curves are assumed to operate at BER ~ 10^{-3} to 10^{-6}.
- High bandwidth efficiency ~ M-ary modulation.
- High bandwidth expansion ~ orthogonal modulation, CDMA.

25 Channel Capacity of Fading Channels

Flat fading channel: y_n = h_n x_n + z_n, with h_n ~ CN(0, 0.5) and z_n ~ CN(0, σ_z²).
Memoryless fading channel: p(y^N | x^N, h^N) = Π_{n=1}^{N} p(y_n | x_n, h_n). The encoding frame spans multiple fading coefficients, and the fading coefficients are i.i.d. between symbols.
Average power constraint: the average transmitted power across a coding frame is constrained (short-term average).
Ergodic channel capacity: when the transmission time (over a coding frame) is long enough (>> coherence time), the long-term ergodic property of the fading process is revealed, and a finite average channel capacity is achievable.

26 Channel Capacity of Fading Channels

Fast fading channels (ergodic capacity):
- Capacity is a deterministic number; zero packet error is possible if R < capacity.
- Cases: capacity with CSIR; capacity with CSIT; capacity with CSIR & CSIT; capacity with no CSIR and no CSIT.
Slow fading channels (outage capacity):
- Capacity is itself a random variable; zero packet error cannot be guaranteed.
- Cases: capacity with CSIR; capacity with CSIT; capacity with CSIR & CSIT; capacity with no CSIR and no CSIT.

27 Channel Capacity of Fading Channels

Ergodic channel capacity (fast fading), Case 1: perfect CSIR only.
Channel capacity:
C_CSIR = max_{p(X)} I(X; Y, H) = max_{p(X)} [ I(X; Y | H) + I(X; H) ] = max_{p(X)} I(X; Y | H) = max_{p(X)} E_H[ I(X; Y | H = h) ].
Example (Rayleigh fading):
C = max_{p(X)} E_H[ I(X; Y | H = h) ] = ∫_0^∞ log_2(1 + σ_x² γ / σ_z²) f(γ) dγ, where γ = |h|².
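
A Monte Carlo sketch of this expectation (illustrative only; it assumes E[|h|²] normalized to 1 so that γ is exponentially distributed, and an assumed average SNR of 10):

```python
import numpy as np

rng = np.random.default_rng(1)

snr = 10.0                                            # sigma_x^2 / sigma_z^2 (assumed)
gamma = rng.exponential(scale=1.0, size=1_000_000)    # gamma = |h|^2 for Rayleigh fading

C_csir = np.mean(np.log2(1 + gamma * snr))            # ergodic capacity with CSIR only
C_awgn = np.log2(1 + snr)                             # unfaded channel at the same SNR

# By Jensen's inequality the fading average is below the unfaded capacity.
print(C_csir, "<", C_awgn)
```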

28 Channel Capacity of Fading Channels

Case 3: perfect CSIR & CSIT (fast fading).
The channel capacity is given by C_{CSIT,CSIR} = E_h[ max_{p(x|h)} I(X; Y | H = h) ].
Example (temporal power water-filling): for a Rayleigh fading channel with perfect CSIT & CSIR, the capacity-achieving distribution p(x|h) is complex Gaussian with power σ_x²(h). The optimal strategy is to adapt the power over the temporal domain: C = ∫ C(γ) f(γ) dγ, where γ = |h|².

29 Example: Temporal Power Water-filling

The optimization problem: choose the optimal transmit power allocation σ_x²(γ) to maximize the channel capacity under an average transmit power constraint (averaged over one encoding frame):
max_{σ_x²(γ)} L(λ, σ_x²) = max_{σ_x²(γ)} [ log_2(1 + γ σ_x²(γ) / σ_z²) - λ σ_x²(γ) ].
Setting ∂L/∂σ_x² = 0 gives
σ_x²(γ) = 1/λ - σ_z²/γ for γ ≥ λ σ_z², and 0 otherwise,
C = ∫ C(γ) f(γ) dγ, where C(γ) = log_2( γ / (λ σ_z²) ) for γ ≥ λ σ_z², and 0 otherwise.
The solution is temporal power water-filling: when the channel fading is poor, reduce or shut down the transmit power; when the channel fading is good, increase the transmit power.
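
A Monte Carlo sketch of temporal water-filling under assumed parameters (γ = |h|² ~ Exp(1), σ_z² = 1, average power budget P_avg = 1; none of these values come from the slides). The water level 1/λ is found by bisection so that E[σ_x²(γ)] meets the power budget, and the result is compared against constant-power transmission:

```python
import numpy as np

rng = np.random.default_rng(2)

gamma = rng.exponential(scale=1.0, size=1_000_000)   # fading power samples
sigma_z2 = 1.0
P_avg = 1.0

def avg_power(level):
    """Average allocated power E[(level - sigma_z^2/gamma)^+] for water level = 1/lambda."""
    return np.mean(np.maximum(level - sigma_z2 / gamma, 0.0))

lo, hi = 1e-6, 1e6
for _ in range(100):                                  # bisection on the water level
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if avg_power(mid) < P_avg else (lo, mid)
level = 0.5 * (lo + hi)

power  = np.maximum(level - sigma_z2 / gamma, 0.0)    # sigma_x^2(gamma), water-filling
C_wf   = np.mean(np.log2(1 + gamma * power / sigma_z2))
C_csir = np.mean(np.log2(1 + gamma * P_avg / sigma_z2))   # constant-power baseline

print(C_wf, ">=", C_csir)
```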

30 Channel Capacity of Slow Fading Channels

Outage capacity: the ergodic assumption is not always valid (e.g., over the entire encoding frame, the fading process is non-ergodic).
Example (quasi-static fading channel): the channel fading coefficient is constant within an encoding frame.
- For the case with perfect CSIR only, the instantaneous channel capacity is a function of the channel fading, so the instantaneous channel capacity is itself a random variable.
- There is no guarantee of error-free transmission of the coded frame, so the ergodic capacity is zero and there may be no classical Shannon meaning attached to the capacity. In other words, there is a finite probability that the actual transmission rate, no matter how small it is, exceeds the instantaneous mutual information.
- The effective capacity is quoted as the outage capacity, i.e., a capacity together with its c.d.f.

31 Outage Capacity

Example: Rayleigh fading with perfect CSIR.
The instantaneous capacity is a random variable given by C(γ) = log_2(1 + γ σ_x² / σ_z²).
The outage probability is P_out(r) = Pr{ C(γ) ≤ r } = 1 - exp( -(2^r - 1) σ_z² / σ_x² ).
P_out(r) = 0 only at r = 0: only the zero rate is compatible with zero outage, thus eliminating any reliable communication in Shannon's sense.
Taking retransmission into account, the average goodput (the average number of packets successfully delivered to the receiver) is ρ = r Pr{ r ≤ C(γ) }.
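
A short worked evaluation of the outage probability and goodput formulas (illustrative only; the average SNR of 10 and the attempted rates are assumed values, not from the slides):

```python
import numpy as np

snr = 10.0                                   # sigma_x^2 / sigma_z^2 (assumed)
for r in (0.5, 1.0, 2.0, 4.0):               # attempted rates, bits per channel use
    p_out   = 1.0 - np.exp(-(2 ** r - 1) / snr)   # P_out(r) for Rayleigh fading, CSIR only
    goodput = r * (1.0 - p_out)                   # rho = r * Pr{ r <= C(gamma) }
    print(r, round(p_out, 3), round(goodput, 3))
```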

32 The Role of CSIT

- In fast fading channels, CSIT allows power adaptation, which increases the ergodic capacity: C_{CSIT,CSIR} ≥ C_{CSIR}.
- In slow fading channels, CSIT allows power + rate adaptation, which achieves zero outage probability: P_outage(CSIT, CSIR) = 0. 100% reliability of packet transmission is possible even in slow fading with CSIT.

33 Summary of Main Points

Entropy:
- Measures the degree of randomness of a random variable.
- Physical meaning (by the AEP theorem): for a sequence of i.i.d. source symbols (X_1, ..., X_N), the size of the typical set is ~ 2^{NH(X)}, so the number of bits required to encode the source is ~ H(X) bits per symbol for large N.
Mutual information:
- Measures the reduction of the entropy of X by observation of Y.
- Physical meaning (by the channel coding theorem): the maximum number of information bits per channel use that can be transmitted over a channel with arbitrarily low error probability is C = max_{p(x)} I(X; Y).

34 Summary of Main Points

Channel capacity:
- AWGN channel, discrete-time model: C_AWGN = log_2(1 + σ_x² / σ_z²) bits per channel use.
- AWGN channel, continuous-time model: C_AWGN = W log_2(1 + P_av / (W η_0)) bits/sec.
- Fast flat fading, CSIR only: C_{fast fading, CSIR} = E[ log_2(1 + |h|² σ_x² / σ_z²) ] bits per channel use.
- Fast flat fading, CSIT & CSIR: C_{fast fading, CSIT, CSIR} = E[ ( log_2( |h|² / (λ σ_z²) ) )^+ ] bits per channel use (temporal water-filling).
- Slow flat fading, CSIR only: packet outage. With CSIR and CSIT, power adaptation + rate adaptation at the transmitter gives no packet outage while achieving the ergodic capacity.

35 Appendix - Advanced Topics

36 Appendix A: Proof of Shannon's Coding Theorem

37 Random Coding Achievability Proof

Fix p(x) and generate 2^{NR} independent codewords of length N at random according to the distribution p(x^N) = Π_{n=1}^{N} p(x_n). We then have the random codebook given by the matrix
Ω = [ x_1(1) ... x_N(1) ; ... ; x_1(2^{NR}) ... x_N(2^{NR}) ].
The code is then revealed to both the transmitter and the receiver.
A message W is chosen according to a uniform distribution: Pr{W = w} = 2^{-NR}, w ∈ {1, 2, ..., 2^{NR}}. The w-th codeword is sent out of the transmitter.
The receiver receives a sequence Y^N according to the distribution P(y^N | x^N(w)) = Π_{n=1}^{N} p(y_n | x_n(w)).

38 Random Coding Achievability Proof

The receiver guesses which message was transmitted based on typical-set decoding: the receiver declares that Ŵ was transmitted if
- (X^N(Ŵ), Y^N) is jointly typical, and
- there is no other index k such that (X^N(k), Y^N) ∈ A_ε^(N).
If no such Ŵ exists, or if there is more than one, then an error is declared. There is a decoding error if Ŵ ≠ W.

39 Random Coding Achievability Proof

We shall calculate the error probability averaged over all possible codebooks:
P_e^(N) = Σ_Ω P(Ω) P_e^(N)(Ω) = Σ_Ω P(Ω) (1/2^{NR}) Σ_w λ_w(Ω) = (1/2^{NR}) Σ_w Σ_Ω P(Ω) λ_w(Ω) = Σ_Ω P(Ω) λ_1(Ω) = Pr(E | W = 1),
where λ_w(Ω) = Pr{ Ŵ ≠ w | X^N(w) is transmitted }.
By the symmetry of the code construction, the average probability of error averaged over all codes does not depend on the particular index sent. Without loss of generality, assume message W = 1 is transmitted.
Define the events E_i = { (X^N(i), Y^N) ∈ A_ε^(N) }, i ∈ {1, ..., 2^{NR}}: the i-th codeword and Y^N are jointly typical.

40 Random Coding Achievability Proof

An error occurs for the event E_1^c ∪ E_2 ∪ ... ∪ E_{2^{NR}}: either the transmitted codeword and the received sequence are not jointly typical, or a wrong codeword is jointly typical with the received sequence.
Based on the union bound, we have P_e = Pr{ E_1^c ∪ E_2 ∪ ... ∪ E_{2^{NR}} } ≤ P(E_1^c) + Σ_{i=2}^{2^{NR}} P(E_i).
By the joint AEP theorem, P(E_1^c) ≤ ε for sufficiently large N. Since, by the random code generation, X^N(1) and X^N(i) are independent (and so are Y^N and X^N(i) for i ≠ 1), the joint AEP theorem gives P(E_i) ≤ 2^{-N(I(X;Y) - 3ε)}.
Hence, the average error probability is bounded by
P_e ≤ ε + Σ_{i=2}^{2^{NR}} 2^{-N(I(X;Y) - 3ε)} ≤ ε + 2^{3Nε} 2^{-N(I(X;Y) - R)}.

41 Random Coding Achievability Proof

Hence, if R < I(X; Y) - 3ε, we can choose ε and N such that P_e ≤ 2ε.
Since the average error probability is averaged over all codebooks, there exists at least one codebook Ω* with a small average probability of error, P_e(Ω*) ≤ 2ε. This proves that there exists an achievable code with rate R < I(X;Y) for arbitrarily large N.
Although the theorem shows that good codes exist, it does not provide a way of constructing the best codes. We could generate a good code by randomly generating the codewords in a codebook; however, without any structure in the codewords, it is very difficult to decode at large N.

42 Converse Proof

We have to prove that any sequence of (2^{NR}, N) codes with P_e^(N) → 0 must have R ≤ C.
Fano's inequality: H(W | Y^N) ≤ 1 + P_e^(N) NR, where P_e^(N) = Pr{ g(Y^N) ≠ W }.
Let Y^N be the result of passing X^N through a discrete memoryless channel. Then I(X^N; Y^N) ≤ NC for all p(x^N).
Let W be the message index drawn according to a uniform distribution over {1, 2, ..., 2^{NR}}. Hence we have
NR = H(W) = H(W | Y^N) + I(W; Y^N) ≤ H(W | Y^N) + I(X^N(W); Y^N) ≤ 1 + P_e^(N) NR + I(X^N(W); Y^N) ≤ 1 + P_e^(N) NR + NC,
so that R ≤ P_e^(N) R + 1/N + C, i.e., P_e^(N) ≥ 1 - C/R - 1/(NR).
Hence, if R > C, the error probability is bounded away from 0.

43 Appendix B: Proof of Continuous Time AWGN Capacity

44 Examples of Channel Capacity

Continuous-time waveform channels: the channel input and output are continuous-time random signals (instead of discrete-time symbols): y(t) = x(t) + z(t).
Vector representation of continuous-time signals:
- For any random signal x(t) with finite energy ∫_0^{T_s} x(t)² dt < ∞, we have lim_{N→∞} E[ | x(t) - Σ_{n=1}^{N} x_n φ_n(t) |² ] = 0, where {φ_n(t)} is a set of orthonormal basis functions and x_n = <x(t), φ_n(t)> = ∫_0^{T_s} x(t) φ_n*(t) dt.
- For a bandlimited signal x(t) with bandwidth W and duration T, where WT >> 1, the signal-space dimension approaches 2WT.
Converting the continuous-time AWGN channel into vector representation (over a dimension of 2WT), we have x(t) → x = (x_1, ..., x_{2WT}), z'(t) → z' = (z_1, ..., z_{2WT}), and y = x + z'.
Note that z(t) requires a higher-dimensional signal space; yet y represents a sufficient statistic on x, and the noise components in the noise vector z' are Gaussian i.i.d. with E[z_n²] = η_0 / 2, so the total noise energy is E[ ||z'||² ] = (2WT)(η_0 / 2) = η_0 W T.

45 Examples of Channel Capacity

Continuous-time waveform channel:
- The channel transition probability is p(y | x) = Π_{n=1}^{2WT} p(y_n | x_n), with p(y_n | x_n) = (1/√(π η_0)) exp( -(y_n - x_n)² / η_0 ).
- Treating the signal vector x as one super-symbol, the asymptotic channel capacity (per unit time, in bits per second) is C = lim_{T→∞} (1/T) max_{p(x)} I(X; Y), where max_{p(x)} I(X; Y) = Σ_{n=1}^{2WT} max_{p(x_n)} I(X_n; Y_n) = W T log_2(1 + 2σ_x² / η_0).
- Since the average transmitted power is P_av = (1/T) ∫_0^T x(t)² dt = (2WT σ_x²) / T = 2W σ_x², we have σ_x² = P_av / (2W), and the channel capacity (bits per second) is C = W log_2(1 + P_av / (W η_0)).
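
A short worked evaluation of this formula (illustrative only; the values of P_av and η_0 are assumed), showing how C = W log_2(1 + P_av/(W η_0)) grows with bandwidth and approaches the limit P_av / (η_0 ln 2) as W → ∞:

```python
import numpy as np

P_av, eta0 = 1.0, 1e-3                       # assumed power (W) and noise PSD (W/Hz)
for W in (1e2, 1e3, 1e4, 1e5):
    C = W * np.log2(1 + P_av / (W * eta0))   # continuous-time AWGN capacity, bits/s
    print(f"W = {W:>8.0f} Hz  ->  C = {C:10.1f} bits/s")

print("wideband limit:", P_av / (eta0 * np.log(2)), "bits/s")
```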
