On the Applications of the Minimum Mean p-th Error to Information Theoretic Quantities


1 On the Applications of the Minimum Mean p-th Error to Information Theoretic Quantities
Alex Dytso, Ronit Bustin, Daniela Tuninetti, Natasha Devroye, H. Vincent Poor, and Shlomo Shamai (Shitz)

2 Outline
Notation and Past Work
Minimum Mean p-th Error (MMPE)
Properties of the MMPE
Conditional MMPE
Applications to Information Theory Problems

3 Notation
X ∈ ℝⁿ, (1/n) Tr(E[X Xᵀ]) ≤ 1, Y_snr = √snr X + Z,
I_n(X, snr) = (1/n) I(X; Y_snr),
mmse(X|Y_snr) = mmse(X, snr) := (1/n) Tr(E[cov(X|Y_snr)]).
I-MMSE relationship:
d/dsnr I_n(X, snr) = mmse(X, snr)/2,  i.e.,  I_n(X, snr) = (1/2) ∫₀^snr mmse(X, t) dt.
D. Guo, S. Shamai, and S. Verdú, "Mutual information and minimum mean-square error in Gaussian channels," IEEE Trans. Inf. Theory, vol. 51, no. 4, Apr. 2005.
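The I-MMSE relation above can be sanity-checked numerically in the one case where both sides are in closed form: a scalar unit-variance Gaussian input, for which I = (1/2) ln(1 + snr) nats and mmse = 1/(1 + snr). A minimal sketch (the helper names are illustrative, not from the talk):

```python
import math

# Scalar Gaussian input X ~ N(0, 1): I(X; Y_snr) = (1/2) ln(1 + snr) nats
# and mmse(X, snr) = 1/(1 + snr).  The I-MMSE relation says
# dI/dsnr = mmse(X, snr)/2; check it with a central finite difference.

def mutual_info(snr: float) -> float:
    return 0.5 * math.log(1.0 + snr)

def mmse(snr: float) -> float:
    return 1.0 / (1.0 + snr)

snr, h = 2.0, 1e-6
deriv = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)
assert abs(deriv - mmse(snr) / 2) < 1e-8

# Integral form: I_n(X, snr) = (1/2) * integral_0^snr mmse(X, t) dt.
steps = 100_000
integral = sum(mmse(i * snr / steps) for i in range(steps)) * (snr / steps)
assert abs(0.5 * integral - mutual_info(snr)) < 1e-3
```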

4 Ozarow-Wyner Bound
(Ozarow-Wyner Bound.) Let X_D be a discrete random variable. Then
H(X_D) − gap ≤ I(X_D; Y) ≤ H(X_D), where
gap = (1/2) log(πe/6) + (1/2) log(1 + 12 lmmse(X_D|Y)/d²_min(X_D)),
lmmse(X_D|Y) = E[X²_D]/(1 + snr E[X²_D]),
d_min(X_D) = inf_{x_i, x_j ∈ supp(X_D): i ≠ j} |x_i − x_j|.
L. Ozarow and A. Wyner, "On the capacity of the Gaussian channel with a finite number of input levels," IEEE Trans. Inf. Theory, vol. 36, no. 6, Nov. 1990.
Take X_D to be PAM with N = ⌊√(1 + snr)⌋ points; then (1/2) log(1 + snr) − I(X_D; Y) ≤ gap ≈ 0.75 bit.

5 Ozarow-Wyner Bound
Can this be improved?

6 Ozarow-Wyner Bound
Improvement: replace the linear MMSE by the true MMSE in the gap term,
gap = (1/2) log(πe/6) + (1/2) log(1 + 12 mmse(X_D|Y)/d²_min(X_D)),
mmse(X_D|Y) = E[(X_D − E[X_D|Y])²].
A. Dytso, D. Tuninetti, and N. Devroye, "Interference as noise: Friend or foe?" IEEE Trans. Inf. Theory, vol. 62, no. 6, 2016.

7 Ozarow-Wyner Bound
This is a genuine improvement since mmse(X_D|Y) ≤ lmmse(X_D|Y), and the two decay at very different rates:
lmmse(X_D|Y) = O(1/snr),  mmse(X_D|Y) = O(e^{−snr d²_min/4}).

8 Ozarow-Wyner Bound
For the PAM example:
(1/2) log(1 + snr) − I(X_D; Y) ≤ 0.75 bit (original bound),
(1/2) log(1 + snr) − I(X_D; Y) ≤ 0.65 bit (improved bound).

9 Ozarow-Wyner Bound
Can we do better?
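The 0.75-bit figure for PAM can be reproduced by evaluating the gap formula directly. A sketch, assuming unit-power PAM (so d²_min = 12/(N² − 1)) and the slide's choice N = ⌊√(1 + snr)⌋; the sample snr values are my own:

```python
import math

# Evaluate the Ozarow-Wyner gap, in bits, for unit-power N-point PAM at a
# given snr, with N = floor(sqrt(1 + snr)) as on the slide.  For unit power
# the constellation spacing satisfies d_min^2 = 12 / (N^2 - 1).

def ow_gap_bits(snr: float) -> float:
    n_points = int(math.floor(math.sqrt(1.0 + snr)))
    d2_min = 12.0 / (n_points ** 2 - 1)        # unit-power PAM spacing
    lmmse = 1.0 / (1.0 + snr)                  # E[X^2] = 1
    shaping = 0.5 * math.log2(math.pi * math.e / 6)
    return shaping + 0.5 * math.log2(1.0 + 12.0 * lmmse / d2_min)

# The gap stays below 0.75 bit, as quoted on the slides.
for s in (10.0, 100.0, 1000.0):
    assert ow_gap_bits(s) < 0.75
```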

10 MMPE
The p-norm of a random vector U ∈ ℝⁿ is given by
‖U‖_p^p = E[ ((1/n) Tr(U Uᵀ))^{p/2} ].
For n = 1, ‖U‖_p^p = E[|U|^p].
Example: for U ~ N(0, I),
n^{p/2} ‖U‖_p^p = 2^{p/2} Γ((n + p)/2)/Γ(n/2),  where Γ(x) = ∫₀^∞ t^{x−1} e^{−t} dt.
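The Gamma-function expression can be checked against the classical Gaussian moments. A deterministic sketch for n = 1 (the function name is illustrative):

```python
import math

# For U ~ N(0, 1) (n = 1) the slide's formula gives
# E[|U|^p] = 2^(p/2) * Gamma((1 + p)/2) / Gamma(1/2).
# Sanity-check it against the known moments E[U^2] = 1, E[U^4] = 3,
# and E[|U|] = sqrt(2/pi).

def gaussian_abs_moment(p: float, n: int = 1) -> float:
    return 2 ** (p / 2) * math.gamma((n + p) / 2) / math.gamma(n / 2) / n ** (p / 2)

assert abs(gaussian_abs_moment(2) - 1.0) < 1e-12
assert abs(gaussian_abs_moment(4) - 3.0) < 1e-12
assert abs(gaussian_abs_moment(1) - math.sqrt(2 / math.pi)) < 1e-12
```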

11 MMPE
We define the Minimum Mean p-th Error (MMPE) of estimating X from Y as
mmpe(X|Y; p) := inf_f ‖X − f(Y)‖_p^p = inf_f E[ ((1/n) Err(X, f(Y)))^{p/2} ],
Err(X, v) := (X − v)ᵀ (X − v).
We denote the optimal estimator by f_p(X|Y).
For n = 1: mmpe(X|Y; p) = inf_f E[|X − f(Y)|^p].
Example: for p = 2, mmpe(X|Y; p = 2) = mmse(X|Y) and f_{p=2}(X|Y) = E[X|Y].
We shall denote mmpe(X|Y_snr; p) = mmpe(X, snr, p).
Goal: study its properties.

12 Properties of the MMPE
Optimal Estimator
Theorem. For mmpe(X, snr, p) with p > 0, the optimal estimator is given by the following point-wise relationship:
f_p(X|Y = y) = arg min_{v ∈ ℝⁿ} E[ Err^{p/2}(X, v) | Y = y ].
Orthogonality-like Property
Theorem. For any 1 ≤ p < ∞ the optimal estimator f_p(X|Y) satisfies
E[ (Wᵀ W)^{(p−2)/2} Wᵀ g(Y) ] = 0,
where W = X − f_p(X|Y), for any deterministic function g(·).

13 Properties of Optimal Estimators
E[X|Y]:
1. if 0 ≤ X ∈ ℝ then 0 ≤ E[X|Y],
2. (Linearity) E[aX + b|Y] = aE[X|Y] + b,
3. (Stability) E[g(Y)|Y] = g(Y) for any function g(·),
4. (Idempotent) E[E[X|Y]|Y] = E[X|Y],
5. (Degradedness) E[X|Y_snr, Y_snr₀] = E[X|Y_snr₀], for X → Y_snr₀ → Y_snr,
6. (Shift Invariance) mmse(X + a, snr) = mmse(X, snr),
7. (Scaling) mmse(aX, snr) = a² mmse(X, a² snr).
f_p(X|Y):
1. if 0 ≤ X ∈ ℝ then 0 ≤ f_p(X|Y),
2. (Linearity) f_p(aX + b|Y) = a f_p(X|Y) + b,
3. (Stability) f_p(g(Y)|Y) = g(Y) for any function g(·),
4. (Idempotent) f_p(f_p(X|Y)|Y) = f_p(X|Y),
5. (Degradedness) f_p(X|Y_snr₀, Y_snr) = f_p(X|Y_snr₀), for X → Y_snr₀ → Y_snr,
6. (Shift) mmpe(X + a, snr, p) = mmpe(X, snr, p),
7. (Scaling) mmpe(aX, snr, p) = a^p mmpe(X, a² snr, p).
Note: E[E[X|Y]] = E[X], but in general E[f_p(X|Y)] ≠ E[X].
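The shift and scaling properties (items 6-7) can be verified in the one case where everything is in closed form, p = 2 with Gaussian input. A small sketch; `mmse_gaussian` and the sample parameters are my own:

```python
# For Gaussian X ~ N(mu, var), mmse(X, snr) = var / (1 + snr * var); the
# mean drops out, which is exactly the shift-invariance property 6.
# Property 7 says mmse(aX, snr) = a^2 * mmse(X, a^2 * snr).

def mmse_gaussian(var: float, snr: float) -> float:
    return var / (1.0 + snr * var)

var, snr, a = 1.7, 3.0, 2.5
lhs = mmse_gaussian(a * a * var, snr)          # aX has variance a^2 * var
rhs = a * a * mmse_gaussian(var, a * a * snr)
assert abs(lhs - rhs) < 1e-12
```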

14 Examples of Optimal Estimators
If X ~ N(0, I), then for every p ≥ 1
f_p(X|Y = y) = (√snr/(1 + snr)) y   (i.e., linear).
If X ∈ {±1} equally likely (BPSK), then for p = 2
f_{p=2}(X|Y = y) = tanh(√snr y).
[Figure: f_p(X|Y = y) as a function of y for the BPSK input, for p = 1, 1.5, 2, 3.]
S. Sherman, "Non-mean-square error criteria," IRE Trans. Inf. Theory, vol. 4, no. 3, 1958.
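The tanh estimator's advantage over the best linear one can be seen with a quick Monte Carlo run. A sketch under my own choice of snr and sample size; the expected ordering mmse ≤ lmmse is from the talk:

```python
import math
import random

# BPSK X in {+1, -1}, Y = sqrt(snr) X + Z.  Compare the mean-square error of
# the conditional mean E[X|Y] = tanh(sqrt(snr) * y) with the best linear
# estimator sqrt(snr)/(1 + snr) * y (lmmse = 1/(1 + snr) since E[X^2] = 1).

random.seed(0)
snr, trials = 4.0, 200_000
se_tanh = se_lin = 0.0
for _ in range(trials):
    x = 1.0 if random.random() < 0.5 else -1.0
    y = math.sqrt(snr) * x + random.gauss(0.0, 1.0)
    se_tanh += (x - math.tanh(math.sqrt(snr) * y)) ** 2
    se_lin += (x - math.sqrt(snr) / (1.0 + snr) * y) ** 2

mmse_mc, lmmse_mc = se_tanh / trials, se_lin / trials
assert mmse_mc < lmmse_mc                          # mmse <= lmmse
assert abs(lmmse_mc - 1.0 / (1.0 + snr)) < 0.01    # lmmse = 1/(1+snr)
```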

15 Bounds on the MMPE
Recall: mmse(X, snr) ≤ min(1/snr, ‖X‖²₂).
Theorem. For snr ≥ 0, 0 < q ≤ p, and input X,
mmpe(X, snr, p) ≤ min( ‖Z‖_p^p/snr^{p/2}, ‖X‖_p^p ),
mmpe^{p/q}(X, snr, q) ≤ mmpe(X, snr, p) ≤ ‖X − E[X|Y]‖_p^p.
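The recalled MMSE bound can be verified deterministically for Gaussian input, where mmse is in closed form. A sketch; the parameter grid is my own:

```python
# Check mmse(X, snr) <= min(1/snr, ||X||_2^2) for Gaussian X ~ N(0, var),
# where mmse(X, snr) = var / (1 + snr * var) exactly.

def mmse_gaussian(var: float, snr: float) -> float:
    return var / (1.0 + snr * var)

for var in (0.1, 1.0, 10.0):
    for snr in (0.5, 2.0, 50.0):
        assert mmse_gaussian(var, snr) <= min(1.0 / snr, var) + 1e-15
```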

16 Bounds on the MMPE: Interpolation Bounds
Theorem. For any 0 < p < q < r ≤ ∞ and θ ∈ (0, 1) such that
1/q = θ/p + (1 − θ)/r,   i.e.,  θ = p(r − q)/(q(r − p)),
we have
(a) mmpe^{1/q}(X, snr, q) ≤ inf_f ‖X − f(Y)‖_p^θ ‖X − f(Y)‖_r^{1−θ}.
In particular,
(b) mmpe^{1/q}(X, snr, q) ≤ ‖X − f_r(X|Y)‖_p^θ mmpe^{(1−θ)/r}(X, snr, r),
(c) mmpe^{1/q}(X, snr, q) ≤ mmpe^{θ/p}(X, snr, p) ‖X − f_p(X|Y)‖_r^{1−θ}.
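The norm interpolation inequality behind part (a) can be checked on any empirical distribution. A sketch with my own sample values and exponents (p, q, r) = (1, 2, 4):

```python
# Norm interpolation: ||W||_q <= ||W||_p^theta * ||W||_r^(1-theta) whenever
# 1/q = theta/p + (1-theta)/r.  Use uniform weights over a fixed sample so
# every norm is an exact finite sum.

p, q, r = 1.0, 2.0, 4.0
theta = p * (r - q) / (q * (r - p))        # = 1/3 here
assert abs(1 / q - (theta / p + (1 - theta) / r)) < 1e-12

sample = [0.3, -1.2, 2.4, 0.05, -0.7, 1.9]

def norm(s, order):
    return (sum(abs(x) ** order for x in s) / len(s)) ** (1 / order)

assert norm(sample, q) <= norm(sample, p) ** theta * norm(sample, r) ** (1 - theta) + 1e-12
```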

17 MMPE with Gaussian Input
(Estimation of Gaussian Input.) For X_G ~ N(0, σ²I) and p ≥ 1,
mmpe(X_G, snr, p) = σ^p ‖Z‖_p^p/(1 + snr σ²)^{p/2},
with the optimal estimator given by
f_p(X_G|Y = y) = (√snr σ²/(1 + snr σ²)) y.
(Asymptotic Optimality of Gaussian Input.) For every p ≥ 1 and random variable X such that ‖X‖_p^p ≤ σ^p ‖Z‖_p^p, we have
mmpe(X, snr, p) ≤ k_{p,σ²snr} σ^p ‖Z‖_p^p/(1 + snr σ²)^{p/2},
where k_{p,σ²snr} = 1 for p = 2, and
k^{1/p}_{p,snr} = 1 + √(snr/(1 + snr)) ≤ 2 for p ≠ 2.
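The Gaussian-input formula can be checked against known special cases: at p = 2 it must reduce to the familiar mmse, and at p = 4 the residual X − f_p(Y) is Gaussian, so its fourth moment is 3 times the variance squared. A deterministic sketch for n = 1 with my own parameters:

```python
import math

# mmpe(X_G, snr, p) = var^(p/2) * E|Z|^p / (1 + snr*var)^(p/2) for scalar
# X_G ~ N(0, var), with E|Z|^p = 2^(p/2) * Gamma((1+p)/2) / Gamma(1/2).

def z_abs_moment(p: float) -> float:
    return 2 ** (p / 2) * math.gamma((1 + p) / 2) / math.gamma(0.5)

def mmpe_gaussian(var: float, snr: float, p: float) -> float:
    return var ** (p / 2) * z_abs_moment(p) / (1.0 + snr * var) ** (p / 2)

var, snr = 2.0, 3.0
# p = 2: reduces to mmse = var / (1 + snr*var), since E|Z|^2 = 1.
assert abs(mmpe_gaussian(var, snr, 2) - var / (1 + snr * var)) < 1e-12
# p = 4: Gaussian residual, fourth moment = 3 * variance^2 (E|Z|^4 = 3).
assert abs(mmpe_gaussian(var, snr, 4) - 3 * (var / (1 + snr * var)) ** 2) < 1e-12
```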

18 Conditional MMPE
We define the conditional MMPE as
mmpe(X, snr, p | U) := ‖X − f_p(X|Y_snr, U)‖_p^p.
Additional Information (Conditioning Reduces MMPE.) For every snr ≥ 0, p ≥ 0, and random variable X, we have
mmpe(X, snr, p) ≥ mmpe(X, snr, p | U).

19 Bounds on the MMPE
(Extra Independent Noisy Observation.) Let U = √Δ X + Z₁ where Z₁ ~ N(0, I) and (X, Z, Z₁) are mutually independent. Then
mmpe(X, snr, p | U) = mmpe(X, snr + Δ, p).
Consequences: the MMPE is a decreasing function of snr; a new proof of the single-crossing-point property.

20 MMSE Bounds: Single Crossing Point Property
(SCPP.) For any fixed X, suppose that
mmse(X, snr₀) = β/(1 + β snr₀), for some fixed β ≥ 0.
Then
mmse(X, snr) ≤ β/(1 + β snr), snr ∈ [snr₀, ∞),
mmse(X, snr) ≥ β/(1 + β snr), snr ∈ [0, snr₀).
SCPP + I-MMSE is an important tool for deriving converses: a version of the EPI, broadcast channels, wiretap channels, MIMO channels.
D. Guo, Y. Wu, S. Shamai, and S. Verdú, "Estimation in Gaussian noise: Properties of the minimum mean-square error," IEEE Trans. Inf. Theory, vol. 57, no. 4, Apr. 2011.
R. Bustin, R. Schaefer, H. V. Poor, and S. Shamai, "On MMSE properties of optimal codes for the Gaussian wiretap channel," in Proc. IEEE Inf. Theory Workshop, Apr. 2015.
R. Bustin, M. Payaró, D. Palomar, and S. Shamai, "On MMSE crossing properties and implications in parallel vector Gaussian channels," IEEE Trans. Inf. Theory, vol. 59, no. 2, Feb. 2013.
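The SCPP can be illustrated numerically for a non-Gaussian input. A Monte Carlo sketch with BPSK, under my own choice of snr₀, snr₁, and sample size: fit β at snr₀, then confirm the curve lies below β/(1 + β snr) at a larger snr.

```python
import math
import random

# Estimate mmse(X, snr) for BPSK by Monte Carlo using E[X|Y] = tanh(...),
# solve mmse(X, snr0) = beta/(1 + beta*snr0) for beta, then check the SCPP
# upper bound at snr1 > snr0.

random.seed(1)

def mmse_bpsk_mc(snr: float, trials: int = 200_000) -> float:
    se = 0.0
    for _ in range(trials):
        x = 1.0 if random.random() < 0.5 else -1.0
        y = math.sqrt(snr) * x + random.gauss(0.0, 1.0)
        se += (x - math.tanh(math.sqrt(snr) * y)) ** 2
    return se / trials

snr0, snr1 = 1.0, 4.0
m0 = mmse_bpsk_mc(snr0)
beta = m0 / (1.0 - m0 * snr0)          # from m0 = beta/(1 + beta*snr0)
assert mmse_bpsk_mc(snr1) <= beta / (1.0 + beta * snr1)
```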

21-27 New Proof of the SCPP
Setup: suppose
m := mmpe^{2/p}(X, snr₀, p) = ‖Z‖²_p/(1 + β snr₀) for some β ≥ 0,  equivalently  β = (‖Z‖²_p − m)/(snr₀ m).
Proof, for snr = snr₀ + Δ with Δ ≥ 0:
mmpe^{1/p}(X, snr, p)
= mmpe^{1/p}(X, snr₀ + Δ, p)
= mmpe^{1/p}(X, snr₀, p | Y_Δ), where Y_Δ = √Δ X + Z₁ (extra noisy observation)
= ‖X − f_p(X|Y_Δ, Y_snr₀)‖_p
≤ ‖X − X̂‖_p, where X̂ = ((1 − λ)/√Δ) Y_Δ + λ f_p(X|Y_snr₀), λ ∈ [0, 1]
= ‖λ(X − f_p(X|Y_snr₀)) − ((1 − λ)/√Δ) Z₁‖_p
≤ λ ‖X − f_p(X|Y_snr₀)‖_p + ((1 − λ)/√Δ) ‖Z₁‖_p  (triangle inequality).
Choosing λ = ‖Z‖²_p/(‖Z‖²_p + Δm) and expanding m = ‖Z‖²_p/(1 + β snr₀) gives
mmpe^{2/p}(X, snr, p) ≤ c_p ‖Z‖²_p/(1 + β snr),  with c_p = 1 for p = 2.
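The key algebra in the last step can be verified numerically for p = 2 and unit-power Gaussian X (so β = 1 and ‖Z‖²₂ = 1), where the cross term vanishes by orthogonality and the triangle inequality is not needed. A sketch over my own parameter grid:

```python
# For p = 2, the squared error of the combined estimator is exactly
#   lambda^2 * m + (1 - lambda)^2 / delta
# (the cross term vanishes by orthogonality).  With m = 1/(1 + snr0) and the
# proof's choice lambda = ||Z||^2 / (||Z||^2 + delta*m) = 1/(1 + delta*m),
# this equals 1/(1 + snr0 + delta), i.e. the SCPP bound holds with c_2 = 1.

for snr0 in (0.5, 1.0, 10.0):
    for delta in (0.1, 2.0, 25.0):
        m = 1.0 / (1.0 + snr0)
        lam = 1.0 / (1.0 + delta * m)
        err = lam ** 2 * m + (1 - lam) ** 2 / delta
        assert abs(err - 1.0 / (1.0 + snr0 + delta)) < 1e-12
```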

28 New Proof of the SCPP
(SCPP Bound for MMPE.) Suppose mmpe^{2/p}(X, snr₀, p) = ‖Z‖²_p/(1 + β snr₀) for some β ≥ 0. Then
mmpe^{2/p}(X, snr, p) ≤ c_p ‖Z‖²_p/(1 + β snr), for snr ≥ snr₀,
where c_p is a constant depending only on p, with c_p = 1 for p = 2.
Observations:
The previous proof relied on knowing the derivative of the MMSE.
The new proof relies on simple estimation arguments.
It extends to the MMPE for all p > 1.

29 Connections to IT: Bounds on Conditional Entropy
Classical bound: h(U|V) ≤ (n/2) log(2πe mmse(U|V)).
Theorem. If ‖U‖_p < ∞ for some p ∈ (0, ∞), then for any V ∈ ℝⁿ we have
h(U|V) ≤ (n/2) log( k²_{n,2p} n^{1/p} mmpe^{2/p}(U|V; p) ),
where
k_{n,p} := √π e^{1/p} (p/n)^{1/p} Γ^{1/n}(n/p + 1)/Γ^{1/n}(n/2 + 1) = √(2πe/n) (1 + O(1/n)).
Applications: a sharper version of the Ozarow-Wyner bound; bounds on the derivative of the MMSE.
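The classical scalar bound h(U) ≤ (1/2) ln(2πe Var(U)) can be checked on closed-form non-Gaussian cases (it is tight only for Gaussians). A deterministic sketch with my own parameter choices:

```python
import math

# Check h(U) <= (1/2) * ln(2*pi*e*Var(U)) on two scalar distributions with
# closed-form entropy and variance.

def entropy_bound(var: float) -> float:
    return 0.5 * math.log(2 * math.pi * math.e * var)

# Uniform on [0, w]: h = ln(w), Var = w^2 / 12.
for w in (0.1, 1.0, 5.0):
    assert math.log(w) <= entropy_bound(w * w / 12.0)
# Exponential with rate lam: h = 1 - ln(lam), Var = 1/lam^2.
for lam in (0.2, 1.0, 3.0):
    assert 1.0 - math.log(lam) <= entropy_bound(1.0 / lam ** 2)
```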

30 Ozarow-Wyner Bound (recap)
(Ozarow-Wyner Bound.) Let X_D be a discrete random variable. Then
H(X_D) − gap ≤ I(X_D; Y) ≤ H(X_D), where
gap = (1/2) log(πe/6) + (1/2) log(1 + 12 lmmse(X_D|Y)/d²_min(X_D)),
lmmse(X_D|Y) = E[X²_D]/(1 + snr E[X²_D]),
d_min(X_D) = inf_{x_i, x_j ∈ supp(X_D): i ≠ j} |x_i − x_j|.

31 Generalized Ozarow-Wyner Bound
Let X_D be a discrete random vector. Then for any p ≥ 1
[H(X_D) − gap_p]⁺ ≤ I(X_D; Y) ≤ H(X_D),
where
gap_p = inf_{U ∈ K} gap(U),
gap(U) = log( ‖U + X_D − f_p(X_D|Y)‖_p/‖U‖_p ) + log( k_{n,p} n^{1/p} ‖U‖_p/e^{h(U)/n} ),
and K = {U : diam(supp(U)) ≤ d_min(X_D)/2}. Moreover, for p ≥ 1,
log( ‖U + X_D − f_p(X_D|Y)‖_p/‖U‖_p ) ≤ log( 1 + mmpe^{1/p}(X_D, snr, p)/‖U‖_p ).

32 Generalized Ozarow-Wyner Bound
Take X_D to be PAM with N = ⌊√(1 + snr)⌋ points.
[Figure: (1/2) log(1 + snr) − I(X_D; Y) and the gap bounds versus SNR (dB): gap_p for p = 2, 4, 6, the original Ozarow-Wyner gap, and the shaping loss; the bound tightens as p grows.]

33 Concluding Remarks
Properties of the MMPE functional.
Conditional MMPE.
New proof of the SCPP and its extension.
Bounds on differential entropy.
Generalized Ozarow-Wyner bound.

34 A. Dytso, R. Bustin, D. Tuninetti, N. Devroye, S. Shamai, and H. V. Poor, "On the Minimum Mean p-th Error in Gaussian Noise Channels and its Applications," submitted to IEEE Trans. Inf. Theory. arXiv:
Thank you!

35 Back-Up Slides

36 Problem
[Diagram: W → Encoder → Xⁿ; the channel output Yⁿ = √snr Xⁿ + Zⁿ goes to a Decoder producing Ŵ, while a second output Yⁿ_snr₀ = √snr₀ Xⁿ + Zⁿ₀ goes to an Estimator producing X̂.]
Yⁿ = √snr Xⁿ + Zⁿ,  Yⁿ_snr₀ = √snr₀ Xⁿ + Zⁿ₀,  where Zⁿ ~ N(0, I), Zⁿ₀ ~ N(0, I),
with the estimation constraint mmse(Xⁿ|Yⁿ_snr₀) ≤ β/(1 + β snr₀).
max-I problem:
C_n(snr, snr₀, β) := sup_{Xⁿ} (1/n) I(Xⁿ; Yⁿ_snr),
s.t. (1/n) Tr(E[Xⁿ(Xⁿ)ᵀ]) ≤ 1, and mmse(Xⁿ, snr₀) ≤ β/(1 + β snr₀).

37 max-MMSE Problem
M_n(snr, snr₀, β) := sup_X mmse(X, snr),
s.t. (1/n) Tr(E[XXᵀ]) ≤ 1, and mmse(X, snr₀) ≤ β/(1 + β snr₀).
Relationship to the max-I problem:
(snr/2) M_n(snr, snr₀, β) ≤ C_n(snr, snr₀, β) ≤ (1/2) ∫₀^snr M_n(t, snr₀, β) dt.
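The shape of this sandwich can be illustrated on a single fixed input rather than the full sup: for scalar unit-power Gaussian X (a sketch that ignores the snr₀ constraint, with my own snr values), I(snr) = (1/2) ln(1 + snr), mmse(snr) = 1/(1 + snr), and the same two inequalities hold, the upper one with equality by I-MMSE.

```python
import math

# For a fixed X, monotonicity of mmse gives
#   (snr/2) * mmse(snr) <= I(snr) = (1/2) * integral_0^snr mmse(t) dt.
# Check it for scalar unit-power Gaussian X, where everything is closed-form.

for snr in (0.5, 2.0, 20.0):
    info = 0.5 * math.log(1.0 + snr)
    lower = 0.5 * snr / (1.0 + snr)          # (snr/2) * mmse(snr)
    upper = 0.5 * math.log(1.0 + snr)        # (1/2) * int_0^snr dt/(1+t)
    assert lower <= info <= upper + 1e-12
```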


More information

The Robustness of Dirty Paper Coding and The Binary Dirty Multiple Access Channel with Common Interference

The Robustness of Dirty Paper Coding and The Binary Dirty Multiple Access Channel with Common Interference The and The Binary Dirty Multiple Access Channel with Common Interference Dept. EE - Systems, Tel Aviv University, Tel Aviv, Israel April 25th, 2010 M.Sc. Presentation The B/G Model Compound CSI Smart

More information

Lecture 7 MIMO Communica2ons

Lecture 7 MIMO Communica2ons Wireless Communications Lecture 7 MIMO Communica2ons Prof. Chun-Hung Liu Dept. of Electrical and Computer Engineering National Chiao Tung University Fall 2014 1 Outline MIMO Communications (Chapter 10

More information

Optimal Power Control for LDPC Codes in Block-Fading Channels

Optimal Power Control for LDPC Codes in Block-Fading Channels IEEE TRANSACTIONS ON COMMUNICATIONS, VOL. 59, NO. 7, JULY 2011 1759 Optimal Power Control for LDPC Codes in Block-Fading Channels Gottfried Lechner, Khoa D. Nguyen, Albert Guillén i Fàbregas, and Lars

More information

Single-letter Characterization of Signal Estimation from Linear Measurements

Single-letter Characterization of Signal Estimation from Linear Measurements Single-letter Characterization of Signal Estimation from Linear Measurements Dongning Guo Dror Baron Shlomo Shamai The work has been supported by the European Commission in the framework of the FP7 Network

More information

Information Theory for Wireless Communications. Lecture 10 Discrete Memoryless Multiple Access Channel (DM-MAC): The Converse Theorem

Information Theory for Wireless Communications. Lecture 10 Discrete Memoryless Multiple Access Channel (DM-MAC): The Converse Theorem Information Theory for Wireless Communications. Lecture 0 Discrete Memoryless Multiple Access Channel (DM-MAC: The Converse Theorem Instructor: Dr. Saif Khan Mohammed Scribe: Antonios Pitarokoilis I. THE

More information

Secrecy in the 2-User Symmetric Interference Channel with Transmitter Cooperation: Deterministic View

Secrecy in the 2-User Symmetric Interference Channel with Transmitter Cooperation: Deterministic View Secrecy in the 2-User Symmetric Interference Channel with Transmitter Cooperation: Deterministic View P. Mohapatra 9 th March 2013 Outline Motivation Problem statement Achievable scheme 1 Weak interference

More information

Refined Bounds on the Empirical Distribution of Good Channel Codes via Concentration Inequalities

Refined Bounds on the Empirical Distribution of Good Channel Codes via Concentration Inequalities Refined Bounds on the Empirical Distribution of Good Channel Codes via Concentration Inequalities Maxim Raginsky and Igal Sason ISIT 2013, Istanbul, Turkey Capacity-Achieving Channel Codes The set-up DMC

More information

Capacity of Memoryless Channels and Block-Fading Channels With Designable Cardinality-Constrained Channel State Feedback

Capacity of Memoryless Channels and Block-Fading Channels With Designable Cardinality-Constrained Channel State Feedback 2038 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 50, NO. 9, SEPTEMBER 2004 Capacity of Memoryless Channels and Block-Fading Channels With Designable Cardinality-Constrained Channel State Feedback Vincent

More information

Lower Bounds and Approximations for the Information Rate of the ISI Channel

Lower Bounds and Approximations for the Information Rate of the ISI Channel Lower Bounds and Approximations for the Information Rate of the ISI Channel Yair Carmon and Shlomo Shamai Abstract arxiv:4.48v [cs.it] 7 Jan 4 We consider the discrete-time intersymbol interference ISI

More information

The Role of Directed Information in Network Capacity

The Role of Directed Information in Network Capacity The Role of Directed Information in Network Capacity Sudeep Kamath 1 and Young-Han Kim 2 1 Department of Electrical Engineering Princeton University 2 Department of Electrical and Computer Engineering

More information

Feedback Capacity of the Gaussian Interference Channel to Within Bits: the Symmetric Case

Feedback Capacity of the Gaussian Interference Channel to Within Bits: the Symmetric Case 1 arxiv:0901.3580v1 [cs.it] 23 Jan 2009 Feedback Capacity of the Gaussian Interference Channel to Within 1.7075 Bits: the Symmetric Case Changho Suh and David Tse Wireless Foundations in the Department

More information

A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels

A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels A Proof of the Converse for the Capacity of Gaussian MIMO Broadcast Channels Mehdi Mohseni Department of Electrical Engineering Stanford University Stanford, CA 94305, USA Email: mmohseni@stanford.edu

More information

On the Secrecy Capacity of Fading Channels

On the Secrecy Capacity of Fading Channels On the Secrecy Capacity of Fading Channels arxiv:cs/63v [cs.it] 7 Oct 26 Praveen Kumar Gopala, Lifeng Lai and Hesham El Gamal Department of Electrical and Computer Engineering The Ohio State University

More information

Lecture 6: Gaussian Channels. Copyright G. Caire (Sample Lectures) 157

Lecture 6: Gaussian Channels. Copyright G. Caire (Sample Lectures) 157 Lecture 6: Gaussian Channels Copyright G. Caire (Sample Lectures) 157 Differential entropy (1) Definition 18. The (joint) differential entropy of a continuous random vector X n p X n(x) over R is: Z h(x

More information

Secure Degrees of Freedom of the MIMO Multiple Access Wiretap Channel

Secure Degrees of Freedom of the MIMO Multiple Access Wiretap Channel Secure Degrees of Freedom of the MIMO Multiple Access Wiretap Channel Pritam Mukherjee Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 074 pritamm@umd.edu

More information

Analysis of coding on non-ergodic block-fading channels

Analysis of coding on non-ergodic block-fading channels Analysis of coding on non-ergodic block-fading channels Joseph J. Boutros ENST 46 Rue Barrault, Paris boutros@enst.fr Albert Guillén i Fàbregas Univ. of South Australia Mawson Lakes SA 5095 albert.guillen@unisa.edu.au

More information

On the Capacity of the Interference Channel with a Relay

On the Capacity of the Interference Channel with a Relay On the Capacity of the Interference Channel with a Relay Ivana Marić, Ron Dabora and Andrea Goldsmith Stanford University, Stanford, CA {ivanam,ron,andrea}@wsl.stanford.edu Abstract Capacity gains due

More information

On the Duality of Gaussian Multiple-Access and Broadcast Channels

On the Duality of Gaussian Multiple-Access and Broadcast Channels On the Duality of Gaussian ultiple-access and Broadcast Channels Xiaowei Jin I. INTODUCTION Although T. Cover has been pointed out in [] that one would have expected a duality between the broadcast channel(bc)

More information

Optimal Data and Training Symbol Ratio for Communication over Uncertain Channels

Optimal Data and Training Symbol Ratio for Communication over Uncertain Channels Optimal Data and Training Symbol Ratio for Communication over Uncertain Channels Ather Gattami Ericsson Research Stockholm, Sweden Email: athergattami@ericssoncom arxiv:50502997v [csit] 2 May 205 Abstract

More information

Wideband Fading Channel Capacity with Training and Partial Feedback

Wideband Fading Channel Capacity with Training and Partial Feedback Wideband Fading Channel Capacity with Training and Partial Feedback Manish Agarwal, Michael L. Honig ECE Department, Northwestern University 145 Sheridan Road, Evanston, IL 6008 USA {m-agarwal,mh}@northwestern.edu

More information

Convex functions. Definition. f : R n! R is convex if dom f is a convex set and. f ( x +(1 )y) < f (x)+(1 )f (y) f ( x +(1 )y) apple f (x)+(1 )f (y)

Convex functions. Definition. f : R n! R is convex if dom f is a convex set and. f ( x +(1 )y) < f (x)+(1 )f (y) f ( x +(1 )y) apple f (x)+(1 )f (y) Convex functions I basic properties and I operations that preserve convexity I quasiconvex functions I log-concave and log-convex functions IOE 611: Nonlinear Programming, Fall 2017 3. Convex functions

More information

ECE 4400:693 - Information Theory

ECE 4400:693 - Information Theory ECE 4400:693 - Information Theory Dr. Nghi Tran Lecture 8: Differential Entropy Dr. Nghi Tran (ECE-University of Akron) ECE 4400:693 Lecture 1 / 43 Outline 1 Review: Entropy of discrete RVs 2 Differential

More information

Degrees-of-Freedom Robust Transmission for the K-user Distributed Broadcast Channel

Degrees-of-Freedom Robust Transmission for the K-user Distributed Broadcast Channel /33 Degrees-of-Freedom Robust Transmission for the K-user Distributed Broadcast Channel Presented by Paul de Kerret Joint work with Antonio Bazco, Nicolas Gresset, and David Gesbert ESIT 2017 in Madrid,

More information

ECE Information theory Final

ECE Information theory Final ECE 776 - Information theory Final Q1 (1 point) We would like to compress a Gaussian source with zero mean and variance 1 We consider two strategies In the first, we quantize with a step size so that the

More information

Capacity of Block Rayleigh Fading Channels Without CSI

Capacity of Block Rayleigh Fading Channels Without CSI Capacity of Block Rayleigh Fading Channels Without CSI Mainak Chowdhury and Andrea Goldsmith, Fellow, IEEE Department of Electrical Engineering, Stanford University, USA Email: mainakch@stanford.edu, andrea@wsl.stanford.edu

More information

The Poisson Channel with Side Information

The Poisson Channel with Side Information The Poisson Channel with Side Information Shraga Bross School of Enginerring Bar-Ilan University, Israel brosss@macs.biu.ac.il Amos Lapidoth Ligong Wang Signal and Information Processing Laboratory ETH

More information

Computation of Information Rates from Finite-State Source/Channel Models

Computation of Information Rates from Finite-State Source/Channel Models Allerton 2002 Computation of Information Rates from Finite-State Source/Channel Models Dieter Arnold arnold@isi.ee.ethz.ch Hans-Andrea Loeliger loeliger@isi.ee.ethz.ch Pascal O. Vontobel vontobel@isi.ee.ethz.ch

More information

Secret Key Agreement: General Capacity and Second-Order Asymptotics. Masahito Hayashi Himanshu Tyagi Shun Watanabe

Secret Key Agreement: General Capacity and Second-Order Asymptotics. Masahito Hayashi Himanshu Tyagi Shun Watanabe Secret Key Agreement: General Capacity and Second-Order Asymptotics Masahito Hayashi Himanshu Tyagi Shun Watanabe Two party secret key agreement Maurer 93, Ahlswede-Csiszár 93 X F Y K x K y ArandomvariableK

More information

Convex Functions. Daniel P. Palomar. Hong Kong University of Science and Technology (HKUST)

Convex Functions. Daniel P. Palomar. Hong Kong University of Science and Technology (HKUST) Convex Functions Daniel P. Palomar Hong Kong University of Science and Technology (HKUST) ELEC5470 - Convex Optimization Fall 2017-18, HKUST, Hong Kong Outline of Lecture Definition convex function Examples

More information

EE5139R: Problem Set 7 Assigned: 30/09/15, Due: 07/10/15

EE5139R: Problem Set 7 Assigned: 30/09/15, Due: 07/10/15 EE5139R: Problem Set 7 Assigned: 30/09/15, Due: 07/10/15 1. Cascade of Binary Symmetric Channels The conditional probability distribution py x for each of the BSCs may be expressed by the transition probability

More information

Multiple Antennas in Wireless Communications

Multiple Antennas in Wireless Communications Multiple Antennas in Wireless Communications Luca Sanguinetti Department of Information Engineering Pisa University luca.sanguinetti@iet.unipi.it April, 2009 Luca Sanguinetti (IET) MIMO April, 2009 1 /

More information

Information Theoretic Limits of Randomness Generation

Information Theoretic Limits of Randomness Generation Information Theoretic Limits of Randomness Generation Abbas El Gamal Stanford University Shannon Centennial, University of Michigan, September 2016 Information theory The fundamental problem of communication

More information

Entropy power inequality for a family of discrete random variables

Entropy power inequality for a family of discrete random variables 20 IEEE International Symposium on Information Theory Proceedings Entropy power inequality for a family of discrete random variables Naresh Sharma, Smarajit Das and Siddharth Muthurishnan School of Technology

More information

arxiv: v1 [cs.it] 4 Jun 2018

arxiv: v1 [cs.it] 4 Jun 2018 State-Dependent Interference Channel with Correlated States 1 Yunhao Sun, 2 Ruchen Duan, 3 Yingbin Liang, 4 Shlomo Shamai (Shitz) 5 Abstract arxiv:180600937v1 [csit] 4 Jun 2018 This paper investigates

More information

Feedback Capacity of a Class of Symmetric Finite-State Markov Channels

Feedback Capacity of a Class of Symmetric Finite-State Markov Channels Feedback Capacity of a Class of Symmetric Finite-State Markov Channels Nevroz Şen, Fady Alajaji and Serdar Yüksel Department of Mathematics and Statistics Queen s University Kingston, ON K7L 3N6, Canada

More information

Lecture 6 Channel Coding over Continuous Channels

Lecture 6 Channel Coding over Continuous Channels Lecture 6 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 9, 015 1 / 59 I-Hsiang Wang IT Lecture 6 We have

More information

Lecture 4 Capacity of Wireless Channels

Lecture 4 Capacity of Wireless Channels Lecture 4 Capacity of Wireless Channels I-Hsiang Wang ihwang@ntu.edu.tw 3/0, 014 What we have learned So far: looked at specific schemes and techniques Lecture : point-to-point wireless channel - Diversity:

More information

Multipath Matching Pursuit

Multipath Matching Pursuit Multipath Matching Pursuit Submitted to IEEE trans. on Information theory Authors: S. Kwon, J. Wang, and B. Shim Presenter: Hwanchol Jang Multipath is investigated rather than a single path for a greedy

More information

Joint Multi-Cell Processing for Downlink Channels with Limited-Capacity Backhaul

Joint Multi-Cell Processing for Downlink Channels with Limited-Capacity Backhaul Joint Multi-Cell Processing for Downlink Channels with Limited-Capacity Backhaul Shlomo Shamai (Shitz) Department of Electrical Engineering Technion Haifa, 3000, Israel sshlomo@ee.technion.ac.il Osvaldo

More information

Interference Channel aided by an Infrastructure Relay

Interference Channel aided by an Infrastructure Relay Interference Channel aided by an Infrastructure Relay Onur Sahin, Osvaldo Simeone, and Elza Erkip *Department of Electrical and Computer Engineering, Polytechnic Institute of New York University, Department

More information

Multiple-Input Multiple-Output Systems

Multiple-Input Multiple-Output Systems Multiple-Input Multiple-Output Systems What is the best way to use antenna arrays? MIMO! This is a totally new approach ( paradigm ) to wireless communications, which has been discovered in 95-96. Performance

More information

Compressed Sensing under Optimal Quantization

Compressed Sensing under Optimal Quantization Compressed Sensing under Optimal Quantization Alon Kipnis, Galen Reeves, Yonina C. Eldar and Andrea J. Goldsmith Department of Electrical Engineering, Stanford University Department of Electrical and Computer

More information

Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel

Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Shahid Mehraj Shah and Vinod Sharma Department of Electrical Communication Engineering, Indian Institute of

More information

Classical and Quantum Channel Simulations

Classical and Quantum Channel Simulations Classical and Quantum Channel Simulations Mario Berta (based on joint work with Fernando Brandão, Matthias Christandl, Renato Renner, Joseph Renes, Stephanie Wehner, Mark Wilde) Outline Classical Shannon

More information

Demixing Radio Waves in MIMO Spatial Multiplexing: Geometry-based Receivers Francisco A. T. B. N. Monteiro

Demixing Radio Waves in MIMO Spatial Multiplexing: Geometry-based Receivers Francisco A. T. B. N. Monteiro Demixing Radio Waves in MIMO Spatial Multiplexing: Geometry-based Receivers Francisco A. T. B. N. Monteiro 005, it - instituto de telecomunicações. Todos os direitos reservados. Demixing Radio Waves in

More information