Information, Estimation, Feedback


1 Relations between information and estimation in the presence of feedback. Tsachy Weissman. Talk at the workshop on Information and Control in Networks, LCCC - Lund Center for Control of Complex Engineering Systems.

2 Information, control, networks: real-time and limited-delay communication; feedback communications; action in information theory; relations between information and estimation (with feedback + networks).

3 Outline: relations between information and estimation; the presence of feedback; implications for networks.

4 Haves and Have-Nots (in this talk). We'll have: some theorems; cute (and meaningful) relations; an algorithmic framework. We won't have: an account of related literature; stipulations; proofs; algorithms; data.

5 de Bruijn's identity [A. J. Stam 1959]: for $X$ independent of $Z \sim N(0,1)$, $\frac{d}{dt}\, h\big(X + \sqrt{t}\,Z\big) = \frac{1}{2}\, J\big(X + \sqrt{t}\,Z\big)$.
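
As a quick sanity check (not part of the slides), the identity can be verified numerically in the Gaussian case, where both the differential entropy and the Fisher information of $X + \sqrt{t}\,Z$ have closed forms; the variance s2 and the point t below are arbitrary illustrative choices.

```python
# Sketch (not from the talk): numerical check of de Bruijn's identity
# d/dt h(X + sqrt(t) Z) = (1/2) J(X + sqrt(t) Z) for X ~ N(0, s2), Z ~ N(0,1),
# using h = 0.5*log(2*pi*e*(s2+t)) and J = 1/(s2+t).
import numpy as np

s2 = 2.0           # variance of X (illustrative choice)
t, dt = 0.7, 1e-6  # point at which the derivative is checked

def h(t):   # differential entropy of X + sqrt(t) Z, in nats
    return 0.5 * np.log(2 * np.pi * np.e * (s2 + t))

def J(t):   # Fisher information of X + sqrt(t) Z
    return 1.0 / (s2 + t)

lhs = (h(t + dt) - h(t - dt)) / (2 * dt)  # numerical d/dt of the entropy
rhs = 0.5 * J(t)
print(lhs, rhs)    # the two values agree to high precision
```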

6 Guo-Shamai-Verdú (GSV) setting: $Y_\gamma = \sqrt{\gamma}\,X + W$, where $W$ is a standard Gaussian, independent of $X$. $I(\gamma) = I(X; Y_\gamma)$, $\mathrm{mmse}(\gamma) = E\,(X - E[X \mid Y_\gamma])^2$.

7 [Guo, Shamai and Verdú 2005]: $\frac{d}{d\gamma}\, I(\gamma) = \frac{1}{2}\,\mathrm{mmse}(\gamma)$.
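
A numerical illustration (not from the talk): for a binary input $X = \pm 1$, both $I(\gamma)$ and $\mathrm{mmse}(\gamma)$ can be evaluated by one-dimensional quadrature, and the numerical derivative of the former matches half the latter. The integration limits and step size below are illustrative choices.

```python
# Sketch (not from the talk): check dI/dsnr = (1/2) mmse(snr)
# for X = ±1 equiprobable in Y = sqrt(snr) X + W, W ~ N(0,1).
import numpy as np
from scipy.integrate import quad

def phi(y):
    return np.exp(-y**2 / 2) / np.sqrt(2 * np.pi)

def f_Y(y, snr):                      # output density: equal mixture of two Gaussians
    r = np.sqrt(snr)
    return 0.5 * (phi(y - r) + phi(y + r))

def I(snr):                           # I(X;Y) = h(Y) - h(W), in nats
    hY = quad(lambda y: -f_Y(y, snr) * np.log(f_Y(y, snr)), -12, 12)[0]
    return hY - 0.5 * np.log(2 * np.pi * np.e)

def mmse(snr):                        # E[X|Y] = tanh(sqrt(snr) Y) for this prior
    r = np.sqrt(snr)
    err = lambda y: (1 - np.tanh(r * y))**2 * phi(y - r)   # condition on X = +1 (symmetry)
    return quad(err, -12, 12)[0]

snr, d = 1.5, 1e-4
print((I(snr + d) - I(snr - d)) / (2 * d))   # ~ 0.5 * mmse(snr)
print(0.5 * mmse(snr))
```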

8 GSV in continuous time: $dY_t = \sqrt{\gamma}\,X_t\,dt + dW_t$, $0 \le t \le T$; $I(\gamma) = I(X^T; Y^T)$, $\mathrm{mmse}(\gamma) = E \int_0^T (X_t - E[X_t \mid Y^T])^2\,dt$.

9 [Guo, Shamai and Verdú 2005], [Zakai 2005]: $\frac{d}{d\gamma}\, I(\gamma) = \frac{1}{2}\,\mathrm{mmse}(\gamma)$, or in its integral version $I(snr) = \frac{1}{2} \int_0^{snr} \mathrm{mmse}(\gamma)\,d\gamma$.

10 Duncan: $dY_t = X_t\,dt + dW_t$, $0 \le t \le T$, where $W$ is standard white Gaussian noise, independent of $X$. [Duncan 1970]: $I(X^T; Y^T) = \frac{1}{2}\, E \int_0^T (X_t - E[X_t \mid Y^t])^2\,dt$.

11 SNR in Duncan: $dY_t = \sqrt{snr}\,X_t\,dt + dW_t$, $0 \le t \le T$; $I(snr) = I(X^T; Y^T)$, $\mathrm{cmmse}(snr) = E \int_0^T (X_t - E[X_t \mid Y^t])^2\,dt$. [Duncan 1970]: $I(snr) = \frac{snr}{2}\,\mathrm{cmmse}(snr)$.
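
A worked example (not on the slides) where both sides of Duncan's identity are available in closed form: take the signal to be a random constant, $X_t \equiv X \sim N(0,1)$ on $[0,T]$, observed through $dY_t = \sqrt{snr}\,X\,dt + dW_t$.

```latex
% Worked example (not from the slides): constant signal X_t = X ~ N(0,1).
\begin{align*}
Y_t &= \sqrt{snr}\,X\,t + W_t
  \;\Rightarrow\; \operatorname{Var}(X \mid Y^t) = \frac{1}{1 + snr\,t},\\
\mathrm{cmmse}(snr) &= \int_0^T \frac{dt}{1 + snr\,t} = \frac{\ln(1 + snr\,T)}{snr},\\
\frac{snr}{2}\,\mathrm{cmmse}(snr) &= \tfrac12 \ln(1 + snr\,T)
  = I(X;Y_T) = I(X^T;Y^T),
\end{align*}
% the last step because Y_T = \sqrt{snr}\,T\,X + W_T is a sufficient statistic for X given Y^T.
```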

12 Recap. [Duncan 1970]: $I(snr) = \frac{snr}{2}\,\mathrm{cmmse}(snr)$. [Guo, Shamai and Verdú 2005], [Zakai 2005]: $I(snr) = \frac{1}{2}\int_0^{snr} \mathrm{mmse}(\gamma)\,d\gamma$.

13 Relationship between cmmse and mmse? [Guo, Shamai and Verdú 2005]: $\mathrm{cmmse}(snr) = \frac{1}{snr}\int_0^{snr} \mathrm{mmse}(\gamma)\,d\gamma$.
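
Continuing the random-constant example above (again a sketch, not from the talk): there $\mathrm{cmmse}(snr) = \ln(1+snr\,T)/snr$ and the noncausal error at SNR level $\gamma$ is $\mathrm{mmse}(\gamma) = T/(1+\gamma T)$, so the identity can be checked directly.

```python
# Sketch (not from the talk): check cmmse(snr) = (1/snr) * integral_0^snr mmse(g) dg
# for the constant-signal example X_t = X ~ N(0,1), where the closed forms are
# cmmse(snr) = log(1 + snr*T)/snr and mmse(g) = T/(1 + g*T).
import numpy as np
from scipy.integrate import quad

T, snr = 2.0, 3.0
cmmse = np.log(1 + snr * T) / snr
avg_mmse = quad(lambda g: T / (1 + g * T), 0, snr)[0] / snr
print(cmmse, avg_mmse)   # identical up to quadrature error
```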

14 Mismatch: $Y_\gamma = \sqrt{\gamma}\,X + W$, where $W$ is a standard Gaussian, independent of $X$. What if $X \sim P$ but the estimator thinks $X \sim Q$? $\mathrm{mse}_{P,Q}(\gamma) = E_P\,(X - E_Q[X \mid Y_\gamma])^2$.

15 A representation of relative entropy [Verdú 2010]: $D(P \,\|\, Q) = \frac{1}{2}\int_0^{\infty} [\mathrm{mse}_{P,Q}(\gamma) - \mathrm{mse}_{P,P}(\gamma)]\,d\gamma$, and $D(P_{Y_{snr}} \,\|\, Q_{Y_{snr}}) = \frac{1}{2}\int_0^{snr} [\mathrm{mse}_{P,Q}(\gamma) - \mathrm{mse}_{P,P}(\gamma)]\,d\gamma$.
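
A sketch (not from the talk) checking the first relation for the Gaussian mismatch $P = N(0,1)$, $Q = N(\mu,1)$: evaluating the mismatched estimator $E_Q[X \mid Y_\gamma] = \mu/(1+\gamma) + \sqrt{\gamma}\,Y_\gamma/(1+\gamma)$ under $P$ gives $\mathrm{mse}_{P,Q}(\gamma) - \mathrm{mse}_{P,P}(\gamma) = \mu^2/(1+\gamma)^2$, while $D(P\|Q) = \mu^2/2$ nats.

```python
# Sketch (not from the talk): check Verdu's relation
# D(P||Q) = (1/2) * integral_0^inf [mse_{P,Q}(g) - mse_{P,P}(g)] dg
# for P = N(0,1), Q = N(mu,1), using the closed form of the MSE gap.
import numpy as np
from scipy.integrate import quad

mu = 0.8
mse_gap = lambda g: mu**2 / (1 + g)**2          # mse_{P,Q}(g) - mse_{P,P}(g)
print(0.5 * quad(mse_gap, 0, np.inf)[0])        # ~ mu^2/2
print(mu**2 / 2)                                # D(N(0,1) || N(mu,1)) in nats
```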

16 Causal vs. Non-causal Mismatched Estimation: $dY_t = \sqrt{snr}\,X_t\,dt + dW_t$, $0 \le t \le T$, where $W$ is standard white Gaussian noise, independent of $X$; $\mathrm{cmse}_{P,Q}(snr) = E_P \int_0^T (X_t - E_Q[X_t \mid Y^t])^2\,dt$, $\mathrm{mse}_{P,Q}(snr) = E_P \int_0^T (X_t - E_Q[X_t \mid Y^T])^2\,dt$.

17 Causal vs. Non-causal Mismatched Estimation: $dY_t = \sqrt{snr}\,X_t\,dt + dW_t$, $0 \le t \le T$, where $W$ is standard white Gaussian noise, independent of $X$; $\mathrm{cmse}_{P,Q}(snr) = E_P \int_0^T (X_t - E_Q[X_t \mid Y^t])^2\,dt$, $\mathrm{mse}_{P,Q}(snr) = E_P \int_0^T (X_t - E_Q[X_t \mid Y^T])^2\,dt$. Relationship between $\mathrm{cmse}_{P,Q}$ and $\mathrm{mse}_{P,Q}$?

18 Relationship between $\mathrm{cmse}_{P,Q}$ and $\mathrm{mse}_{P,Q}$ [Weissman 2010]: $\mathrm{cmse}_{P,Q}(snr) = \frac{1}{snr}\int_0^{snr} \mathrm{mse}_{P,Q}(\gamma)\,d\gamma$.

19 Relationship between $\mathrm{cmse}_{P,Q}$ and $\mathrm{mse}_{P,Q}$ [Weissman 2010]: $\mathrm{cmse}_{P,Q}(snr) = \frac{1}{snr}\int_0^{snr} \mathrm{mse}_{P,Q}(\gamma)\,d\gamma = \frac{2}{snr}\big[\,I(snr) + D(P_{Y^T} \,\|\, Q_{Y^T})\,\big]$.

20 Implications and Applications: many.

21 Minimax (causal) Estimation: $\mathrm{minimax}(\mathcal{P}, snr) \triangleq \min_{\{\hat X_t(\cdot)\}_{0 \le t \le T}} \max_{P \in \mathcal{P}} \Big\{ E_P\Big[\int_0^T \ell\big(X_t, \hat X_t(Y^t)\big)\,dt\Big] - \mathrm{cmse}_{P,P}(snr) \Big\}$.

22 Minimax (causal) Estimation: $\mathrm{minimax}(\mathcal{P}, snr) \triangleq \min_{\{\hat X_t(\cdot)\}_{0 \le t \le T}} \max_{P \in \mathcal{P}} \Big\{ E_P\Big[\int_0^T \ell\big(X_t, \hat X_t(Y^t)\big)\,dt\Big] - \mathrm{cmse}_{P,P}(snr) \Big\}$ (classical).

23 Minimax (causal) Estimation: $\mathrm{minimax}(\mathcal{P}, snr) \triangleq \min_{\{\hat X_t(\cdot)\}_{0 \le t \le T}} \max_{P \in \mathcal{P}} \Big\{ E_P\Big[\int_0^T \ell\big(X_t, \hat X_t(Y^t)\big)\,dt\Big] - \mathrm{cmse}_{P,P}(snr) \Big\}$ (classical; ours).

24 Minimax (causal) Estimation: $\mathrm{minimax}(\mathcal{P}, snr) \triangleq \min_{\{\hat X_t(\cdot)\}_{0 \le t \le T}} \max_{P \in \mathcal{P}} \Big\{ E_P\Big[\int_0^T \ell\big(X_t, \hat X_t(Y^t)\big)\,dt\Big] - \mathrm{cmse}_{P,P}(snr) \Big\}$ (classical; ours; Redundancy-Capacity theory).

25 Minimax (causal) Estimation: $\mathrm{minimax}(\mathcal{P}, snr) \triangleq \min_{\{\hat X_t(\cdot)\}_{0 \le t \le T}} \max_{P \in \mathcal{P}} \Big\{ E_P\Big[\int_0^T \ell\big(X_t, \hat X_t(Y^t)\big)\,dt\Big] - \mathrm{cmse}_{P,P}(snr) \Big\}$ (classical; ours; Redundancy-Capacity theory; Shannon).

26 Minimax (causal) Estimation: $\mathrm{minimax}(\mathcal{P}, snr) \triangleq \min_{\{\hat X_t(\cdot)\}_{0 \le t \le T}} \max_{P \in \mathcal{P}} \Big\{ E_P\Big[\int_0^T \ell\big(X_t, \hat X_t(Y^t)\big)\,dt\Big] - \mathrm{cmse}_{P,P}(snr) \Big\}$ (classical; ours; Redundancy-Capacity theory; Shannon). Then $\mathrm{minimax}(\mathcal{P}, snr) = \min_Q \max_{P \in \mathcal{P}} \big[\mathrm{cmse}_{P,Q}(snr) - \mathrm{cmse}_{P,P}(snr)\big] = \frac{2}{snr} \min_Q \max_{P \in \mathcal{P}} D\big(P_{Y^T_{snr}} \,\|\, Q_{Y^T_{snr}}\big) = \frac{2}{snr} \max\big\{ I(\Theta; Y^T_{snr}) : \Theta \text{ is a } \mathcal{P}\text{-valued RV} \big\} = \frac{2}{snr}\, C$.

27 Minimax (causal) Estimation (cont.): $\mathrm{minimax}(\mathcal{P}, snr) = \frac{2}{snr} \max\big\{ I(\Theta; Y^T_{snr}) : \Theta \text{ is a } \mathcal{P}\text{-valued RV} \big\} = \frac{2}{snr}\, C(\mathcal{P}, snr)$, so the strong redundancy-capacity result of [Merhav and Feder 1995] is directly applicable here and implies: for any filter $\{\hat X_t(\cdot)\}_{0 \le t \le T}$, $E_P\big[\int_0^T \ell(X_t, \hat X_t(Y^t))\,dt\big] - \mathrm{cmse}_{P,P}(snr) \ge (1-\epsilon)\,\mathrm{minimax}(\mathcal{P}, snr)$ for all $P \in \mathcal{P}$, with the possible exception of sources in a subset $B \subset \mathcal{P}$ where $w(B) \le 2^{-\epsilon\, C(\mathcal{P}, snr)}$, $w$ being the capacity-achieving prior.

28 Example. Given an orthonormal signal set $\{\phi_i(t),\, 0 \le t \le T\}_{i=1}^n$, let $X_t = \sum_{i=1}^n B_i\,\phi_i(t)$ and $\mathcal{P} = \{$laws $P$ of $X^T$: $E_P B_i^2 \le \sigma^2$ and $P(B_i = 0) \ge 1 - p$ for each $i\}$. $\max_{P \in \mathcal{P}} I(X^T; Y^T) = \,?$

29 Example (cont.): $Y_i = \int_0^T \phi_i(t)\,dY_t$, $1 \le i \le n$, are sufficient statistics for $Y^T$, so $I(X^T; Y^T) = I(B^n; Y^n)$ and $\max I(X^T; Y^T) = \max I(B^n; Y^n) = n \cdot \max\{I(B; Y) : E B^2 \le \sigma^2,\ P(B = 0) \ge 1 - p\}$; the latter problem was considered and numerically solved in: Lei Zhang and Dongning Guo, Capacity of Gaussian Channels with Duty Cycle and Power Constraints, IEEE Int. Symposium on Information Theory 2011.
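
To make the scalar problem concrete, the following sketch (not from the talk; the values of p and A are illustrative feasible choices, not the capacity-achieving ones from [Zhang and Guo, 2011]) evaluates the objective $I(B;Y)$ for one sparse input law by numerical integration of the output mixture density.

```python
# Sketch (not from the talk): evaluate I(B;Y) in nats for one feasible sparse input
# B in {-A, 0, +A} with P(B=0) = 1-p, P(B=±A) = p/2, over Y = B + W, W ~ N(0,1).
# This is the objective maximized over such laws in the duty-cycle/power problem.
import numpy as np
from scipy.integrate import quad

p, A = 0.3, 2.0
atoms = np.array([-A, 0.0, A])
probs = np.array([p / 2, 1 - p, p / 2])

phi = lambda y: np.exp(-y**2 / 2) / np.sqrt(2 * np.pi)
f_Y = lambda y: np.sum(probs * phi(y - atoms))          # output mixture density

h_Y   = quad(lambda y: -f_Y(y) * np.log(f_Y(y)), -15, 15)[0]
h_YgB = 0.5 * np.log(2 * np.pi * np.e)                  # h(Y|B) = h(W)
print(h_Y - h_YgB)                                      # I(B;Y)
```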

30 Example (cont.): thus the minimax filter here is the Bayes filter assuming $X_t = \sum_{i=1}^n B_i\,\phi_i(t)$, where the $B_i$ are i.i.d. according to the capacity-achieving distribution of [Zhang and Guo, 2011]; cf. [Albert No + T.W., ISIT 2013]...

31 (Well) beyond Gaussian noise: the Poisson channel; Lévy-type channels, whose input-output relationship is expressed via a Lévy-type stochastic integral; one can obtain analogous formulae via Lévy-Khintchine-type decompositions.

32 information ✓  control  networks

33 The presence of Feedback

34 The presence of Feedback: how much of what we've seen carries over to the presence of feedback?

35 Duncan: $dY_t = X_t\,dt + dW_t$, $0 \le t \le T$, where $W$ is standard white Gaussian noise, independent of $X$. [Duncan 1970]: $I(X^T; Y^T) = \frac{1}{2}\, E \int_0^T (X_t - E[X_t \mid Y^t])^2\,dt$. Breaks down in the presence of feedback!

36 Continuous-time directed information [W., Permuter, Kim 2012]: $I(X^T \to Y^T) := \inf_{\mathbf{t}} I_{\mathbf{t}}(X^T \to Y^T)$, the infimum taken over time partitions $\mathbf{t}$, where $I(X^n \to Y^n) \triangleq \sum_{i=1}^n I(X^i; Y_i \mid Y^{i-1})$.
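
A discrete-time sketch (not from the talk) of the inner quantity: directed information computed by brute force from a joint pmf over binary $X^n, Y^n$. For a memoryless channel used without feedback it coincides with ordinary mutual information, which the example at the bottom checks; all names and the BSC example are illustrative.

```python
# Sketch (not from the talk): I(X^n -> Y^n) = sum_i I(X^i ; Y_i | Y^{i-1}),
# computed by brute force from a joint pmf array p[x1,...,xn,y1,...,yn].
import numpy as np
from itertools import product

def cond_mi(p, A, B, C):
    """I(A;B|C) in nats; A, B, C are sets of axis indices of the joint pmf p."""
    axes = tuple(range(p.ndim))
    def marg(keep):
        return p.sum(axis=tuple(a for a in axes if a not in keep))
    val = 0.0
    for idx in product(*[range(s) for s in p.shape]):
        pr = p[idx]
        if pr == 0:
            continue
        def sub(keep):   # marginal probability of the coordinates in `keep` at idx
            return marg(keep)[tuple(idx[a] for a in sorted(keep))]
        val += pr * np.log(sub(A | B | C) * (sub(C) if C else 1.0)
                           / (sub(A | C) * sub(B | C)))
    return val

def directed_info(p, n):
    """p has axes (x1..xn, y1..yn); returns I(X^n -> Y^n)."""
    total = 0.0
    for i in range(1, n + 1):
        A = set(range(i))             # X^i  -> axes 0..i-1
        B = {n + i - 1}               # Y_i  -> axis n+i-1
        C = set(range(n, n + i - 1))  # Y^{i-1} -> axes n..n+i-2
        total += cond_mi(p, A, B, C)
    return total

# Example: two uses of a BSC(0.1) with i.i.d. uniform inputs and no feedback.
eps, n = 0.1, 2
q = np.array([[1 - eps, eps], [eps, 1 - eps]])   # channel law q[x, y]
p = np.zeros((2,) * (2 * n))
for x1, x2, y1, y2 in product(range(2), repeat=4):
    p[x1, x2, y1, y2] = 0.25 * q[x1, y1] * q[x2, y2]
print(directed_info(p, n))                                       # no feedback:
print(cond_mi(p, set(range(n)), set(range(n, 2 * n)), set()))    # ... equals I(X^2;Y^2)
```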

37 Duncan with feedback. Theorem [W., Permuter, Kim 2012]: Let $\{(X_t, B_t)\}_{t=0}^T$ be adapted to the filtration $\{\mathcal{F}_t\}_{t=0}^T$, where $X^T$ is a signal of finite average power $\int_0^T E[X_t^2]\,dt < \infty$ and $B^T$ is a standard Brownian motion. Let $Y^T$ be the output of the AWGN channel whose input is $X^T$ and whose noise is driven by $B^T$, i.e., $dY_t = X_t\,dt + dB_t$. Suppose that the regularity assumptions of Proposition 2 are satisfied for all $0 < t < T$. Then $\frac{1}{2}\, E \int_0^T (X_t - E[X_t \mid Y^t])^2\,dt = I(X^T \to Y^T)$. Compare with [Kadota, Zakai, Ziv 1971].

38 GSV in continuous time: $dY_t = \sqrt{\gamma}\,X_t\,dt + dW_t$, $0 \le t \le T$; $\frac{d}{d\gamma}\, I(\gamma) = \frac{1}{2}\,\mathrm{mmse}(\gamma)$, or in its integral version $I(snr) = \frac{1}{2}\int_0^{snr} \mathrm{mmse}(\gamma)\,d\gamma$.

39 GSV in continuous time: $dY_t = \sqrt{\gamma}\,X_t\,dt + dW_t$, $0 \le t \le T$; $\frac{d}{d\gamma}\, I(\gamma) = \frac{1}{2}\,\mathrm{mmse}(\gamma)$, or in its integral version $I(snr) = \frac{1}{2}\int_0^{snr} \mathrm{mmse}(\gamma)\,d\gamma$. Breaks down in the presence of feedback.

40 GSV in continuous time with DI? Is $I(X^T \to Y^T) \stackrel{?}{=} \frac{1}{2}\int_0^{snr} \mathrm{mmse}(\gamma)\,d\gamma$? No. In general $I(X^T \to Y^T) \ne \frac{1}{2}\int_0^{snr} \mathrm{mmse}(\gamma)\,d\gamma$, and so $\mathrm{cmmse}(snr) \ne \frac{1}{snr}\int_0^{snr} \mathrm{mmse}(\gamma)\,d\gamma$. I.e., breakdown in the presence of feedback.

41 Mismatched setting: a fortiori, in the presence of feedback, in general $\mathrm{cmse}_{P,Q}(snr) \ne \frac{1}{snr}\int_0^{snr} \mathrm{mse}_{P,Q}(\gamma)\,d\gamma$.

42 Mismatched setting: a fortiori, in the presence of feedback, in general $\mathrm{cmse}_{P,Q}(snr) \ne \frac{1}{snr}\int_0^{snr} \mathrm{mse}_{P,Q}(\gamma)\,d\gamma$. End of story?

43 Mismatched setting (cont.): $\mathrm{cmse}_{P,Q}(snr) - \mathrm{cmse}_{P,P}(snr) = \frac{2}{snr}\, D(P_{Y^T} \,\|\, Q_{Y^T})$ holds with or without feedback; appears implicitly in [T.W. 2010] and explicitly in the workshop book chapter [Asnani, Venkat, W. 2012]. (Why?)
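
One way to read the "(why?)", at least in the absence of feedback: subtract Duncan's identity (slide 11, which gives $\mathrm{cmse}_{P,P}(snr) = \frac{2}{snr}\,I(snr)$) from the mismatched relation of slide 19; that the resulting difference identity continues to hold with feedback is what [Asnani, Venkat, W. 2012] addresses.

```latex
% Subtracting Duncan (slide 11) from the relation on slide 19 (no feedback):
\begin{align*}
\mathrm{cmse}_{P,Q}(snr) &= \tfrac{2}{snr}\big[I(snr) + D(P_{Y^T}\,\|\,Q_{Y^T})\big],\\
\mathrm{cmse}_{P,P}(snr) &= \tfrac{2}{snr}\,I(snr),\\
\mathrm{cmse}_{P,Q}(snr) - \mathrm{cmse}_{P,P}(snr) &= \tfrac{2}{snr}\,D(P_{Y^T}\,\|\,Q_{Y^T}).
\end{align*}
```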

44 Implications and apps: the minimax estimation setting carries over, with directed information maximization in place of mutual information maximization but the same idea; similar extensions hold for the more general channels.

45 information ✓  control ✓  networks

46 Distributed estimation (known source): a source $X \sim P_X$ (known) is observed through a network and noise; node $i$ sees $Y_i$ and forms the estimate $\hat X_i(Y_i)$, $i = 1, \ldots, n$.

47 Distributed estimation (known source): a source $X \sim P_X$ (known) is observed through a network and noise; node $i$ sees $Y_i$ and forms the estimate $\hat X_i(Y_i)$, $i = 1, \ldots, n$. Can (and should) be greedy!

48 Distributed estimation (source uncertainty): a source $X \sim P$ with $P \in \mathcal{P}$ (source uncertainty) is observed through a network and noise; node $i$ sees $Y_i$ and forms the estimate $\hat X_i(Y_i)$, $i = 1, \ldots, n$.

49 Distributed estimation (source uncertainty): a source $X \sim P$ with $P \in \mathcal{P}$ (source uncertainty) is observed through a network and noise; node $i$ sees $Y_i$ and forms the estimate $\hat X_i(Y_i)$, $i = 1, \ldots, n$. Should we be greedy?

50 Distributed estimation (source uncertainty): a source $X \sim P$ with $P \in \mathcal{P}$ (source uncertainty) is observed through a network and noise; node $i$ sees $Y_i$ and forms the estimate $\hat X_i(Y_i)$, $i = 1, \ldots, n$. Should we be greedy? No! (in general)

51 Distributed estimation (source uncertainty): a source $X \sim P$ with $P \in \mathcal{P}$ (source uncertainty) is observed through a network and noise; node $i$ sees $Y_i$ and forms the estimate $\hat X_i(Y_i)$, $i = 1, \ldots, n$. Should we be greedy? No! (in general). Yes! in causal estimation over Gaussian, Poisson, or general Lévy-type noise: minimax estimation for each observation separately would be essentially optimal.

52 information ✓  control ✓  networks ✓

53 Conclusion: relations between mutual information, relative entropy, and estimation; findings of pure estimation-theoretic significance; these relations allow the transfer of tools; much carries over to the presence of feedback; implications for networks.
