On the Shamai-Laroia Approximation for the Information Rate of the ISI Channel


On the Shamai-Laroia Approximation for the Information Rate of the ISI Channel

Yair Carmon and Shlomo Shamai (Shitz)
Department of Electrical Engineering, Technion - Israel Institute of Technology

2014 Information Theory and Applications Workshop, San Diego, USA, February 2014

Acknowledgment: Prof. Tsachy Weissman, FP7 Network of Excellence in Wireless COMmunications NEWCOM#, Israel Science Foundation (ISF).

Outline

1. Introduction: the ISI channel and the i.i.d. information rate; analytical lower bounds on the information rate; the Shamai-Laroia conjecture (SLC)
2. Low SNR analysis
3. Counterexamples
4. High SNR analysis
5. Conclusion

Inter-Symbol Interference Channel Model

Input-output relationship: $y_k = \sum_{l=0}^{L-1} h_l x_{k-l} + n_k$

- Input sequence $x$ is assumed i.i.d., with $P_x = E x_0^2$
- Noise sequence $n$ is Gaussian and i.i.d., with $N_0 = E n_0^2$
- Inter-symbol interference (ISI) coefficients $h_0^{L-1}$
- Channel frequency response $H(\theta) = \sum_{k=0}^{L-1} h_k e^{-jk\theta}$

The simplest (non-discrete) model for a channel with memory, ubiquitous in wireless and wireline communications. The results presented here extend straightforwardly to a complex setting.

Mutual Information Rate

Given by $\mathcal{I} = \lim_{K\to\infty} \frac{1}{2K+1} I\left(y_{-K}^{K} ; x_{-K}^{K}\right) = I\left(y_{-\infty}^{\infty} ; x_0 \mid x_{-\infty}^{-1}\right)$

- $\mathcal{I}$ is the rate of reliable communication achievable by a random code with codewords distributed as $x$
- For Gaussian input, $\mathcal{I}$ has a simple expression: $\mathcal{I}_{\mathrm{Gaussian}} = \frac{1}{2\pi}\int_{-\pi}^{\pi} \frac{1}{2}\log\left(1 + \frac{P_x}{N_0}|H(\theta)|^2\right) d\theta$
- When the input is distributed on a finite set (a constellation), no closed-form expression is known
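As an illustration (a sketch, not part of the original deck), the Gaussian-input rate can be evaluated by direct numerical integration; the taps below are the deck's running example channel, and the SNR value is an arbitrary choice:

```python
import numpy as np

h = np.array([0.408, 0.817, 0.408])   # running example from the talk
snr = 10 ** (10 / 10)                 # P_x / N_0 = 10 dB (illustrative)

theta = np.linspace(-np.pi, np.pi, 1 << 16, endpoint=False)
H = np.polyval(h[::-1], np.exp(-1j * theta))  # H(theta) = sum_k h_k e^{-jk theta}
# I_Gaussian = (1/2pi) * int (1/2) log2(1 + snr*|H|^2) dtheta, in bits/symbol
i_gaussian = np.mean(0.5 * np.log2(1 + snr * np.abs(H) ** 2))
print(f"Gaussian-input information rate ~ {i_gaussian:.3f} bits/symbol")
```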

Approximating the I.I.D. Information Rate

Two main ways to investigate $\mathcal{I}$:

1. Approximations and bounds based on Monte-Carlo simulations: these provide the best accuracy, but little theoretical insight, and their computational complexity grows quickly with the number of dominant ISI taps; cf. [Arnold-Loeliger-Vontobel-Kavcic-Zeng 06]
2. Analytical lower bounds: not as tight as their simulation-based counterparts, but much easier to compute, useful in benchmarking communication schemes and other bounds, and potentially a source of theoretical insight

Data Processing Lower Bounds

For any sequence of coefficients $a$,

$\mathcal{I} = I\left(y_{-\infty}^{\infty} ; x_0 \mid x_{-\infty}^{-1}\right) \ge I\left(\textstyle\sum_k a_k y_k ; x_0 \mid x_{-\infty}^{-1}\right)$

This can be simplified to

$\mathcal{I} \ge I\left(x_0 ; x_0 + \underbrace{\textstyle\sum_{k\ge1} \alpha_k x_k + m}_{\text{additive noise term}}\right)$

with $\alpha_k = \sum_l a_l h_{l-k} / \sum_l a_l h_l$ and $m \sim N\left(0,\; N_0 \sum_l a_l^2 / \left(\sum_l a_l h_l\right)^2\right)$ independent of $x_0$.

Different choices of $a$ yield different bounds.

Data Processing with a Sample Whitened Matched Filter

- $a$ is chosen so that $\alpha_k = 0$ for every $k > 0$ (non-causal ISI eliminated); in this case the noise term is purely Gaussian
- The resulting bound was first proposed in [Shamai-Ozarow-Wyner 91]: $\mathcal{I} \ge I(x_0 ; x_0 + m) = I_x(\mathrm{SNR}_{\text{ZF-DFE}})$, with $I_x(\gamma)$ the mutual information of a scalar Gaussian channel with input $x_0$ at SNR $\gamma$, and $\mathrm{SNR}_{\text{ZF-DFE}}$ the output SNR of the zero-forcing decision feedback equalizer (DFE):

$\mathrm{SNR}_{\text{ZF-DFE}} = \frac{P_x}{N_0} \exp\left\{\frac{1}{2\pi}\int_{-\pi}^{\pi} \log |H(\theta)|^2 \, d\theta\right\}$

- Very simple, but quite loose at medium and low SNRs
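A minimal sketch (not from the deck) of the ZF-DFE output SNR, which is the geometric mean of $|H(\theta)|^2$ scaled by $P_x/N_0$; the SNR value is an assumption:

```python
import numpy as np

h = np.array([0.408, 0.817, 0.408])
snr = 10 ** (10 / 10)   # P_x / N_0 = 10 dB (assumed)

theta = np.linspace(-np.pi, np.pi, 1 << 16, endpoint=False)
H2 = np.abs(np.polyval(h[::-1], np.exp(-1j * theta))) ** 2
# SNR_ZF-DFE = (P_x/N_0) * exp{(1/2pi) int log|H|^2 dtheta}
snr_zf_dfe = snr * np.exp(np.mean(np.log(H2)))
print(f"SNR_ZF-DFE = {10 * np.log10(snr_zf_dfe):.2f} dB")
```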

Data Processing with a Mean Square WMF

Choose $a$ so that the noise term has minimum variance:

$\mathcal{I} \ge I\left(x_0 ; x_0 + \underbrace{\textstyle\sum_{k\ge1} \hat\alpha_k x_k + \hat m}_{\text{min. variance}}\right) \triangleq \mathcal{I}_{\mathrm{MMSE}}$

- A tight bound in many cases, but still difficult to compute and analyze
- Some techniques for further bounding were proposed: using probability-of-error bounds and Fano's inequality [Shamai-Laroia 96], and using a mismatched mutual information approach [Jeong-Moon 12]
- However, none of the resulting bounds is both simple and tight

The Shamai-Laroia Conjecture (SLC)

[Shamai-Laroia 96] conjectured that $\mathcal{I}_{\mathrm{MMSE}}$ is lower bounded by the expression obtained when $x_1^{\infty}$ is replaced with $g_1^{\infty}$, i.i.d. Gaussian of equal variance:

$\mathcal{I}_{\mathrm{MMSE}} = I\left(x_0 ; x_0 + \textstyle\sum_{k\ge1}\hat\alpha_k x_k + \hat m\right) \ge I\left(x_0 ; x_0 + \textstyle\sum_{k\ge1}\hat\alpha_k g_k + \hat m\right) = I_x(\mathrm{SNR}_{\text{MMSE-DFE-U}}) \triangleq \mathcal{I}_{\mathrm{SL}}$

$\mathrm{SNR}_{\text{MMSE-DFE-U}}$ is the output SNR of the unbiased MMSE DFE:

$\mathrm{SNR}_{\text{MMSE-DFE-U}} = \exp\left\{\frac{1}{2\pi}\int_{-\pi}^{\pi} \log\left(1 + \frac{P_x}{N_0}|H(\theta)|^2\right) d\theta\right\} - 1$

$\mathcal{I}_{\mathrm{SL}}$ is a simple, tight and useful approximation for $\mathcal{I}_{\mathrm{MMSE}}$ and $\mathcal{I}$.
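A hedged sketch (illustrative, not from the deck) computing $\mathrm{SNR}_{\text{MMSE-DFE-U}}$ and the resulting $\mathcal{I}_{\mathrm{SL}}$ for BPSK input, using the standard Gauss-Hermite evaluation of the scalar BPSK mutual information; the operating SNR is an assumption:

```python
import numpy as np

h = np.array([0.408, 0.817, 0.408])
snr = 10 ** (10 / 10)   # P_x / N_0 = 10 dB (assumed)

theta = np.linspace(-np.pi, np.pi, 1 << 16, endpoint=False)
H2 = np.abs(np.polyval(h[::-1], np.exp(-1j * theta))) ** 2
snr_u = np.exp(np.mean(np.log(1 + snr * H2))) - 1   # SNR_MMSE-DFE-U

# Scalar BPSK mutual information, I(gamma) = gamma - E[ln cosh(gamma + sqrt(gamma) Z)]
# nats, Z ~ N(0,1), evaluated by Gauss-Hermite quadrature and converted to bits.
z, w = np.polynomial.hermite_e.hermegauss(127)
w /= w.sum()
i_sl = (snr_u - w @ np.log(np.cosh(snr_u + np.sqrt(snr_u) * z))) / np.log(2)
print(f"SNR_MMSE-DFE-U = {10 * np.log10(snr_u):.2f} dB, I_SL ~ {i_sl:.3f} bits")
```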

The Shamai-Laroia Conjecture: Example

BPSK input, $h = [0.408, 0.817, 0.408]$ (moderate ISI severity)

[Figure: information (bits) vs. $P_x/N_0$ (dB), comparing $\mathcal{I}_{\mathrm{MMSE}}$, $\mathcal{I}_{\mathrm{SL}}$, the Gaussian upper bound and the Shamai-Ozarow-Wyner lower bound]

The Shamai-Laroia Conjecture: Example, cont.

BPSK input, $h = [0.408, 0.817, 0.408]$ (moderate ISI severity)

[Figure: zoomed view of the same curves: $\mathcal{I}_{\mathrm{MMSE}}$, $\mathcal{I}_{\mathrm{SL}}$, Gaussian upper bound, Shamai-Ozarow-Wyner lower bound]

The Strong SLC and its Refutation

A stronger version of the SLC reads

$I\left(x_0 ; x_0 + \textstyle\sum_{k\ge1}\alpha_k x_k + m\right) \ge I\left(x_0 ; x_0 + \textstyle\sum_{k\ge1}\alpha_k g_k + m\right)$

for every $\alpha$ and $m$ (not just $\hat\alpha$ and $\hat m$). [Abbe-Zheng 12] gave a counterexample:

- Based on a geometrical tool using the Hermite polynomials
- Cannot straightforwardly refute the original SLC
- Uses continuous-valued input distributions (finite alphabet is more interesting)

Outline of Results

We present the following:

- Low SNR analysis: for sufficiently low SNR, $\mathcal{I}_{\mathrm{SL}} > \mathcal{I}_{\mathrm{MMSE}}$ (essentially always), disproving the original SLC
- Counterexamples: carefully constructed counterexamples clearly disprove the SLC; moreover, $\mathcal{I}_{\mathrm{SL}} > \mathcal{I}$ is shown (in a pathological case)
- High SNR analysis: for sufficiently high SNR, $\mathcal{I} > \mathcal{I}_{\mathrm{SL}}$ (for any finite-entropy source)

Universal Low SNR Behavior

Define the normalized cumulants of our (zero-mean) input:

- Skewness $s_x = E x_0^3 / \left[E x_0^2\right]^{3/2}$
- Excess kurtosis $\kappa_x = E x_0^4 / \left(E x_0^2\right)^2 - 3$

For $s_x \ne 0$: $\mathcal{I}_{\mathrm{MMSE}} - \mathcal{I}_{\mathrm{SL}} = -\frac{1}{6} C_3 s_x^2 \epsilon^3 + O(\epsilon^4)$

For $s_x = 0$: $\mathcal{I}_{\mathrm{MMSE}} - \mathcal{I}_{\mathrm{SL}} = -\frac{1}{24} C_4 \kappa_x^2 \epsilon^4 + O(\epsilon^5)$

where, with $\sum_{k\ge1}\hat\alpha_k x_k + \hat m$ the minimum-variance noise term and $\hat\alpha_0 \triangleq 1$:

- $C_m = \left(\sum_{k\ge1}\hat\alpha_k^m\right)^2 / \left(\sum_{k\ge0}\hat\alpha_k^2\right)^m$
- $\epsilon = \left(\sum_{k\ge0}\hat\alpha_k^2\right) P_x / E\hat m^2$ is the smallness parameter; $\epsilon \to 0$ as $P_x/N_0 \to 0$

Hence, $\mathcal{I}_{\mathrm{MMSE}} < \mathcal{I}_{\mathrm{SL}}$ for sufficiently low $P_x/N_0$.

Outline of Proof

Using the chain rule, rewrite $\mathcal{I}_{\mathrm{MMSE}}$ as $\mathcal{I}_{\mathrm{MMSE}} = I^0_{\mathrm{MMSE}} - I^1_{\mathrm{MMSE}}$, where

$I^i_{\mathrm{MMSE}} \triangleq I\left(\textstyle\sum_{k\ge i}\hat\alpha_k x_k ;\; \textstyle\sum_{k\ge i}\hat\alpha_k x_k + \hat m\right)$, with $\hat\alpha_0 \triangleq 1$.

By [Guo-Wu-Shamai-Verdú 11], for any RV $\xi$ and independent Gaussian $\nu$,

$I(\xi ; \xi + \nu) = \frac{\rho}{2} - \frac{\rho^2}{4} + \frac{\rho^3}{12}\left(2 - s_\xi^2\right) - \frac{\rho^4}{48}\left[\kappa_\xi^2 - 12 s_\xi^2 + 6\right] + O\left(\rho^5\right)$

with $\rho = E\xi^2 / E\nu^2$. Apply the above series expansion to $I^0_{\mathrm{MMSE}}$, $I^1_{\mathrm{MMSE}}$ and $\mathcal{I}_{\mathrm{SL}}$ to obtain the desired result.
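The expansion above is reconstructed from a garbled transcription; as a sanity check (a sketch under that assumption, not part of the deck), the series can be compared against the exact mutual information of a discrete input in Gaussian noise, computed by numerical integration:

```python
import numpy as np
from scipy.integrate import quad

p = 0.1                                                  # illustrative skewed binary xi
vals = np.array([np.sqrt(p/(1-p)), -np.sqrt((1-p)/p)])   # zero mean, unit variance
probs = np.array([1-p, p])
s = probs @ vals**3                                      # skewness
kappa = probs @ vals**4 - 3                              # excess kurtosis

def exact_mi(rho):
    """I(xi; xi + nu) in nats, nu ~ N(0, 1/rho), so E xi^2 / E nu^2 = rho."""
    sig = 1 / np.sqrt(rho)
    def f(y):                                            # -p(y) log p(y) of the output
        pdf = probs @ np.exp(-(y-vals)**2/(2*sig**2)) / np.sqrt(2*np.pi*sig**2)
        return -pdf * np.log(pdf)
    hy = quad(f, vals.min()-12*sig, vals.max()+12*sig,
              limit=300, epsabs=1e-12, epsrel=1e-12)[0]
    return hy - 0.5*np.log(2*np.pi*np.e*sig**2)          # h(Y) - h(nu)

for rho in [0.05, 0.02, 0.01]:
    series = rho/2 - rho**2/4 + rho**3/12*(2-s**2) - rho**4/48*(kappa**2-12*s**2+6)
    print(f"rho={rho}: exact={exact_mi(rho):.8f}  series={series:.8f}")
```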

Counterexample Settings

Inter-symbol interference: same as in the previous example (moderate ISI severity)

- A three-tap impulse response, $h_0 = h_2 = 0.408$, $h_1 = 0.817$
- Channel B from Chapter 10 of [Proakis 87]

Input distribution 1: symmetric trinary alphabet $\{-1, 0, 1\}$

- $\Pr(x = 1) = \Pr(x = -1) = 0.01$, $\Pr(x = 0) = 0.98$
- Zero skewness, high excess kurtosis: $\kappa_x = 47$

Input distribution 2: highly skewed binary distribution

- $\Pr(x > 0) = 1 - \Pr(x < 0) = \ldots$
- $s_x \approx 22.3$, $\kappa_x \approx 495$
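A quick sketch (not from the deck) verifying the quoted cumulants; the skewed-binary probability is truncated in the transcription, so $p = 0.002$ below is an assumption chosen because it reproduces $s_x \approx 22.3$ and $\kappa_x \approx 495$ (only $s_x^2$ enters the results, so the sign convention is immaterial):

```python
import numpy as np

def std_cumulants(vals, probs):
    """Skewness and excess kurtosis of a zero-mean discrete distribution."""
    vals, probs = np.asarray(vals, float), np.asarray(probs, float)
    m2 = probs @ vals**2
    return probs @ vals**3 / m2**1.5, probs @ vals**4 / m2**2 - 3

# Input distribution 1: symmetric trinary.
print(std_cumulants([-1, 0, 1], [0.01, 0.98, 0.01]))   # (0.0, 47.0)

# Input distribution 2: p = 0.002 is an assumption matching the quoted cumulants;
# a rare large positive value, zero mean, unit variance.
p = 0.002
vals = [np.sqrt((1-p)/p), -np.sqrt(p/(1-p))]
print(std_cumulants(vals, [p, 1-p]))                   # (~22.3, ~495)
```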

Demonstration of Low SNR Behaviour

- Numerical computation agrees with the low-SNR expansion for $\epsilon$ up to about 0.02
- At low SNRs and moderate to high ISI, the gap $\mathcal{I}_{\mathrm{SL}} - \mathcal{I}_{\mathrm{MMSE}}$ is on the order of $10^{-8}$ times $s_x^2$ or $\kappa_x^2$; hence $s_x \gg 1$ or $\kappa_x \gg 1$ is a must
- This explains why similar low-SNR counterexamples were not previously reported

[Figures: difference in information $\mathcal{I}_{\mathrm{MMSE}} - \mathcal{I}_{\mathrm{SL}}$ (bits) vs. $P_x/N_0$ (dB) for the symmetric trinary input and for the skewed binary input, together with the low-SNR approximation and $\epsilon$ on a secondary axis]

Higher SNRs: Trinary Source

[Figure: information (bits) vs. $P_x/N_0$ (dB) for the trinary source: $\mathcal{I}$, $\mathcal{I}_{\mathrm{MMSE}}$ and $\mathcal{I}_{\mathrm{SL}}$]

- $\mathcal{I}$ evaluated via a Monte-Carlo method
- $\mathcal{I}_{\mathrm{SL}}$ significantly exceeds $\mathcal{I}_{\mathrm{MMSE}}$! (but not $\mathcal{I}$)

Qualitative explanation:

- The noise $\sum_{k\ge1}\hat\alpha_k x_k + \hat m$ has excess kurtosis proportional to $\kappa_x$
- The SL approximating noise term has excess kurtosis 0 (it is Gaussian)
- The quality of the approximation is expected to deteriorate as $\kappa_x$ grows

Higher SNRs: Skewed Binary Source

[Figure: information (bits) vs. $P_x/N_0$ (dB) for the skewed binary source: $\mathcal{I}$, $\mathcal{I}_{\mathrm{MMSE}}$ and $\mathcal{I}_{\mathrm{SL}}$]

- $\mathcal{I}$ evaluated via a Monte-Carlo method
- This time $\mathcal{I}_{\mathrm{SL}} > \mathcal{I}$, as well as $\mathcal{I}_{\mathrm{SL}} > \mathcal{I}_{\mathrm{MMSE}}$
- Thus, even the relaxed SLC, $\mathcal{I} \ge \mathcal{I}_{\mathrm{SL}}$, does not hold for all SNRs

Universal High SNR Behavior

For a finite-entropy input distribution, let

$R[\mathcal{I}] = \lim_{P_x/N_0 \to \infty} \frac{-\log\left[H(x_0) - \mathcal{I}\right]}{P_x/N_0}$

be the rate of convergence to the input entropy. For non-trivial ISI, we show that

$R[\mathcal{I}_{\mathrm{SL}}] \le \frac{1}{2}\left(\frac{d_{\min}}{2}\right)^2 g_{\text{ZF-DFE}} < \frac{1}{2}\left(\frac{\Delta_{\min}}{2}\right)^2 \le R[\mathcal{I}]$

where

- $d_{\min}$ is the minimum distance between (unit-power) input values
- $g_{\text{ZF-DFE}} = \left(\frac{P_x}{N_0}\right)^{-1} \mathrm{SNR}_{\text{ZF-DFE}}$ is the zero-forcing DFE gain factor
- $\Delta_{\min}$ is a free distance associated with optimal equalization

Hence, $\mathcal{I} > \mathcal{I}_{\mathrm{SL}}$ for sufficiently high $P_x/N_0$.

Outline of Proof: Convergence Rate of $\mathcal{I}_{\mathrm{SL}}$

Information-Estimation identity for $H(x_0) - \mathcal{I}_{\mathrm{SL}}$:

Recall that $\mathcal{I}_{\mathrm{SL}} = I_x(\mathrm{SNR}_{\text{MMSE-DFE-U}})$. By [Guo-Shamai-Verdú 05] (assuming $Ex^2 = 1$),

$H(x) - I_x(\mathrm{snr}) = \frac{1}{2}\int_{\mathrm{snr}}^{\infty} \mathrm{mmse}_x(\gamma)\, d\gamma$

where $\mathrm{mmse}_x(\gamma) \triangleq E\left(x - E\left[x \mid \sqrt{\gamma}\,x + n\right]\right)^2$, with $n \sim N(0,1)$ independent of $x$.

We therefore need to find an exponentially tight lower bound for $\mathrm{mmse}_x$.
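A small sketch (illustrative, not from the deck) checking the identity numerically for BPSK, where both $I_x$ and $\mathrm{mmse}_x$ have standard Gauss-Hermite expressions; the truncation point of the integral is an assumption justified by the exponential decay of the integrand:

```python
import numpy as np
from scipy.integrate import quad

z, w = np.polynomial.hermite_e.hermegauss(151)
w /= w.sum()                                    # expectation over Z ~ N(0,1)

def i_bpsk(snr):                                # I_x(snr) in nats
    return snr - w @ np.log(np.cosh(snr + np.sqrt(snr) * z))

def mmse_bpsk(g):                               # E[x|.] = tanh(g + sqrt(g) Z) given x=1
    return 1.0 - w @ np.tanh(g + np.sqrt(g) * z) ** 2

snr = 2.0
lhs = np.log(2) - i_bpsk(snr)                   # H(x) - I_x(snr), with H(x) = ln 2
rhs = 0.5 * quad(mmse_bpsk, snr, 80.0, limit=200)[0]   # tail beyond 80 is negligible
print(lhs, rhs)                                 # the two agree to quadrature accuracy
```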

Lower Bound on $\mathrm{mmse}_x(\gamma)$

We construct a genie(*) $G$ distributed on $\{0, 1\}$ such that given $G = 1$, $x$ is uniformly distributed on two values with distance $d_{\min}$. We have

$\mathrm{mmse}_x(\gamma) \ge \Pr(G = 1)\, \mathrm{mmse}_{x|G=1}(\gamma)$

and

$\mathrm{mmse}_{x|G=1}(\gamma) = \left(\frac{d_{\min}}{2}\right)^2 \mathrm{mmse}_b\!\left(\left(\frac{d_{\min}}{2}\right)^2 \gamma\right)$

where $b$ is uniformly distributed on $\{-1, 1\}$. Finally, $\mathrm{mmse}_b(\rho) \ge 2Q(\sqrt{\rho})$.

(*) This is based on the analysis in [Lozano-Tulino-Verdú 06], but slightly altered to accommodate non-uniformly distributed inputs.
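A hedged numerical check (not from the deck) of the final inequality as reconstructed above, $\mathrm{mmse}_b(\rho) \ge 2Q(\sqrt{\rho})$, for equiprobable $b \in \{-1, +1\}$:

```python
import numpy as np
from scipy.stats import norm

z, w = np.polynomial.hermite_e.hermegauss(201)
w /= w.sum()                                    # expectation over Z ~ N(0,1)

def mmse_b(rho):
    # For b = +-1: E[b | sqrt(rho) b + n] = tanh(rho b + sqrt(rho) n); by symmetry
    # it suffices to condition on b = 1.
    return 1.0 - w @ np.tanh(rho + np.sqrt(rho) * z) ** 2

for rho in [0.1, 1.0, 4.0, 10.0]:
    print(f"rho={rho:5.1f}  mmse_b={mmse_b(rho):.4e}  "
          f"2Q(sqrt(rho))={2 * norm.sf(np.sqrt(rho)):.4e}")
```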

Putting Everything Together

Using the above results,

$H(x_0) - \mathcal{I}_{\mathrm{SL}} \ge C \exp\left(-\frac{1}{2}\left(\frac{d_{\min}}{2}\right)^2 \mathrm{SNR}_{\text{MMSE-DFE-U}}\right)$

for some $C > 0$ and sufficiently high $P_x/N_0$. It is straightforward (but technical) to show that

$\mathrm{SNR}_{\text{MMSE-DFE-U}} \le \frac{P_x}{N_0}\, g_{\text{ZF-DFE}} + K\left(\frac{P_x}{N_0}\right)^{1-\varepsilon}$

for some $K, \varepsilon > 0$ and sufficiently high $P_x/N_0$. This establishes

$R[\mathcal{I}_{\mathrm{SL}}] = \lim_{P_x/N_0\to\infty} \frac{-\log\left[H(x_0) - \mathcal{I}_{\mathrm{SL}}\right]}{P_x/N_0} \le \frac{1}{2}\left(\frac{d_{\min}}{2}\right)^2 g_{\text{ZF-DFE}}$

Outline of Proof: Convergence Rate of $\mathcal{I}$

Data processing with the ML sequence detector:

Recall that $\mathcal{I} = I\left(x_0 ; y_{-\infty}^{\infty} \mid x_{-\infty}^{-1}\right)$. Since conditioning decreases entropy,

$H(x_0) - \mathcal{I} = H\left(x_0 \mid y_{-\infty}^{\infty}, x_{-\infty}^{-1}\right) \le H\left(x_0 \mid \hat x_0^{\mathrm{ML}}\right)$

where $\{\hat x_i^{\mathrm{ML}}\}_{i=-\infty}^{\infty}$ is the maximum-likelihood sequence estimate of $x$ given $y$. By Fano's inequality,

$H\left(x_0 \mid \hat x_0^{\mathrm{ML}}\right) \le h_2\left(\Pr\left(x_0 \ne \hat x_0^{\mathrm{ML}}\right)\right) + \Pr\left(x_0 \ne \hat x_0^{\mathrm{ML}}\right) \log |\mathcal{X}|$

with $h_2(\cdot)$ the binary entropy function and $|\mathcal{X}|$ the input alphabet cardinality.
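For concreteness, a tiny sketch (with an assumed symbol error rate) of the Fano bound in the form used on this slide, in bits:

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def fano_bound(pe, alphabet_size):
    # H(x0 | x0_hat) <= h2(Pe) + Pe * log2|X|
    return h2(pe) + pe * np.log2(alphabet_size)

print(fano_bound(1e-3, 3))   # e.g. Pe = 1e-3 on a trinary alphabet: ~0.013 bits
```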

Forney's Error Probability Bound

In [Forney 72] it was shown that

$\Pr\left(x_0 \ne \hat x_0^{\mathrm{ML}}\right) \le K\, Q\!\left(\sqrt{\frac{P_x}{N_0}\left(\frac{\Delta_{\min}}{2}\right)^2}\right)$

for some $K > 0$, where

$\Delta_{\min}^2 = \inf_{N \ge 1}\; \min_{\substack{x_0^{N-1},\, \tilde x_0^{N-1} \ \text{s.t.} \\ x_0 \ne \tilde x_0,\; x_{N-1} \ne \tilde x_{N-1}}} \Delta^2\!\left(x_0^{N-1}, \tilde x_0^{N-1}\right)$

and

$\Delta^2\!\left(x_0^{N-1}, \tilde x_0^{N-1}\right) = \sum_{k=0}^{L+N-2}\left|\sum_{l=0}^{N-1}\left(x_l - \tilde x_l\right) h_{k-l}\right|^2$

is the squared Euclidean distance between the finite sequences $h * x$ and $h * \tilde x$.
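As a sketch (not from the deck), $\Delta^2_{\min}$ can be found for BPSK ($d_{\min} = 2$ at unit power) by brute force over short error sequences $e = x - \tilde x$ with entries in $\{-2, 0, 2\}$:

```python
import itertools
import numpy as np

h = np.array([0.408, 0.817, 0.408])
best = np.inf
for N in range(1, 8):
    for tail in itertools.product([-2, 0, 2], repeat=N - 1):
        e = np.array([2.0, *tail])   # WLOG e_0 = +2; trailing zeros repeat shorter N
        best = min(best, np.sum(np.convolve(h, e) ** 2))
print(f"Delta_min^2 ~ {best:.3f}  "
      f"(single-error value d_min^2 * ||h||^2 = {4 * np.sum(h**2):.3f})")
```

For this channel the minimizing error event spans several symbols, so $\Delta^2_{\min}$ (about 2.67) falls below the single-error, matched-filter-bound value of about 4.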

Bounding $\Delta_{\min}$ with $g_{\text{ZF-DFE}}$

Putting Fano and Forney together, we establish

$R[\mathcal{I}] = \lim_{P_x/N_0\to\infty} \frac{-\log\left[H(x_0) - \mathcal{I}\right]}{P_x/N_0} \ge \frac{1}{2}\left(\frac{\Delta_{\min}}{2}\right)^2$

Finally, notice that for any non-trivial ISI channel ($L > 1$),

$\Delta^2\!\left(x_0^{N-1}, \tilde x_0^{N-1}\right) \ge \left|\left(x_0 - \tilde x_0\right) h_0\right|^2 + \left|\left(x_{N-1} - \tilde x_{N-1}\right) h_{L-1}\right|^2 \ge d_{\min}^2\left(|h_0|^2 + |h_{L-1}|^2\right) > d_{\min}^2 |h_0|^2$

Assuming w.l.o.g. that $h_0^{L-1}$ is minimum phase,

$|h_0|^2 = \exp\left\{\frac{1}{2\pi}\int_{-\pi}^{\pi} \log|H(\theta)|^2\, d\theta\right\} = g_{\text{ZF-DFE}}$

Hence, $\Delta_{\min}^2 > d_{\min}^2\, g_{\text{ZF-DFE}}$.
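A sketch (not from the deck) of the minimum-phase identity $|h_0|^2 = g_{\text{ZF-DFE}}$: the example taps are not minimum phase, so the zero outside the unit circle is first reflected inside (which preserves $|H(\theta)|$ up to an energy rescale):

```python
import numpy as np

h = np.array([0.408, 0.817, 0.408])
zeros = np.roots(h)                        # zeros of h0 z^2 + h1 z + h2
refl = np.where(np.abs(zeros) > 1, 1 / np.conj(zeros), zeros)
h_mp = np.real(np.poly(refl))              # monic minimum-phase taps...
h_mp *= np.sqrt(h @ h / (h_mp @ h_mp))     # ...rescaled to the same energy

theta = np.linspace(-np.pi, np.pi, 1 << 16, endpoint=False)
H2 = np.abs(np.polyval(h_mp[::-1], np.exp(-1j * theta))) ** 2
# Leading-tap power vs. geometric mean of |H|^2 (= g_ZF-DFE): both ~0.184 here.
print(h_mp[0] ** 2, np.exp(np.mean(np.log(H2))))
```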

Conclusion

- Conjectured in 1996, the SLC $\mathcal{I}_{\mathrm{MMSE}} \ge \mathcal{I}_{\mathrm{SL}}$ is now finally disproved
- $\mathcal{I} \ge \mathcal{I}_{\mathrm{SL}}$ also doesn't hold in general (skewed binary input); however, it will hold above some SNR threshold, and in most cases it seems this threshold is 0...

Open issues:

- Can a threshold independent of the ISI channel be found?
- For conventional inputs, such as BPSK, can the threshold be proven to be very low, or even 0?

A new use for $\mathcal{I}_{\mathrm{SL}}$: an upper bound on the OFDM information rate with i.i.d. inputs; more details at IZS 2014 (or online)!

References

D.M. Arnold, H.-A. Loeliger, P.O. Vontobel, A. Kavcic, and W. Zeng, "Simulation-based computation of information rates for channels with memory," IEEE Transactions on Information Theory, 52(8), 2006.

S. Shamai, L.H. Ozarow, and A.D. Wyner, "Information rates for a discrete-time Gaussian channel with intersymbol interference and stationary inputs," IEEE Transactions on Information Theory, 37(6), 1991.

S. Shamai and R. Laroia, "The intersymbol interference channel: Lower bounds on capacity and channel precoding loss," IEEE Transactions on Information Theory, 42(5), 1996.

S. Jeong and J. Moon, "Easily computed lower bounds on the information rate of intersymbol interference channels," IEEE Transactions on Information Theory, 58(2), 2012.

E. Abbe and L. Zheng, "A coordinate system for Gaussian networks," IEEE Transactions on Information Theory, 58(2), 2012.

D. Guo, Y. Wu, S. Shamai, and S. Verdú, "Estimation in Gaussian noise: Properties of the minimum mean-square error," IEEE Transactions on Information Theory, 57(4), 2011.

J.G. Proakis, Digital Communications, McGraw-Hill, 1987.

D. Guo, S. Shamai, and S. Verdú, "Mutual information and minimum mean-square error in Gaussian channels," IEEE Transactions on Information Theory, 51(4), 2005.

A. Lozano, A.M. Tulino, and S. Verdú, "Optimum power allocation for parallel Gaussian channels with arbitrary input distributions," IEEE Transactions on Information Theory, 52(7), 2006.

G.D. Forney, Jr., "Maximum-likelihood sequence estimation of digital sequences in the presence of intersymbol interference," IEEE Transactions on Information Theory, 18(3), 1972.

Thank You

Abstract: On the Shamai-Laroia Approximation for the Information Rate of the ISI Channel
Yair Carmon and Shlomo Shamai (Shitz)

The approximation proposed by Shamai and Laroia for the achievable rate in the intersymbol interference (ISI) channel with fixed i.i.d. inputs is considered. We investigate the conjecture that this approximation is a lower bound on the achievable rate. This conjecture is a slightly relaxed version of the original Shamai-Laroia conjecture, which has recently been disproved. It is shown that for any discrete input distribution and ISI channel, the lower bound holds in the high-SNR regime. Numerical evidence indicates, however, that even this relaxed version of the conjecture does not hold in general.


More information

EE/Stat 376B Handout #5 Network Information Theory October, 14, Homework Set #2 Solutions

EE/Stat 376B Handout #5 Network Information Theory October, 14, Homework Set #2 Solutions EE/Stat 376B Handout #5 Network Information Theory October, 14, 014 1. Problem.4 parts (b) and (c). Homework Set # Solutions (b) Consider h(x + Y ) h(x + Y Y ) = h(x Y ) = h(x). (c) Let ay = Y 1 + Y, where

More information

Channel Dependent Adaptive Modulation and Coding Without Channel State Information at the Transmitter

Channel Dependent Adaptive Modulation and Coding Without Channel State Information at the Transmitter Channel Dependent Adaptive Modulation and Coding Without Channel State Information at the Transmitter Bradford D. Boyle, John MacLaren Walsh, and Steven Weber Modeling & Analysis of Networks Laboratory

More information

Nearest Neighbor Decoding in MIMO Block-Fading Channels With Imperfect CSIR

Nearest Neighbor Decoding in MIMO Block-Fading Channels With Imperfect CSIR IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 58, NO. 3, MARCH 2012 1483 Nearest Neighbor Decoding in MIMO Block-Fading Channels With Imperfect CSIR A. Taufiq Asyhari, Student Member, IEEE, Albert Guillén

More information

A Deterministic Algorithm for the Capacity of Finite-State Channels

A Deterministic Algorithm for the Capacity of Finite-State Channels A Deterministic Algorithm for the Capacity of Finite-State Channels arxiv:1901.02678v1 [cs.it] 9 Jan 2019 Chengyu Wu Guangyue Han Brian Marcus University of Hong Kong University of Hong Kong University

More information

Estimation of the Capacity of Multipath Infrared Channels

Estimation of the Capacity of Multipath Infrared Channels Estimation of the Capacity of Multipath Infrared Channels Jeffrey B. Carruthers Department of Electrical and Computer Engineering Boston University jbc@bu.edu Sachin Padma Department of Electrical and

More information

Data Detection for Controlled ISI. h(nt) = 1 for n=0,1 and zero otherwise.

Data Detection for Controlled ISI. h(nt) = 1 for n=0,1 and zero otherwise. Data Detection for Controlled ISI *Symbol by symbol suboptimum detection For the duobinary signal pulse h(nt) = 1 for n=0,1 and zero otherwise. The samples at the output of the receiving filter(demodulator)

More information

Performance Analysis of Spread Spectrum CDMA systems

Performance Analysis of Spread Spectrum CDMA systems 1 Performance Analysis of Spread Spectrum CDMA systems 16:33:546 Wireless Communication Technologies Spring 5 Instructor: Dr. Narayan Mandayam Summary by Liang Xiao lxiao@winlab.rutgers.edu WINLAB, Department

More information

On the Duality between Multiple-Access Codes and Computation Codes

On the Duality between Multiple-Access Codes and Computation Codes On the Duality between Multiple-Access Codes and Computation Codes Jingge Zhu University of California, Berkeley jingge.zhu@berkeley.edu Sung Hoon Lim KIOST shlim@kiost.ac.kr Michael Gastpar EPFL michael.gastpar@epfl.ch

More information

LECTURE 13. Last time: Lecture outline

LECTURE 13. Last time: Lecture outline LECTURE 13 Last time: Strong coding theorem Revisiting channel and codes Bound on probability of error Error exponent Lecture outline Fano s Lemma revisited Fano s inequality for codewords Converse to

More information

LECTURE 3. Last time:

LECTURE 3. Last time: LECTURE 3 Last time: Mutual Information. Convexity and concavity Jensen s inequality Information Inequality Data processing theorem Fano s Inequality Lecture outline Stochastic processes, Entropy rate

More information

On the Feedback Capacity of Stationary Gaussian Channels

On the Feedback Capacity of Stationary Gaussian Channels On the Feedback Capacity of Stationary Gaussian Channels Young-Han Kim Information Systems Laboratory Stanford University Stanford, CA 94305-950 yhk@stanford.edu Abstract The capacity of stationary additive

More information

On the Secrecy Capacity of the Z-Interference Channel

On the Secrecy Capacity of the Z-Interference Channel On the Secrecy Capacity of the Z-Interference Channel Ronit Bustin Tel Aviv University Email: ronitbustin@post.tau.ac.il Mojtaba Vaezi Princeton University Email: mvaezi@princeton.edu Rafael F. Schaefer

More information

Optimal Power Control for LDPC Codes in Block-Fading Channels

Optimal Power Control for LDPC Codes in Block-Fading Channels IEEE TRANSACTIONS ON COMMUNICATIONS, VOL. 59, NO. 7, JULY 2011 1759 Optimal Power Control for LDPC Codes in Block-Fading Channels Gottfried Lechner, Khoa D. Nguyen, Albert Guillén i Fàbregas, and Lars

More information

Approximately achieving the feedback interference channel capacity with point-to-point codes

Approximately achieving the feedback interference channel capacity with point-to-point codes Approximately achieving the feedback interference channel capacity with point-to-point codes Joyson Sebastian*, Can Karakus*, Suhas Diggavi* Abstract Superposition codes with rate-splitting have been used

More information

Error Exponent Region for Gaussian Broadcast Channels

Error Exponent Region for Gaussian Broadcast Channels Error Exponent Region for Gaussian Broadcast Channels Lihua Weng, S. Sandeep Pradhan, and Achilleas Anastasopoulos Electrical Engineering and Computer Science Dept. University of Michigan, Ann Arbor, MI

More information

Diversity-Multiplexing Tradeoff in MIMO Channels with Partial CSIT. ECE 559 Presentation Hoa Pham Dec 3, 2007

Diversity-Multiplexing Tradeoff in MIMO Channels with Partial CSIT. ECE 559 Presentation Hoa Pham Dec 3, 2007 Diversity-Multiplexing Tradeoff in MIMO Channels with Partial CSIT ECE 559 Presentation Hoa Pham Dec 3, 2007 Introduction MIMO systems provide two types of gains Diversity Gain: each path from a transmitter

More information

Design of MMSE Multiuser Detectors using Random Matrix Techniques

Design of MMSE Multiuser Detectors using Random Matrix Techniques Design of MMSE Multiuser Detectors using Random Matrix Techniques Linbo Li and Antonia M Tulino and Sergio Verdú Department of Electrical Engineering Princeton University Princeton, New Jersey 08544 Email:

More information

The Poisson Channel with Side Information

The Poisson Channel with Side Information The Poisson Channel with Side Information Shraga Bross School of Enginerring Bar-Ilan University, Israel brosss@macs.biu.ac.il Amos Lapidoth Ligong Wang Signal and Information Processing Laboratory ETH

More information

Expectation propagation for signal detection in flat-fading channels

Expectation propagation for signal detection in flat-fading channels Expectation propagation for signal detection in flat-fading channels Yuan Qi MIT Media Lab Cambridge, MA, 02139 USA yuanqi@media.mit.edu Thomas Minka CMU Statistics Department Pittsburgh, PA 15213 USA

More information

Decoupling of CDMA Multiuser Detection via the Replica Method

Decoupling of CDMA Multiuser Detection via the Replica Method Decoupling of CDMA Multiuser Detection via the Replica Method Dongning Guo and Sergio Verdú Dept. of Electrical Engineering Princeton University Princeton, NJ 08544, USA email: {dguo,verdu}@princeton.edu

More information

On Gaussian Interference Channels with Mixed Gaussian and Discrete Inputs

On Gaussian Interference Channels with Mixed Gaussian and Discrete Inputs On Gaussian nterference Channels with Mixed Gaussian and Discrete nputs Alex Dytso Natasha Devroye and Daniela Tuninetti University of llinois at Chicago Chicago L 60607 USA Email: odytso devroye danielat

More information

Draft. On Limiting Expressions for the Capacity Region of Gaussian Interference Channels. Mojtaba Vaezi and H. Vincent Poor

Draft. On Limiting Expressions for the Capacity Region of Gaussian Interference Channels. Mojtaba Vaezi and H. Vincent Poor Preliminaries Counterexample Better Use On Limiting Expressions for the Capacity Region of Gaussian Interference Channels Mojtaba Vaezi and H. Vincent Poor Department of Electrical Engineering Princeton

More information

DETECTION theory deals primarily with techniques for

DETECTION theory deals primarily with techniques for ADVANCED SIGNAL PROCESSING SE Optimum Detection of Deterministic and Random Signals Stefan Tertinek Graz University of Technology turtle@sbox.tugraz.at Abstract This paper introduces various methods for

More information

ECE598: Information-theoretic methods in high-dimensional statistics Spring 2016

ECE598: Information-theoretic methods in high-dimensional statistics Spring 2016 ECE598: Information-theoretic methods in high-dimensional statistics Spring 06 Lecture : Mutual Information Method Lecturer: Yihong Wu Scribe: Jaeho Lee, Mar, 06 Ed. Mar 9 Quick review: Assouad s lemma

More information