Coding over Interference Channels: An Information-Estimation View

Coding over Interference Channels: An Information-Estimation View
Shlomo Shamai, Department of Electrical Engineering, Technion - Israel Institute of Technology
Information Systems Laboratory Colloquium (ISLC), Stanford, 7 February 2013
Joint work with Ronit Bustin, Technion

Outline
- I-MMSE scalar and vector relations.
- Good and bad codes.
- Information versus MMSE disturbance.
- Degrees-of-freedom: an MMSE dimension perspective.
- Outlook.

I-MMSE: Scalar Setting
The I-MMSE relation [Guo-Shamai-Verdú, IT 05]:
$Y = \sqrt{snr}\, X + N$
where $X$ is the input signal, $Y$ the output signal, $N \sim \mathcal{N}(0,1)$ Gaussian noise, and $snr$ the signal-to-noise ratio. Then
$\frac{d}{d\,snr}\, I(X;Y) = \frac{1}{2}\, mmse(X:snr)$
with
$mmse(X \mid Y) = mmse(X:snr) = mmse(snr) = E\big(X - E(X \mid Y)\big)^2.$

I-MMSE: Examples
Gaussian example: $X \sim \mathcal{N}(0,1)$.
$mmse(X:snr) = E\left(X - \frac{\sqrt{snr}}{1+snr}\,Y\right)^2 = \frac{1}{1+snr}$, and $I(X;Y) = I_g(snr) = \frac{1}{2}\log(1+snr)$.
Binary example: $X = \pm 1$, symmetric.
$mmse(X:snr) = 1 - \int \frac{e^{-y^2/2}}{\sqrt{2\pi}} \tanh\!\left(snr - \sqrt{snr}\,y\right) dy$
$I(X;Y) = I_b(snr) = snr - \int \frac{e^{-y^2/2}}{\sqrt{2\pi}} \log\cosh\!\left(snr - \sqrt{snr}\,y\right) dy$

[Figure: $I_g(snr)$, $I_b(snr)$ and the corresponding $mmse(X_g:snr)$, $mmse(X_b:snr)$ versus snr, illustrating $\frac{d}{d\,snr}\, I(X;Y) = \frac{1}{2}\, mmse(X:snr)$.]
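
As a quick sanity check, here is a minimal numerical sketch (not part of the original slides) that evaluates the binary-input formulas above by Gauss-Hermite quadrature and verifies $\frac{d}{d\,snr} I(X;Y) = \frac{1}{2}\,mmse(X:snr)$ at one snr point:

```python
import numpy as np

# Gauss-Hermite quadrature for expectations over Y ~ N(0, 1).
nodes, weights = np.polynomial.hermite_e.hermegauss(80)
weights = weights / np.sqrt(2.0 * np.pi)  # normalize e^{-y^2/2} weight to N(0,1)

def mmse_bpsk(snr):
    # mmse(X:snr) = 1 - E[tanh(snr - sqrt(snr) Y)]
    z = snr - np.sqrt(snr) * nodes
    return 1.0 - np.sum(weights * np.tanh(z))

def mi_bpsk(snr):
    # I_b(snr) = snr - E[log cosh(snr - sqrt(snr) Y)]   (nats)
    z = snr - np.sqrt(snr) * nodes
    log_cosh = np.logaddexp(z, -z) - np.log(2.0)  # numerically stable log cosh
    return snr - np.sum(weights * log_cosh)

snr, h = 2.0, 1e-4
d_i = (mi_bpsk(snr + h) - mi_bpsk(snr - h)) / (2.0 * h)  # dI/dsnr
print(d_i, 0.5 * mmse_bpsk(snr))  # the two agree to quadrature/step precision
```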

I-MMSE: The Vector Relation
Theorem (Guo-Shamai-Verdú, IT 05)
Let $Y = \sqrt{snr}\,HX + N$, where $E\|X\|^2 < \infty$ and $N \sim \mathcal{N}(0,I)$. Then
$\frac{d}{d\,snr}\, I(X; \sqrt{snr}\,HX + N) = \frac{1}{2}\, mmse(HX \mid \sqrt{snr}\,HX + N) = \frac{1}{2}\, mmse(snr) = \frac{1}{2}\, E\|HX - H\,E\{X \mid Y\}\|^2 = \frac{1}{2}\,\mathrm{Tr}\big(H\,E_X(snr)\,H^T\big)$
where $E_X(snr)$ is the MMSE matrix of estimating $X$ from $Y$.

Special Case: MIMO Channel with Gaussian Inputs
$Y = \sqrt{snr}\,HX + N$, with $X$ i.i.d. standard Gaussian.
$I(X;Y) = \frac{1}{2}\log\det\!\left(I + snr\,H^T H\right)$
Error covariance matrix:
$E\left\{(X-\hat{X})(X-\hat{X})^T\right\} = \left(I + snr\,H^T H\right)^{-1}$
$mmse(snr) = E\left\{\|HX - H\hat{X}\|^2\right\} = \mathrm{Tr}\left\{\left(I + snr\,H^T H\right)^{-1} H^T H\right\}$
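
A small numerical sketch (the 3x3 matrix H is an arbitrary illustrative choice) verifying the Gaussian-input relation above by a finite difference:

```python
import numpy as np

rng = np.random.default_rng(0)
H = rng.standard_normal((3, 3))
G = H.T @ H
I3 = np.eye(3)

def mi(snr):
    # (1/2) logdet(I + snr H^T H)
    return 0.5 * np.linalg.slogdet(I3 + snr * G)[1]

def mmse(snr):
    # Tr[(I + snr H^T H)^{-1} H^T H] via a linear solve, no explicit inverse
    return np.trace(np.linalg.solve(I3 + snr * G, G))

snr, h = 1.5, 1e-5
print((mi(snr + h) - mi(snr - h)) / (2 * h), 0.5 * mmse(snr))  # both equal
```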

Some Abbreviations
$\gamma$ designates the signal-to-noise ratio; $n$ is the length of the input/output vectors.
$I_n(\gamma) = \frac{1}{n}\,I(X;Y(\gamma)), \qquad I(\gamma) = \lim_{n\to\infty} \frac{1}{n}\,I(X;Y(\gamma))$
$MMSE_{cn}(\gamma) = \frac{1}{n}\,\mathrm{Tr}(E_X(\gamma)), \qquad MMSE_c(\gamma) = \lim_{n\to\infty} MMSE_{cn}(\gamma)$

The Single Crossing Point Property
Theorem (Guo-Wu-Shamai-Verdú, IT 11)
Define $f(\gamma) \triangleq (1+\gamma)^{-1} - mmse(X:\gamma)$. If $X$ is not standard Gaussian, $f$ has at most one zero crossing (which can occur only when $\mathrm{Var}(X) > 1$). If $f(snr_0) = 0$, then
1. $f(0) \le 0$;
2. $f(\gamma)$ is strictly increasing on $\gamma \in [0, snr_0]$;
3. $f(\gamma) \ge 0$ for every $\gamma \in (snr_0, \infty)$; and
4. $\lim_{\gamma\to\infty} f(\gamma) = 0$.
[Figure: $f(\gamma)$ versus $\gamma$, crossing zero once and tending to 0.]
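
An illustrative sketch (the scaled-BPSK input and its variance 2 are assumptions for the demo) showing the single crossing for a non-Gaussian input with Var(X) > 1:

```python
import numpy as np

# Scaled BPSK X = sigma * (+/-1) with Var(X) = sigma^2 = 2 > 1, so
# f(γ) = (1+γ)^{-1} - mmse(X:γ) starts negative and crosses zero exactly once.
nodes, weights = np.polynomial.hermite_e.hermegauss(80)
weights = weights / np.sqrt(2.0 * np.pi)

def mmse_scaled_bpsk(g, sigma2=2.0):
    s = g * sigma2  # effective snr seen by the embedded +/-1 input
    return sigma2 * (1.0 - np.sum(weights * np.tanh(s - np.sqrt(s) * nodes)))

for g in (0.0, 0.5, 1.0, 2.0, 5.0, 10.0):
    f = 1.0 / (1.0 + g) - mmse_scaled_bpsk(g)
    print(g, f)  # negative at first, then a single crossing to positive
```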

The Single Crossing Point Property - Extension
Several MIMO extensions are given in [Bustin, Payaró, Palomar and Shamai, IT 13]; we give here only the simplest.
Theorem (Bustin, Payaró, Palomar and Shamai, IT 13)
The function
$q(X, \sigma^2, \gamma) = \frac{\sigma^2}{1+\gamma\sigma^2} - \frac{1}{n}\,\mathrm{Tr}(E_X(\gamma))$
has no nonnegative-to-negative zero crossings and, at most, a single negative-to-nonnegative zero crossing in the range $\gamma \in [0,\infty)$. Moreover, if $snr_0 \in [0,\infty)$ is a negative-to-nonnegative crossing point, then:
- $q(X, \sigma^2, 0) \le 0$;
- $q(X, \sigma^2, \gamma)$ is strictly increasing on $\gamma \in [0, snr_0)$;
- $q(X, \sigma^2, \gamma) \ge 0$ for all $\gamma \in [snr_0, \infty)$;
- $\lim_{\gamma\to\infty} q(X, \sigma^2, \gamma) = 0$.

Optimal Point-to-Point Codes
Looking at code sequences over the scalar Gaussian channel.
Theorem (Peleg, Sanderovich and Shamai, ETT 07)
For every capacity-achieving code sequence $C_n$ over the Gaussian channel, as $n \to \infty$ the mutual information is
$I(\gamma) = \lim_{n\to\infty} \frac{1}{n}\,I(X; \sqrt{\gamma}\,X + N) = \begin{cases} \frac{1}{2}\log(1+\gamma), & \gamma \le snr \\ \frac{1}{2}\log(1+snr), & \text{otherwise} \end{cases}$
and the MMSE is
$MMSE_c(\gamma) = \begin{cases} \frac{1}{1+\gamma}, & \gamma \le snr \\ 0, & \text{otherwise} \end{cases}$

Optimal Point-to-Point Codes - Cont.
The mutual information (and MMSE) of optimal point-to-point codes follow the behavior of an i.i.d. Gaussian input up to snr.
[Figure: $I_{opt}(\gamma)$ and $MMSE_{opt}(\gamma)$ versus $\gamma$; the area under the MMSE curve equals $R$.]
A good code must exhibit a threshold (phase transition).
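
A minimal sketch (snr = 3.0 is an arbitrary design point) checking that half the area under $MMSE_c$ recovers the rate $\frac{1}{2}\log(1+snr)$:

```python
import numpy as np

snr = 3.0
gamma = np.linspace(0.0, snr, 200001)
mmse_c = 1.0 / (1.0 + gamma)  # MMSE_c(γ) = 1/(1+γ) for γ <= snr, 0 beyond

# trapezoid rule for (1/2) ∫_0^snr MMSE_c(γ) dγ
area = np.sum(0.5 * (mmse_c[1:] + mmse_c[:-1]) * np.diff(gamma))
print(0.5 * area, 0.5 * np.log1p(snr))  # both ≈ (1/2) log(1 + snr)
```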

EXIT Charts
The extrinsic information transfer (EXIT) chart of a good code behaves as a step function (at capacity R). For iterative decoding with not-too-strong component codes, the relevant observations are:
- Area Theorem [Ashikhmin, Kramer, ten Brink, IT 04].
- GEXIT [Measson-Montanari-Richardson-Urbanke, IT 09].

What Is the Effect on an Unintended Receiver?
A transmitter X communicates at rate R to the intended receiver Y, while also reaching an unintended receiver Z.
Assumption: the unintended receiver Z has the smaller snr, i.e., $snr_z < snr_y$.
How should we measure the effect (the disturbance)? For optimal point-to-point codes both the mutual information and the MMSE are completely known. But what about non-optimal codes (that do not attain capacity)?

Bad Codes & The Interference Channel
Bad codes: trade rate versus MMSE (interference) at lower snr (rate below capacity) [Bennatan-Shamai-Calderbank, IT-S 11].
Interference channels:
- The interfering codeword is not decodable.
- Interference networks with point-to-point codes [Baccelli-El Gamal-Tse, ISIT 11].
- Soft partial interference cancellation.

Bad Codes in Multiterminal Systems: MMSE Bounds
I-MMSE application to deterministic finite-length codes:
- MI-Bound: I-MMSE, invoking properties in [Wiechman-Sason, IT 07].
- BP-Bound: based on tools in [Richardson-Urbanke, IT 01].

Problem Definition - Single MMSE Constraint
Assume $snr_1 < snr_2$ and $MMSE_c(snr_1)$ is limited to some value. What is the maximum possible rate at $snr_2$? In other words:
$\max I(snr_2) \quad \text{s.t.} \quad MMSE_c(snr_1) \le \frac{\beta}{1+\beta\,snr_1}$
for some $\beta \in [0,1]$.

Superposition Codes
The mutual information $I(\gamma)$ and $MMSE_c(\gamma)$ of an optimal Gaussian superposition code are known exactly, for all $\gamma$.
Theorem (Merhav, Guo and Shamai, IT 10)
A superposition codebook designed for $(snr_1, snr_2)$ with rate-splitting coefficient $\beta < 1$ has the following mutual information:
$I(\gamma) = \begin{cases} \frac{1}{2}\log(1+\gamma), & 0 \le \gamma < snr_1 \\ \frac{1}{2}\log\frac{1+snr_1}{1+\beta\,snr_1} + \frac{1}{2}\log(1+\beta\gamma), & snr_1 \le \gamma \le snr_2 \\ \frac{1}{2}\log\frac{1+snr_1}{1+\beta\,snr_1} + \frac{1}{2}\log(1+\beta\,snr_2), & snr_2 < \gamma \end{cases}$
and the following $MMSE_c(\gamma)$:
$MMSE_c(\gamma) = \begin{cases} \frac{1}{1+\gamma}, & 0 \le \gamma < snr_1 \\ \frac{\beta}{1+\beta\gamma}, & snr_1 \le \gamma \le snr_2 \\ 0, & snr_2 < \gamma \end{cases}$
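
A sketch of these piecewise curves, with the parameters of the example slide below ($snr_1 = 2$, $snr_2 = 2.5$, $\beta = 0.4$), checking the I-MMSE consistency $I(\gamma) = \frac{1}{2}\int_0^\gamma MMSE_c(\tau)\,d\tau$ numerically:

```python
import numpy as np

snr1, snr2, beta = 2.0, 2.5, 0.4

def mmse_c(g):
    # piecewise MMSE of the superposition code
    return np.where(g < snr1, 1.0 / (1.0 + g),
           np.where(g <= snr2, beta / (1.0 + beta * g), 0.0))

def i_sc(g):
    # piecewise mutual information of the superposition code
    if g < snr1:
        return 0.5 * np.log1p(g)
    head = 0.5 * np.log((1.0 + snr1) / (1.0 + beta * snr1))
    return head + 0.5 * np.log1p(beta * min(g, snr2))

for g in (1.0, 2.2, 3.0):
    tau = np.linspace(0.0, g, 200001)
    y = mmse_c(tau)
    area = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(tau))  # trapezoid rule
    print(g, 0.5 * area, i_sc(g))  # the last two columns agree
```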

Superposition Codes - Example
[Figure: $I(\gamma)$ and $MMSE_c(\gamma)$ of a superposition code with $snr_1 = 2$, $snr_2 = 2.5$, $\beta = 0.4$, compared against $I_{opt}(\gamma)$ and $MMSE_{opt}(\gamma)$.]

Solution - Single MMSE Constraint
Theorem (Bustin and Shamai, IT 13)
Assuming $snr_1 < snr_2$, the solution of the optimization problem
$\max I(snr_2) \quad \text{s.t.} \quad MMSE_c(snr_1) \le \frac{\beta}{1+\beta\,snr_1}$
for some $\beta \in [0,1]$, is
$I(snr_2) = \frac{1}{2}\log(1+\beta\,snr_2) + \frac{1}{2}\log\frac{1+snr_1}{1+\beta\,snr_1}$
and is attainable when using the optimal Gaussian superposition codebook designed for $(snr_1, snr_2)$ with rate-splitting coefficient $\beta$.
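
Continuing with the same illustrative parameters, the constrained-optimal rate of the theorem can be evaluated directly; it coincides with the superposition code's $I(snr_2)$ from the previous sketch:

```python
import numpy as np

snr1, snr2, beta = 2.0, 2.5, 0.4
r_max = 0.5 * np.log1p(beta * snr2) \
      + 0.5 * np.log((1.0 + snr1) / (1.0 + beta * snr1))
# equals i_sc(snr2) above: the superposition codebook attains the maximum
print(r_max)
```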

Proof Sketch
The optimal Gaussian superposition codebook complies with the above MMSE constraint and attains the maximum rate; what we need is a tight upper bound on the rate. We prove an equivalent claim: assume a code of rate $R_c = \frac{1}{2}\log(1+\alpha\,snr_2)$, designed for reliable transmission at $snr_2$, and lower bound $MMSE_c(\gamma)$.
1. $\alpha\,snr_2 \le \gamma$: the lower bound is trivially zero, attained by the optimal Gaussian codebook designed for $\alpha\,snr_2$; this is equivalent to setting $\beta = 0$.
2. $\gamma \le \alpha\,snr_2$: $\quad I(snr_2) - I(\gamma) \ge I(snr_2) - \frac{1}{2}\log(1+\gamma) = R_c - \frac{1}{2}\log(1+\gamma)$.

Proof Sketch - Cont.
Using the I-MMSE relationship:
$\frac{1}{2}\int_{\gamma}^{snr_2} MMSE_c(\tau)\,d\tau \ge \frac{1}{2}\log(1+\alpha\,snr_2) - \frac{1}{2}\log(1+\gamma)$
Defining $d$ through the equality
$\frac{1}{2}\log(1+\alpha\,snr_2) - \frac{1}{2}\log(1+\gamma) = \frac{1}{2}\log(1+d\,snr_2) - \frac{1}{2}\log(1+d\gamma)$
we have
$\frac{1}{2}\int_{\gamma}^{snr_2} MMSE_c(\tau)\,d\tau \ge \frac{1}{2}\log(1+d\,snr_2) - \frac{1}{2}\log(1+d\gamma) = \frac{1}{2}\int_{\gamma}^{snr_2} mmse_G(\tau)\,d\tau$
where $X_G \sim \mathcal{N}(0,d)$, i.i.d.

Proof Sketch - Cont.
Using the single crossing point property and the above inequality we can conclude: the single crossing point of $mmse_G(\tau)$ and $MMSE_c(\tau)$, if it exists, occurs somewhere in the region $(\gamma, \infty)$. Thus we have the following lower bound:
$MMSE_c(\gamma) \ge \frac{d(\gamma)}{1+d(\gamma)\gamma} = \frac{\alpha\,snr_2 - \gamma}{(snr_2 - \gamma)(1+\gamma)}$
Specifically, for $\gamma = snr_1$ we obtain
$MMSE_c(snr_1) \ge \frac{\alpha\,snr_2 - snr_1}{(snr_2 - snr_1)(1+snr_1)}.$
Deriving $\alpha$ as a function of the constraining $\beta$, and substituting it into $R_c = \frac{1}{2}\log(1+\alpha\,snr_2)$, results in the superposition rate.
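
A small algebra check (the values of $snr_2$, $\alpha$, $\gamma$ are illustrative) of the $d(\gamma)$ step and of the closed-form lower bound:

```python
import numpy as np

# Solve (1 + α snr2)/(1 + γ) = (1 + d snr2)/(1 + d γ) for d and confirm
# d/(1 + dγ) = (α snr2 - γ)/((snr2 - γ)(1 + γ)).
snr2, alpha, g = 2.5, 0.6, 1.0

d = (alpha * snr2 - g) / ((snr2 - g) + snr2 * g * (1.0 - alpha))
print((1 + alpha * snr2) / (1 + g), (1 + d * snr2) / (1 + d * g))  # equal
print(d / (1 + d * g),
      (alpha * snr2 - g) / ((snr2 - g) * (1 + g)))                  # equal
```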

The Effect at Other snrs
Theorem (Bustin and Shamai, IT 13)
From the set of reliable codes of rate
$R_c = \frac{1}{2}\log(1+\beta\,snr_2) + \frac{1}{2}\log\frac{1+snr_1}{1+\beta\,snr_1}$
complying with the MMSE constraint at $snr_1$,
$MMSE_c(snr_1) \le \frac{\beta}{1+\beta\,snr_1},$
the superposition codebook provides the minimum MMSE at all snrs.

Extension to K MMSE Constraints
Theorem (Bustin and Shamai, IT 13)
Assuming $snr_0 < snr_1 < \dots < snr_K$, the solution of
$\max I(snr_K) \quad \text{s.t.} \quad MMSE_c(snr_i) \le \frac{\beta_i}{1+\beta_i\,snr_i}, \quad i \in \{0,1,\dots,K-1\}$
for some positive $\beta_i$, $i \in \{0,1,\dots,K-1\}$, such that $\sum_{i=0}^{K-1}\beta_i \le 1$ and $\beta_{K-1} < \beta_{K-2} < \dots < \beta_1 < \beta_0$, is
$I(snr_K) = \frac{1}{2}\log\left(\frac{1+snr_0}{1+\beta_0\,snr_0}\prod_{j=1}^{K-1}\frac{1+\beta_{j-1}\,snr_j}{1+\beta_j\,snr_j}\right) + \frac{1}{2}\log(1+\beta_{K-1}\,snr_K)$
and is attainable when using the optimal K-layer Gaussian superposition codebook designed for $(snr_0, snr_1, \dots, snr_K)$ with rate-splitting coefficients $(\beta_0, \dots, \beta_{K-1})$; see the sketch below.
Additional constraints of the form $MMSE_c(snr_l) \le \frac{\beta_l}{1+\beta_l\,snr_l}$ for $snr_{i-1} \le snr_l \le snr_i$, when $\beta_l \ge \beta_{i-1}$, do not affect the above result.
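
A sketch evaluating the K-constraint rate formula with the parameters of the example slide that follows:

```python
import numpy as np

snr = [0.8, 1.8, 2.5]   # snr_0 < snr_1 < snr_2 (= snr_K, with K = 2)
beta = [0.6, 0.4]       # beta_0 > beta_1, beta_0 + beta_1 <= 1

K = len(beta)
prod = (1 + snr[0]) / (1 + beta[0] * snr[0])
for j in range(1, K):
    prod *= (1 + beta[j - 1] * snr[j]) / (1 + beta[j] * snr[j])
rate = 0.5 * np.log(prod) + 0.5 * np.log1p(beta[K - 1] * snr[K])
print(rate)  # maximal I(snr_K) under the K MMSE constraints
```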

K-layer Superposition Code, K = 3
[Figure: $MMSE_c(\gamma)$ and $I_c(X;Y)$ of a 3-layer superposition code with $snr_0 = 0.8$, $snr_1 = 1.8$, $snr_2 = 2.5$, $\beta_0 = 0.6$, $\beta_1 = 0.4$, compared with $I_{opt}(X;Y)$ and $MMSE_{opt}$.]

Coding Interpretation
Using MMSE as a disturbance measure, and the resultant optimality of superposition coding, provides intuitive engineering support for Han-Kobayashi coding strategies over the Gaussian interference channel.
While Han-Kobayashi is known to be good over the two-user Gaussian interference channel [Etkin-Tse-Wang, IT 08] (capacity to within 1 bit), and cannot be improved upon by ML decoders (with random coding) [Bandemer-El Gamal-Kim, arxiv 12], this is not the case for the many-user Gaussian interference channel. For example: Gaussian superposition coding yields DoF = 1, regardless of the number of users, for a fully connected Gaussian interference channel.
Can an I-MMSE perspective be useful for many-user Gaussian interference channels?

Basic DoF Concepts: Information/MMSE Dimension
Information dimension [Rényi, Acta Math. Hung. 59], [Wu-Verdú, IT 10]. For a real-valued random variable $X$:
$d(X) = \lim_{m\to\infty} \frac{H(\lfloor X \rfloor_m)}{\log m}, \qquad \lfloor X \rfloor_m = \frac{\lfloor mX \rfloor}{m}$
- $d(X) < \infty \iff E\log(1+|X|) < \infty$
- Vector generalizations exist.
MMSE dimension [Wu-Verdú, IT 11]. For $Y = \sqrt{snr}\,X + N$:
$D(X) = \lim_{snr\to\infty} snr \cdot mmse(X:snr), \qquad mmse = \frac{D(X)}{snr} + o\!\left(\frac{1}{snr}\right)$
- $d(X) = D(X)$ for $X$ discrete, continuous, or a discrete-continuous mixture.
- $\lim_{snr\to\infty} \frac{I(X,snr)}{\frac{1}{2}\log snr} = d(X)$ when $d(X) < \infty$, where $I(X,snr) \triangleq I(X;Y)$.
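
A Monte Carlo sketch (the mixture, p = 0.3, and the sample/grid sizes are illustrative assumptions) estimating $d(X)$ by quantization for a discrete-continuous mixture; Rényi's result gives $d(X) = p$, the weight of the continuous part:

```python
import numpy as np

# X = U with prob. p, X = 0 with prob. 1-p, where U ~ Uniform(0,1).
rng = np.random.default_rng(1)
n, p = 10**6, 0.3
x = np.where(rng.random(n) < p, rng.random(n), 0.0)

def quantized_entropy(m):
    # empirical entropy (nats) of the quantization [X]_m = floor(mX)/m
    _, counts = np.unique(np.floor(m * x), return_counts=True)
    pk = counts / n
    return -np.sum(pk * np.log(pk))

h1, h2 = quantized_entropy(2**8), quantized_entropy(2**12)
print(h2 / np.log(2**12))   # slow O(1/log m) convergence toward d(X) = 0.3
print((h2 - h1) / (np.log(2**12) - np.log(2**8)))  # slope estimate ≈ 0.3
```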

Interference Channel
$Y_i = \sum_{j=1}^{K} \sqrt{snr}\,h_{ij}\,X_j + N_i, \qquad H = \{h_{ij}\}, \quad (i,j) = 1,\dots,K$
$\{X_i\}$ are the independent users' signals, $E(X_i^2) \le 1$, $N_i \sim \mathcal{N}(0,1)$.
$C(H,snr)$ - capacity region; $C_{\Sigma}(H,snr) = \max\left\{\sum_{i=1}^{K} R_i : R^K \in C(H,snr)\right\}$ - sum-rate capacity.
$DoF(H) = \lim_{snr\to\infty} \frac{C_{\Sigma}(H,snr)}{\frac{1}{2}\log snr}$ - sum DoF; $D(H)$ - DoF region.
Central results: [Host-Madsen-Nosratinia, ISIT 05], [Cadambe-Jafar, IT 08], [Etkin-Ordentlich, IT 09], [Motahari-Gharan-Maddah Ali-Khandani, IT 11]; tutorial: [Jafar, FnT 11].
Unlike previously believed, for almost all H, DoF(H) is K/2 and not 1! Central tool: number-theoretic Diophantine approximation theory.

An Information/MMSE Dimension Approach
[Wu-Shamai-Verdú, ISIT 11]: non-single-letter capacity region [Ahlswede, ISIT 71]:
$C(H,snr) = \lim_{n\to\infty} \frac{1}{n} \bigcup_{P(x_1^n,\dots,x_K^n)} \left\{R_i \le I(X_i^n; Y_i^n),\; i = 1,2,\dots,K\right\}$
$C_{\Sigma}(H,snr) = \lim_{n\to\infty} \sup_{P(x_1^n,\dots,x_K^n)} \frac{1}{n} \sum_{i=1}^{K} I(X_i^n; Y_i^n)$
$I(X_i^n; Y_i^n) = I(X_1^n, X_2^n, \dots, X_K^n; Y_i^n) - I(X_1^n, \dots, X_K^n; Y_i^n \mid X_i^n)$ - single-letterized expressions:
$dof(X^{(K)}:H) \triangleq \sum_{i=1}^{K}\left[ d\!\left(\sum_{j=1}^{K} h_{ij} X_j\right) - d\!\left(\sum_{j\ne i} h_{ij} X_j\right)\right], \qquad DoF(H) = \sup_{P(X^{(K)})} dof(X^{(K)}:H)$
(over independent $X_1, X_2, \dots, X_K$).

Information/MMSE Dimension - Results
- $DoF(H)$ is invariant under row or column scaling.
- Removing cross-links does not decrease DoF.
- Suboptimality of discrete-continuous mixtures: to get DoF > 1 it is necessary to employ singular components.
- MMSE dimension oscillates periodically in snr (dB) around the information dimension.
- If the off-diagonal entries of $H$ are rational and the diagonal entries irrational, $DoF(H) = K/2$ (no need for irrational algebraic numbers [Etkin-Ordentlich, IT 09]).
- Example: $H = \begin{pmatrix} 1 & 0 & 0 \\ 1+\sqrt{2} & 1 & 0 \\ 1 & 1 & 1 \end{pmatrix}$, $DoF(H) \ge 1 + \frac{\log_2 5}{6} > \frac{2+\log_2 3}{6}$ [Etkin-Ordentlich, IT 09].
- DoF region: $D(H) \supseteq \mathrm{co}\{e_1, \dots, e_K, \frac{1}{2}\mathbf{1}\}$ for fully connected $H$.

Outlook: Finite n MMSE Constraint
Assume $0 < snr_1 < snr_2$.
$\max I_n(snr_2) \quad \text{s.t.} \quad MMSE_{cn}(snr_1) \le \frac{\beta}{1+\beta\,snr_1}$
where the maximization is over $P(X)$ complying with the power constraint, and $X$ is a length-$n$ random vector.
Conjecture for $n = 1$: for $\beta < 1$ the optimizing measure $P(x)$ is discrete.

Mutual Information Disturbance: Single Constraint
Bandemer and El Gamal (2011) measure the disturbance at the unintended receiver using the mutual information at Z: assuming this mutual information is at most $R_d$, what is the maximum possible rate to the intended receiver Y?
Theorem (Bandemer and El Gamal, ISIT 11)
Assuming $snr_1 < snr_2$, the solution of the optimization problem
$\max I_n(snr_2) \quad \text{s.t.} \quad I_n(snr_1) \le \frac{1}{2}\log(1+\alpha\,snr_1)$
for some $\alpha \in [0,1]$, is
$I_n(snr_2) = \frac{1}{2}\log(1+\alpha\,snr_2).$
Equality is attained, for any $n$, by choosing $X$ Gaussian with i.i.d. components of variance $\alpha$. For $n \to \infty$, equality is also attained by a Gaussian codebook designed for $snr_2$ with power limited to $\alpha$.

Simple I-MMSE Proof
Since $0 \le I_n(snr_1) \le \frac{1}{2}\log(1+snr_1)$, there exists an $\alpha \in [0,1]$ such that $I_n(snr_1) = \frac{1}{2}\log(1+\alpha\,snr_1)$.
$\Rightarrow$ $MMSE_{cn}(\gamma)$ and the $mmse_G(\gamma)$ of $X_G \sim \mathcal{N}(0,\alpha)$ cross in $[0, snr_1]$.
Using the I-MMSE relation,
$I_n(snr_2) = \frac{1}{2}\log(1+\alpha\,snr_1) + \frac{1}{2}\int_{snr_1}^{snr_2} MMSE_{cn}(\gamma)\,d\gamma \le \frac{1}{2}\log(1+\alpha\,snr_2)$
due to the single crossing point property, which ensures $MMSE_{cn}(\gamma) \le mmse_G(\gamma)$ for $\gamma \in [snr_1, \infty)$.

K Mutual Information Constraints
Theorem
Assuming $snr_1 < snr_2 < \dots < snr_K$, the solution of
$\max I_n(snr_K) \quad \text{s.t.} \quad I_n(snr_i) \le \frac{1}{2}\log(1+\alpha_i\,snr_i), \quad i \in \{1,\dots,K-1\}$
for some $\alpha_i \in [0,1]$, is
$I_n(snr_K) = \frac{1}{2}\log(1+\alpha_l\,snr_K)$
where $\alpha_l$, $l \in \{1,\dots,K-1\}$, is defined such that for all $i \in \{1,\dots,K-1\}$
$\frac{1}{2}\log(1+\alpha_l\,snr_i) \le \frac{1}{2}\log(1+\alpha_i\,snr_i).$
The maximum rate is attained, for any $n$, by choosing $X$ Gaussian with i.i.d. components of variance $\alpha_l$. For $n \to \infty$, equality is also attained by a Gaussian codebook designed for $snr_K$ with power limited to $\alpha_l$.

Gaussian MIMO Channels
Measuring disturbance by MMSE provides some engineering insight into the relative efficiency of rate splitting (Han-Kobayashi) for simple scalar Gaussian interference channels. This is not the case when disturbance is measured by mutual information. Bandemer and El Gamal have extended their solution to the Gaussian MIMO channel under a mutual information constraint (arxiv:1103.0996v2), and this solution implies rate splitting.
Challenge: extend the equivalent MMSE constraint to the Gaussian MIMO setting. Can I-MMSE considerations reflect the best achievable performance in all snr regimes?
DoF via information (MMSE) dimensions for the vector channel [Stotz-Bölcskei, 2012].
Challenge: optimal rate-MMSE disturbance constraints, tradeoffs for single codes (not allowing rate splitting), and practical implications (belief-propagation soft decoding for LDPC codes).

Thank You!

Abstract
Shlomo Shamai, Dept. EE, Technion. Stanford, Palo Alto, CA, USA, 7 February 2013.
Coding over Interference Channels: An Information-Estimation View
The information-estimation relation is used to gain insight into useful coding schemes operating over the Gaussian interference channel. After reviewing basic I-MMSE relations and their implications for point-to-point coding over the Gaussian channel, we focus on the Gaussian interference channel. Here the inflicted interference is measured by the associated minimum mean square error (MMSE). The structure of codes achieving reliable communication at some specific signal-to-noise ratio (SNR), constrained by the permitted MMSE at lower SNR values modeling the interference, is discussed. It is shown that layered superposition codes attain optimal performance, thus providing some engineering insight into the relative efficiency of the Han-Kobayashi coding strategy. The degrees-of-freedom (DoF) behavior of the multi-user Gaussian interference channel is captured by the MMSE-dimension concept, providing a general expression for the DoF. A short outlook concludes the presentation, addressing related research challenges as well as recent results where interference is measured by the corresponding mutual information.
Joint work with Ronit Bustin, Technion.