On Source-Channel Communication in Networks

Michael Gastpar, Department of EECS, University of California, Berkeley (gastpar@eecs.berkeley.edu). DIMACS, March 17, 2003.

Outline

1. Source-channel communication seen from the perspective of the separation theorem.
2. Source-channel communication seen from the perspective of measure-matching.

Acknowledgments: Gerhard Kramer, Bixio Rimoldi, Emre Telatar, Martin Vetterli.

Source-Channel Communication

Consider the transmission of a discrete-time memoryless source across a discrete-time memoryless channel:

Source → $S$ → $F$ → $X$ → Channel → $Y$ → $G$ → $\hat{S}$ → Destination, where $F(S^n) = X^n$ and $G(Y^n) = \hat{S}^n$.

The fundamental trade-off is cost versus distortion:

$$\Delta = E\, d(S^n, \hat{S}^n), \qquad \Gamma = E\, \rho(X^n).$$

What is the set of achievable trade-offs $(\Gamma, \Delta)$? Of optimal trade-offs $(\Gamma, \Delta)$?

The Separation Theorem

For a fixed source $(p_S, d)$ and a fixed channel $(p_{Y|X}, \rho)$, with $F(S^n) = X^n$ and $G(Y^n) = \hat{S}^n$:

A cost-distortion pair $(\Gamma, \Delta)$ is achievable if and only if $R(\Delta) \le C(\Gamma)$. A cost-distortion pair $(\Gamma, \Delta)$ is optimal if and only if $R(\Delta) = C(\Gamma)$, subject to certain technicalities.

Rate-matching: in an optimal communication system, the minimum source rate is matched (i.e., equal) to the maximum channel rate.
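
As a concrete check of the rate-matching condition, here is a minimal numerical sketch (not from the talk; parameters are illustrative) of the classic real-valued example: an i.i.d. Gaussian source sent over an AWGN channel at one channel use per source symbol. Uncoded transmission with MMSE decoding attains exactly the distortion $D$ solving $R(D) = C$.

```python
import numpy as np

# Hedged sketch: i.i.d. Gaussian source over a real AWGN channel,
# one channel use per source symbol; parameter values are assumptions.
rng = np.random.default_rng(0)
n = 1_000_000
var_s, power, var_z = 1.0, 4.0, 1.0      # sigma_S^2, Gamma = P, sigma_Z^2

s = rng.normal(0.0, np.sqrt(var_s), n)
x = np.sqrt(power / var_s) * s           # uncoded: scale to the power constraint
y = x + rng.normal(0.0, np.sqrt(var_z), n)
s_hat = np.sqrt(power * var_s) / (power + var_z) * y   # MMSE estimator E[S|Y]

d_uncoded = np.mean((s - s_hat) ** 2)
capacity = 0.5 * np.log2(1.0 + power / var_z)          # bits per channel use
d_ratematched = var_s * 2.0 ** (-2.0 * capacity)       # D solving R(D) = C

print(f"uncoded MSE       : {d_uncoded:.4f}")       # ~0.2
print(f"R(D)=C distortion : {d_ratematched:.4f}")   # exactly 0.2
```

Both numbers agree at $\sigma_S^2 \sigma_Z^2 / (P + \sigma_Z^2)$, which is the point-to-point instance of the matching phenomenon the talk develops.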

Source-Channel Communication in Networks

Simple source-channel network: Source 1 emits $S_1$, encoded by $F_1$ into $X_1$; Source 2 emits $S_2$, encoded by $F_2$ into $X_2$. The channel delivers $Y_1$ to Destination 1, whose decoder $G_1$ produces $\hat{S}_{11}$, and $Y_2$ to Destination 2, whose decoder $G_2$ produces $\hat{S}_{12}, \hat{S}_{22}$.

Trade-off between cost $(\Gamma_1, \Gamma_2)$ and distortion $(\Delta_{11}, \Delta_{12}, \Delta_{22})$. Which cost-distortion tuples are achievable? Which are optimal? For the sketched topology, the (full) answer is unknown.

These Trade-offs Are Achievable

For a fixed network topology and fixed probability distributions and cost/distortion functions: if a cost-distortion tuple satisfies $R(\Delta_1, \Delta_2, \ldots) \cap C(\Gamma_1, \Gamma_2, \ldots) \neq \emptyset$, then it is achievable. (Figure: the rate region $R$ intersecting the capacity region $C$ in the $(R_1, R_2)$-plane.)

When is it optimal?

Example: Multi-access source-channel communication

Source 1 emits $S_1$, encoded by $F_1$ into $X_1 \in \{0, 1\}$; Source 2 emits $S_2$, encoded by $F_2$ into $X_2 \in \{0, 1\}$. The channel output is $Y = X_1 + X_2$, and the decoder $G$ produces $\hat{S}_1, \hat{S}_2$ at the destination.

The capacity region of this channel is contained inside $R_1 + R_2 \le 1.5$.

Goal: reconstruct $S_1$ and $S_2$ perfectly. $S_1$ and $S_2$ are correlated, with joint distribution

$$P(S_1{=}0, S_2{=}0) = 1/3, \quad P(S_1{=}1, S_2{=}0) = 1/3, \quad P(S_1{=}0, S_2{=}1) = 0, \quad P(S_1{=}1, S_2{=}1) = 1/3,$$

so $R_1 + R_2 \ge H(S_1, S_2) = \log_2 3 \approx 1.585$. The regions $R$ and $C$ do not intersect. Yet uncoded transmission works. This example appears in T. M. Cover, A. El Gamal, M. Salehi, Multiple access channels with arbitrarily correlated sources, IEEE Trans. IT-26, 1980.
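
A quick simulation (my sketch; variable names are mine) of the uncoded scheme $X_1 = S_1$, $X_2 = S_2$: since the pair $(S_1, S_2) = (0, 1)$ has probability zero, the output $Y = 1$ can only arise from $(1, 0)$, so the decoder recovers both sources perfectly even though $H(S_1, S_2)$ exceeds the sum capacity.

```python
import numpy as np

# Sketch of the Cover/El Gamal/Salehi example: uncoded transmission of
# correlated binary sources over the adder MAC Y = X1 + X2.
rng = np.random.default_rng(1)
n = 100_000

# Joint pmf on (S1, S2): (0,0), (1,0), (1,1) each w.p. 1/3; (0,1) never occurs.
pairs = np.array([[0, 0], [1, 0], [1, 1]])
s = pairs[rng.integers(0, 3, n)]           # n correlated source pairs
s1, s2 = s[:, 0], s[:, 1]

y = s1 + s2                                # uncoded: X1 = S1, X2 = S2
# Decode: Y=0 -> (0,0); Y=2 -> (1,1); Y=1 -> (1,0), the only pair with P > 0.
s1_hat = (y >= 1).astype(int)
s2_hat = (y == 2).astype(int)

print("decoding errors:", np.sum((s1 != s1_hat) | (s2 != s2_hat)))  # 0
```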

So what is capacity?

The capacity region is computed assuming independent messages. In a source-channel context, the underlying sources may be dependent. MAC example: allowing arbitrary dependence of the channel inputs, the sum rate can be as large as $\log_2 3 \approx 1.585$, fixing the example: $R \cap C \neq \emptyset$.

Can we simply redefine capacity appropriately? Remark: multi-access with dependent messages is still an open problem. T. M. Cover, A. El Gamal, M. Salehi, Multiple access channels with arbitrarily correlated sources, IEEE Trans. IT-26, 1980.

Separation Strategies for Networks

In order to retain a notion of capacity, discrete messages are transmitted reliably: each source passes through a source encoder followed by a channel encoder producing $X_k$, and each destination applies the corresponding channel decoder followed by a source decoder to form its reconstructions $\hat{S}_{11}$ (Destination 1) and $\hat{S}_{12}, \hat{S}_{22}$ (Destination 2).

What is the best achievable performance for such a system? The general answer is unknown.

Example: Broadcast

A single source $S$ is encoded by $F$ into $X$; receiver $k$ observes $Y_k = X + Z_k$ and its decoder $G_k$ forms $\hat{S}_k$, for $k = 1, 2$. Here $S, Z_1, Z_2$ are i.i.d. Gaussian. Goal: minimize the mean-squared errors $\Delta_1$ and $\Delta_2$. Denote by $\Delta_1^*$ and $\Delta_2^*$ the single-user minima.

$\Delta_1^*$ and $\Delta_2^*$ cannot be achieved simultaneously by sending messages reliably: the messages disturb one another. But uncoded transmission achieves $\Delta_1^*$ and $\Delta_2^*$ simultaneously. This cannot be fixed by altering the definitions of capacity and/or rate-distortion regions.
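
A numerical illustration (sketch; the two noise variances are assumed values): with uncoded transmission $X = \sqrt{P/\sigma_S^2}\, S$, each receiver applies its own MMSE estimator and attains its single-user minimum $\Delta_k^* = \sigma_S^2 \sigma_{Z_k}^2 / (P + \sigma_{Z_k}^2)$, both simultaneously, since the same transmitted signal is point-to-point optimal for each link.

```python
import numpy as np

# Sketch: single Gaussian source broadcast uncoded to two receivers;
# receiver noise variances below are assumptions for illustration.
rng = np.random.default_rng(2)
n = 1_000_000
var_s, power = 1.0, 1.0
var_z = {1: 0.5, 2: 2.0}                       # per-receiver noise variances

s = rng.normal(0.0, np.sqrt(var_s), n)
x = np.sqrt(power / var_s) * s                 # one uncoded transmission for both

for k in (1, 2):
    y_k = x + rng.normal(0.0, np.sqrt(var_z[k]), n)
    s_hat = np.sqrt(power * var_s) / (power + var_z[k]) * y_k  # MMSE at receiver k
    mse = np.mean((s - s_hat) ** 2)
    single_user_min = var_s * var_z[k] / (power + var_z[k])    # Delta_k^*
    print(f"receiver {k}: MSE {mse:.4f}  vs  single-user minimum {single_user_min:.4f}")
```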

Alternative approach

Source → $S$ → $F$ → $X$ → Channel → $Y$ → $G$ → $\hat{S}$ → Destination, with $F(S^n) = X^n$, $G(Y^n) = \hat{S}^n$, and cost constraint $E\rho(X^n) \le \Gamma$.

A code $(F, G)$ performs optimally if and only if it satisfies $R(\Delta) = C(\Gamma)$ (subject to certain technical conditions). Equivalently, a code $(F, G)$ performs optimally if and only if

$$\rho(x^n) = c_1 D(p_{Y^n|x^n} \| p_{Y^n}) + \rho_0,$$
$$d(s^n, \hat{s}^n) = -c_2 \log_2 p(s^n | \hat{s}^n) + d_0(s^n),$$
$$I(S^n; \hat{S}^n) = I(X^n; Y^n).$$

We call these the measure-matching conditions.
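
For intuition, here is a worked special case (my addition, assuming the real-valued Gaussian setting): when the backward channel $p(s \mid \hat{s})$ is Gaussian with mean $\hat{s}$ and a variance $\sigma^2$ that does not depend on $\hat{s}$, squared-error distortion has exactly the required affine form in $-\log_2 p(s \mid \hat{s})$.

```latex
% Squared error as a matched distortion measure, assuming
% p(s | \hat{s}) = N(\hat{s}, \sigma^2) with \sigma^2 independent of \hat{s}:
%   -\ln p(s | \hat{s}) = (s - \hat{s})^2 / (2\sigma^2) + (1/2)\ln(2\pi\sigma^2),
% hence
\begin{align*}
  (s - \hat{s})^2
    &= 2\sigma^2 \ln 2 \cdot \bigl(-\log_2 p(s \mid \hat{s})\bigr)
       - \sigma^2 \ln(2\pi\sigma^2) \\
    &= c_2 \bigl(-\log_2 p(s \mid \hat{s})\bigr) + d_0,
\end{align*}
% with c_2 = 2\sigma^2 \ln 2 and the constant d_0 = -\sigma^2 \ln(2\pi\sigma^2).
```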

Single-source Broadcast

Source $S$ is encoded by $F$ into $X$; receiver $k$ observes $Y_k$ and its decoder $G_k$ forms $\hat{S}_k$, $k = 1, 2$.

Measure-matching conditions for single-source broadcast: if the communication system satisfies

$$\rho(x) = c_1^{(1)} D(p_{Y_1|x} \| p_{Y_1}) + \rho_0^{(1)} = c_1^{(2)} D(p_{Y_2|x} \| p_{Y_2}) + \rho_0^{(2)},$$
$$d_1(s, \hat{s}_1) = -c_2^{(1)} \log_2 p(s | \hat{s}_1) + d_0^{(1)}(s),$$
$$d_2(s, \hat{s}_2) = -c_2^{(2)} \log_2 p(s | \hat{s}_2) + d_0^{(2)}(s),$$
$$I(X; Y_1) = I(S; \hat{S}_1), \quad \text{and} \quad I(X; Y_2) = I(S; \hat{S}_2),$$

then it performs optimally.

Sensor Network

$M$ wireless sensors measure physical phenomena characterized by $S$: sensor $k$ observes $U_k$ and transmits $X_k = F_k(U_k)$ over the channel; the destination observes $Y$ and forms $\hat{S} = G(Y)$.

Gaussian Sensor Network

The observations $U_1, U_2, \ldots, U_M$ are noisy versions of $S$: $U_k = S + W_k$. The channel output is $Y = \sum_{k=1}^{M} X_k + Z$, subject to the total power constraint $\sum_{k=1}^{M} E|X_k|^2 \le MP$.

Gaussian Sensor Network: Bits

Consider the following communication strategy: each sensor first compresses its observation $U_k$ into bits and then channel-codes the bits into $X_k$; the destination decodes the bits and reconstructs $\hat{S}$. The total power constraint $\sum_{k=1}^{M} E|X_k|^2 \le MP$ still applies.

Gaussian Sensor Network: Bits (1/2)

Source coding part: this is the CEO problem. See Berger, Zhang, Viswanathan (1996); Viswanathan and Berger (1997); Oohama (1998).

Here $S \sim \mathcal{N}_c(0, \sigma_S^2)$ and, for $k = 1, \ldots, M$, $W_k \sim \mathcal{N}_c(0, \sigma_W^2)$. For large total rate $R_{\mathrm{tot}}$, the behavior is

$$D_{\mathrm{CEO}}(R_{\mathrm{tot}}) \sim \frac{\sigma_W^2}{R_{\mathrm{tot}}}.$$

Gaussian Sensor Network: Bits (2/2)

Channel coding part: additive white Gaussian multi-access channel, so

$$R_{\mathrm{sum}} \le \log_2\left(1 + \frac{MP}{\sigma_Z^2}\right).$$

However, the codewords may be dependent. Therefore, the sum rate may be up to

$$R_{\mathrm{sum}} \le \log_2\left(1 + \frac{M^2 P}{\sigma_Z^2}\right).$$

Hence, the distortion for a system that satisfies the rate-matching condition is at least

$$D_{\mathrm{rm}}(M) \gtrsim \frac{\sigma_W^2}{\log_2\left(1 + M^2 P / \sigma_Z^2\right)}.$$

Is this optimal?

Gaussian Sensor Network: Uncoded transmission

Consider instead the following coding strategy: sensor $k$ simply scales its observation, $X_k = \alpha_k U_k$, subject to $\sum_{k=1}^{M} E|X_k|^2 \le MP$.

Gaussian Sensor Network: Uncoded transmission

Strategy: the sensors transmit whatever they measure, scaled to their power constraint, without any coding at all:

$$Y[n] = \sqrt{\frac{P}{\sigma_S^2 + \sigma_W^2}} \left( M S[n] + \sum_{k=1}^{M} W_k[n] \right) + Z[n].$$

If the decoder forms the minimum mean-squared error estimate of $S$ based on $Y$, the following distortion is incurred:

Proposition 1. Uncoded transmission achieves

$$D_1(M, P) = \frac{\sigma_S^2 \sigma_W^2}{\dfrac{M^2}{M + (\sigma_Z^2/\sigma_W^2)(\sigma_S^2 + \sigma_W^2)/P}\, \sigma_S^2 + \sigma_W^2}.$$

This is better than separation: $D_1$ decays like $1/M$, whereas $D_{\mathrm{rm}} \sim 1/\log M$. In this sense, uncoded transmission beats capacity. Is it optimal?
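
A Monte Carlo sanity check of Proposition 1 (my sketch; all parameter values are assumptions):

```python
import numpy as np

# Sketch: M sensors transmit scaled noisy observations of S over a
# Gaussian MAC; the decoder is the scalar MMSE estimator of S from Y.
rng = np.random.default_rng(3)
n, M = 500_000, 20
var_s, var_w, var_z, power = 1.0, 1.0, 1.0, 1.0   # assumed parameters

s = rng.normal(0.0, np.sqrt(var_s), n)
w = rng.normal(0.0, np.sqrt(var_w), (M, n))
alpha = np.sqrt(power / (var_s + var_w))           # per-sensor scaling to power P
y = alpha * (M * s + w.sum(axis=0)) + rng.normal(0.0, np.sqrt(var_z), n)

# MMSE estimate of S from Y (jointly Gaussian, zero mean)
cov_sy = alpha * M * var_s
var_y = alpha**2 * (M**2 * var_s + M * var_w) + var_z
s_hat = (cov_sy / var_y) * y
mse = np.mean((s - s_hat) ** 2)

# Proposition 1 formula
c = (var_z / var_w) * (var_s + var_w) / power
d1 = var_s * var_w / (M**2 / (M + c) * var_s + var_w)
print(f"empirical {mse:.5f}  vs  formula {d1:.5f}")
```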

Gaussian Sensor Network: An outer bound

Suppose the decoder has direct access to $U_1, U_2, \ldots, U_M$. The smallest distortion for our sensor network cannot be smaller than the smallest distortion for this idealization:

$$D_{\min,\mathrm{ideal}} = \frac{\sigma_S^2 \sigma_W^2}{M \sigma_S^2 + \sigma_W^2}.$$
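
A quick check of the idealized bound (sketch, same assumed parameters as above): with direct access to $U_1, \ldots, U_M$ and equal noise variances, the average $\bar{U} = S + \bar{W}$ is a sufficient statistic, and the MMSE equals $\sigma_S^2 \sigma_W^2 / (M\sigma_S^2 + \sigma_W^2)$.

```python
import numpy as np

# Sketch: MMSE of S given direct access to U_k = S + W_k, k = 1..M.
rng = np.random.default_rng(4)
n, M = 500_000, 20
var_s, var_w = 1.0, 1.0                              # assumed parameters

s = rng.normal(0.0, np.sqrt(var_s), n)
u_bar = s + rng.normal(0.0, np.sqrt(var_w / M), n)   # average of the U_k
s_hat = (var_s / (var_s + var_w / M)) * u_bar        # MMSE from the average
mse = np.mean((s - s_hat) ** 2)

d_ideal = var_s * var_w / (M * var_s + var_w)
print(f"empirical {mse:.5f}  vs  formula {d_ideal:.5f}")
```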

Gaussian Sensor Network: Asymptotic optimum

Rate-matching:

$$D_{\mathrm{rm}}(M, P) \gtrsim \frac{\sigma_W^2}{\log_2\left(1 + M^2 P / \sigma_Z^2\right)}.$$

Uncoded transmission:

$$D_1(M, P) = \frac{\sigma_S^2 \sigma_W^2}{\dfrac{M^2}{M + (\sigma_Z^2/\sigma_W^2)(\sigma_S^2 + \sigma_W^2)/P}\, \sigma_S^2 + \sigma_W^2}.$$

Proposition 2. As the number of sensors becomes large, the optimum trade-off is

$$D(M, P) \approx \frac{\sigma_S^2 \sigma_W^2}{M \sigma_S^2 + \sigma_W^2}.$$
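
The scaling laws can be tabulated directly from the three formulas above (sketch; parameters assumed): uncoded transmission tracks the outer bound to within a constant factor, both decaying like $1/M$, while the rate-matched separation scheme decays only like $1/\log M$.

```python
import numpy as np

# Sketch: compare separation (rate-matching), uncoded transmission, and
# the ideal outer bound as the number of sensors M grows.
var_s, var_w, var_z, power = 1.0, 1.0, 1.0, 1.0   # assumed parameters

for M in (10, 100, 1000, 10_000):
    d_rm = var_w / np.log2(1.0 + M**2 * power / var_z)        # separation bound
    c = (var_z / var_w) * (var_s + var_w) / power
    d_1 = var_s * var_w / (M**2 / (M + c) * var_s + var_w)    # uncoded (Prop. 1)
    d_ideal = var_s * var_w / (M * var_s + var_w)             # outer bound
    print(f"M={M:6d}  D_rm={d_rm:.2e}  D_uncoded={d_1:.2e}  D_ideal={d_ideal:.2e}")
```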

Gaussian Sensor Network: Conclusions

Two conclusions from the Gaussian sensor network example:

1. Uncoded transmission is asymptotically optimal. This leads to a general measure-matching condition.
2. Even for finite $M$, uncoded transmission considerably outperforms the best separation-based coding strategies. This suggests an alternative coding paradigm for source-channel networks.

Sensor Network: Measure-matching

Theorem. If the coding system $F_1, F_2, \ldots, F_M, G$ satisfies the cost constraint $E\rho(X_1, X_2, \ldots, X_M) \le \Gamma$, and

$$d(s, \hat{s}) = -\log_2 p(s | \hat{s}),$$
$$I(S; U_1 U_2 \cdots U_M) = I(S; \hat{S}),$$

then it performs optimally.

Proof: Cut-sets

Outer bound on the capacity region of a network: if the rates $(R_1, R_2, \ldots, R_M)$ are achievable, they must satisfy, for every cut $S$,

$$\sum_{S \to S^c} R_k \le \max_{p(x_1, x_2, \ldots, x_M)} I(X_S; Y_{S^c} \mid X_{S^c}).$$

Hence, if a scheme satisfies, for some cut $S$, the above with equality, then it is optimal (with respect to $S$).

Remark. This can be sharpened: if the rates $(R_1, R_2, \ldots, R_M)$ are achievable, then there exists some joint probability distribution $p(x_1, x_2, \ldots, x_M)$ such that for every cut $S$,

$$\sum_{S \to S^c} R_k \le I(X_S; Y_{S^c} \mid X_{S^c}).$$

Source-channel Cut-sets (1/2)

Fix the coding scheme $(F_1, F_2, \ldots, F_M, G)$. Is it optimal? Place a source-channel cut through the network, here between the channel inputs $(X_1, \ldots, X_M)$ and the channel output $Y$.

Sufficient condition for optimality: $R_S(\Delta) = C_{(X_1, X_2, \ldots, X_M) \to Y}(\Gamma)$. Gaussian case:

$$D \ge \frac{\sigma_S^2 \sigma_Z^2}{M^2 P + \sigma_Z^2}.$$

Equivalently, using measure-matching conditions,

$$\rho(x_1, x_2, \ldots, x_M) = D(p_{Y|x_1, x_2, \ldots, x_M} \| p_Y),$$
$$d(s, \hat{s}) = -\log_2 p(s | \hat{s}),$$
$$I(S; \hat{S}) = I(X_1 X_2 \cdots X_M; Y).$$

Source-channel Cut-sets (2/2)

Fix the coding scheme $(F_1, F_2, \ldots, F_M, G)$. Is it optimal? Now place the source-channel cut between the source $S$ and the observations $(U_1, \ldots, U_M)$.

Sufficient condition for optimality: $R_S(\Delta) = C_{S \to (U_1, U_2, \ldots, U_M)}(\Gamma)$. Gaussian case:

$$D \ge \frac{\sigma_S^2 \sigma_W^2}{M \sigma_S^2 + \sigma_W^2}.$$

Equivalently, using measure-matching conditions,

$$\rho(s) = D(p_{U_1, U_2, \ldots, U_M | s} \| p_{U_1, U_2, \ldots, U_M}),$$
$$d(s, \hat{s}) = -\log_2 p(s | \hat{s}),$$
$$I(S; \hat{S}) = I(S; U_1 U_2 \cdots U_M).$$

Sensor Network: Measure-matching (recap)

Theorem. If the coding system $F_1, F_2, \ldots, F_M, G$ satisfies the cost constraint $E\rho(X_1, X_2, \ldots, X_M) \le \Gamma$, and

$$d(s, \hat{s}) = -\log_2 p(s | \hat{s}),$$
$$I(S; U_1 U_2 \cdots U_M) = I(S; \hat{S}),$$

then it performs optimally.

Gaussian Example

1. The uncoded scheme satisfies the condition $d(s, \hat{s}) = -\log_2 p(s | \hat{s})$ for any $M$, since $p(s | \hat{s})$ is Gaussian. More generally, this is true as soon as the sum of the measurement noises $W_k$, $k = 1, \ldots, M$, is Gaussian.

2. For the mutual information, for large $M$,

$$I(S; U_1 U_2 \cdots U_M) - I(S; \hat{S}) \le c_1 \log_2\left(1 + \frac{c_2}{M}\right),$$

hence the second measure-matching condition is approached as $M \to \infty$.
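
A numeric illustration of the second condition closing (my sketch, written for the real-valued Gaussian case with assumed parameters): the gap $I(S; U_1 \cdots U_M) - I(S; \hat{S})$ under the uncoded scheme shrinks toward zero as $M$ grows.

```python
import numpy as np

# Sketch: gap between I(S; U_1...U_M) and I(S; S_hat) for the uncoded
# scheme, real-valued Gaussian case; parameter values are assumptions.
var_s, var_w, var_z, power = 1.0, 1.0, 1.0, 1.0

for M in (1, 10, 100, 1000):
    i_u = 0.5 * np.log2(1.0 + M * var_s / var_w)              # I(S; U_1...U_M)
    c = (var_z / var_w) * (var_s + var_w) / power
    d_1 = var_s * var_w / (M**2 / (M + c) * var_s + var_w)    # uncoded distortion
    i_hat = 0.5 * np.log2(var_s / d_1)                        # I(S; S_hat), Gaussian
    print(f"M={M:5d}  gap = {i_u - i_hat:.4f} bits")
```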

Measure-matching as a coding paradigm

Second observation from the Gaussian sensor network example: even for finite $M$, uncoded transmission considerably outperforms the best separation-based coding strategies.

Coding Paradigm. The goal of the coding scheme in the sensor network topology is to approach

$$d(s, \hat{s}) = -\log_2 p(s | \hat{s}),$$
$$I(S; U_1 U_2 \cdots U_M) = I(S; \hat{S}),$$

as closely as possible. The precise meaning of "as closely as possible" remains to be determined.

Slightly Extended Topologies

Two variations on the sensor network: (i) communication between the sensors, where the sensors may exchange information before transmitting over the channel; and (ii) sensors assisted by relays, where relay nodes sit between the sensor transmissions $X_1, \ldots, X_M$ and the channel output $Y$.

Slightly Extended Topologies

Key insight: the same outer bound applies. Hence, the same measure-matching condition applies, and in the Gaussian scenario, uncoded transmission, ignoring the communication between the sensors and/or the relay, is asymptotically optimal.

But: communication between the sensors simplifies the task of matching the measures, and relays simplify the task of matching the measures. Can this be quantified?

Conclusions

Rate-matching: yields some achievable cost-distortion pairs for arbitrary network topologies.

Measure-matching: yields some optimal cost-distortion pairs for certain network topologies, including single-source broadcast, the sensor network, the sensor network with communication between the sensors, and the sensor network with relays.

What is Information?

Point-to-point: Information = Bits. Network: Information = ???

References:

1. M. Gastpar and M. Vetterli, "Source-channel communication in sensor networks," IPSN 2003, Springer Lecture Notes in Computer Science, April 2003.
2. M. Gastpar, "Cut-set bounds for source-channel networks," in preparation.