On Source-Channel Communication in Networks
1 On Source-Channel Communication in Networks Michael Gastpar Department of EECS University of California, Berkeley DIMACS: March 17, 2003.
2 Outline 1. Source-Channel Communication seen from the perspective of the separation theorem. 2. Source-Channel Communication seen from the perspective of measure-matching. Acknowledgments: Gerhard Kramer, Bixio Rimoldi, Emre Telatar, Martin Vetterli.
3 Source-Channel Communication Consider the transmission of a discrete-time memoryless source across a discrete-time memoryless channel: Source S → F → X → Channel → Y → G → Ŝ → Destination, with F(S^n) = X^n and G(Y^n) = Ŝ^n. The fundamental trade-off is cost versus distortion: Δ = E d(S^n, Ŝ^n) and Γ = E ρ(X^n). What is the set of achievable trade-offs (Γ, Δ)? Of optimal trade-offs (Γ, Δ)?
4 The Separation Theorem Source S → F → X → Channel → Y → G → Ŝ → Destination, with F(S^n) = X^n and G(Y^n) = Ŝ^n. For a fixed source (p_S, d) and a fixed channel (p_{Y|X}, ρ): A cost-distortion pair (Γ, Δ) is achievable if and only if R(Δ) ≤ C(Γ). A cost-distortion pair (Γ, Δ) is optimal if and only if R(Δ) = C(Γ), subject to certain technicalities. Rate-matching: In an optimal communication system, the minimum source rate is matched (i.e., equal) to the maximum channel rate.
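The rate-matching condition can be checked numerically in the simplest Gaussian case. The sketch below is illustrative, not from the talk: the helper names `rate_distortion_gaussian` and `capacity_awgn` and the parameter values are mine.

```python
import math

def rate_distortion_gaussian(var_s, D):
    """R(D) of a Gaussian source under mean-squared error, in bits."""
    return max(0.0, 0.5 * math.log2(var_s / D))

def capacity_awgn(P, var_z):
    """C(P) of an AWGN channel under an average-power constraint, in bits."""
    return 0.5 * math.log2(1.0 + P / var_z)

# Point-to-point: (Gamma, Delta) is achievable iff R(Delta) <= C(Gamma),
# and optimal when the two are equal (rate-matching).
var_s, var_z, P = 1.0, 1.0, 3.0
C = capacity_awgn(P, var_z)            # channel rate at cost P
D_opt = var_s / (1.0 + P / var_z)      # distortion at which R(D) = C
assert math.isclose(rate_distortion_gaussian(var_s, D_opt), C)
print(f"C = {C:.3f} bits, optimal distortion = {D_opt:.3f}")
```

With these values the matched point is C = 1 bit and D = 0.25.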
5 Source-Channel Communication in Networks Simple source-channel network: Src 1 emits S_1, encoder F_1 produces X_1; Src 2 emits S_2, encoder F_2 produces X_2; the channel outputs Y_1 and Y_2; decoder G_1 forms Ŝ_11 for Dest 1, and decoder G_2 forms Ŝ_12, Ŝ_22 for Dest 2. Trade-off between cost (Γ_1, Γ_2) and distortion (Δ_11, Δ_12, Δ_22). Achievable cost-distortion tuples? Optimal cost-distortion tuples? For the sketched topology, the (full) answer is unknown.
6 These Trade-offs Are Achievable: For a fixed network topology and fixed probability distributions and cost/distortion functions: If a cost-distortion tuple satisfies R(Δ_1, Δ_2, ...) ∩ C(Γ_1, Γ_2, ...) ≠ ∅, then it is achievable. [Figure: the source rate region R and the capacity region C intersecting in the (R_1, R_2)-plane.] When is it optimal?
7 Example: Multi-access source-channel communication Src 1 emits S_1 (encoder F_1, input X_1 ∈ {0, 1}), Src 2 emits S_2 (encoder F_2, input X_2 ∈ {0, 1}); the channel output is Y = X_1 + X_2, and the decoder G forms Ŝ_1, Ŝ_2 for the destination. The capacity region of this channel is contained inside R_1 + R_2 ≤ 1.5 bits. Goal: Reconstruct S_1 and S_2 perfectly. S_1 and S_2 are correlated: Pr(S_1=0, S_2=0) = Pr(S_1=1, S_2=0) = Pr(S_1=1, S_2=1) = 1/3, and Pr(S_1=0, S_2=1) = 0. The required sum rate is R_1 + R_2 ≥ H(S_1, S_2) = log_2 3 ≈ 1.585 bits, so R and C do not intersect. Yet uncoded transmission works. This example appears in T. M. Cover, A. El Gamal, M. Salehi, Multiple access channels with arbitrarily correlated sources. IEEE Trans. IT-26, 1980.
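The numbers in this example can be verified directly. An illustrative sketch (the dictionary layout and variable names are mine):

```python
import math

# Joint pmf of (S1, S2) from the slide: the pair (0, 1) never occurs.
p = {(0, 0): 1/3, (1, 0): 1/3, (1, 1): 1/3, (0, 1): 0.0}

# Lossless transmission needs R1 + R2 >= H(S1, S2) = log2(3) bits...
H_joint = -sum(q * math.log2(q) for q in p.values() if q > 0)

# ...but the adder MAC Y = X1 + X2 with independent inputs has sum rate at
# most max H(Y) = H(1/4, 1/2, 1/4) = 1.5 bits, so R and C do not intersect.
sum_capacity = -(2 * 0.25 * math.log2(0.25) + 0.5 * math.log2(0.5))
assert H_joint > sum_capacity

# Uncoded transmission: X1 = S1, X2 = S2. Because (0, 1) has probability
# zero, Y = S1 + S2 takes a distinct value for each pair that can occur,
# so the decoder recovers (S1, S2) without error.
decodable = len({s1 + s2 for (s1, s2), q in p.items() if q > 0}) == 3
assert decodable
print(f"H(S1,S2) = {H_joint:.3f} bits > sum capacity = {sum_capacity:.3f} bits")
```

The separation argument fails, yet the zero-probability pair makes the channel output self-decoding.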
8 So what is capacity? The capacity region is computed assuming independent messages. In a source-channel context, the underlying sources may be dependent. MAC example: Allowing arbitrary dependence of the channel inputs, the maximum sum rate becomes log_2 3 ≈ 1.585 bits, fixing the example: now R ∩ C ≠ ∅. Can we simply redefine capacity appropriately? Remark: Multi-access with dependent messages is still an open problem. T. M. Cover, A. El Gamal, M. Salehi, Multiple access channels with arbitrarily correlated sources. IEEE Trans. IT-26, 1980.
9 Separation Strategies for Networks In order to retain a notion of capacity: discrete messages are transmitted reliably. Each encoder F_k is split into a source encoder followed by a channel encoder, and each decoder G_k into a channel decoder followed by a source decoder. What is the best achievable performance for such a system? The general answer is unknown.
10 Example: Broadcast Src S → F → X; receiver 1 observes Y_1 = X + Z_1 (decoder G_1, estimate Ŝ_1, Dest 1), and receiver 2 observes Y_2 = X + Z_2 (decoder G_2, estimate Ŝ_2, Dest 2). S, Z_1, Z_2 are i.i.d. Gaussian. Goal: Minimize the mean-squared errors Δ_1 and Δ_2. Denote by Δ_1* and Δ_2* the single-user minima. Δ_1* and Δ_2* cannot be achieved simultaneously by sending messages reliably: the messages disturb one another. But uncoded transmission achieves Δ_1* and Δ_2* simultaneously. This cannot be fixed by altering the definitions of capacity and/or rate-distortion regions.
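The broadcast claim can be checked in the scalar Gaussian case: uncoded transmission attains each receiver's single-user optimum simultaneously. A sketch with arbitrary illustrative parameter values:

```python
import math

# Gaussian broadcast: S ~ N(0, var_s), receiver i sees Y_i = X + Z_i.
var_s, P = 1.0, 2.0
noise = {1: 0.5, 2: 2.0}          # variances of Z_1 and Z_2 (illustrative)

results = {}
for i, var_z in noise.items():
    # Single-user optimum from R(D) = C: D_i* = var_s / (1 + P/var_z).
    D_single = var_s / (1.0 + P / var_z)
    # Uncoded scheme: X = sqrt(P/var_s) * S; MMSE of S from Y_i = X + Z_i.
    alpha = math.sqrt(P / var_s)
    D_uncoded = var_s - (alpha * var_s) ** 2 / (P + var_z)
    assert math.isclose(D_single, D_uncoded)
    results[i] = (D_single, D_uncoded)
    print(f"receiver {i}: single-user optimum = uncoded = {D_uncoded:.3f}")
```

One transmitted signal is simultaneously optimal for both noise levels, which no reliable-message scheme can match.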
11 Alternative approach Source S → F → X → Channel → Y → G → Ŝ → Destination, with F(S^n) = X^n, G(Y^n) = Ŝ^n, and cost constraint E ρ(X^n) ≤ Γ. A code (F, G) performs optimally if and only if it satisfies R(Δ) = C(Γ) (subject to certain technical conditions). Equivalently, a code (F, G) performs optimally if and only if
ρ(x^n) = c_1 D(p_{Y^n|x^n} || p_{Y^n}) + ρ_0,
d(s^n, ŝ^n) = -c_2 log_2 p(s^n|ŝ^n) + d_0(s^n), and
I(S^n; Ŝ^n) = I(X^n; Y^n).
We call these the measure-matching conditions.
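For the scalar Gaussian channel with uncoded transmission, the third matching condition can be verified numerically (the first two hold in the Gaussian case because the quadratic cost and squared-error distortion are affine in the relevant divergence and log-likelihood). An illustrative sketch; parameter values are mine:

```python
import math

# Gaussian point-to-point, uncoded: X = sqrt(P/var_s) * S, Y = X + Z,
# S_hat = MMSE estimate of S from Y.  Check I(S; S_hat) = I(X; Y).
var_s, var_z, P = 1.0, 1.0, 3.0

I_xy = 0.5 * math.log2(1.0 + P / var_z)   # channel mutual information
D = var_s / (1.0 + P / var_z)             # MMSE distortion of the uncoded scheme
I_ss = 0.5 * math.log2(var_s / D)         # Gaussian I(S; S_hat) = 0.5 log2(var_s/D)
assert math.isclose(I_xy, I_ss)
print(f"I(X;Y) = I(S;S_hat) = {I_xy:.3f} bits")
```

The single-letter code passes all the mutual information through to the reconstruction, which is exactly what the third condition demands.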
12 Single-source Broadcast Src S → F → X → Chan → (Y_1, Y_2); decoder G_1 forms Ŝ_1 for Dest 1, and decoder G_2 forms Ŝ_2 for Dest 2. Measure-matching conditions for single-source broadcast: If the single-source broadcast communication system satisfies
ρ(x) = c_1^(1) D(p_{Y_1|x} || p_{Y_1}) + ρ_0^(1) = c_1^(2) D(p_{Y_2|x} || p_{Y_2}) + ρ_0^(2),
d_1(s, ŝ_1) = -c_2^(1) log_2 p(s|ŝ_1) + d_0^(1)(s),
d_2(s, ŝ_2) = -c_2^(2) log_2 p(s|ŝ_2) + d_0^(2)(s),
I(X; Y_1) = I(S; Ŝ_1), and I(X; Y_2) = I(S; Ŝ_2),
then it performs optimally.
13 Sensor Network M wireless sensors measure physical phenomena characterized by S. Sensor k observes U_k and encodes it as X_k = F_k(U_k); the channel output Y is decoded by G into Ŝ at the destination.
14 Gaussian Sensor Network The observations U_1, U_2, ..., U_M are noisy versions of S: U_k = S + W_k. The encoders obey the total power constraint Σ_{k=1}^M E|X_k|^2 ≤ MP, and the channel output is Y = Σ_k X_k + Z.
15 Gaussian Sensor Network: Bits Consider the following communication strategy: each sensor first compresses its observation into bits (source encoder) and then transmits those bits reliably across the multi-access channel (channel encoder); the decoder G recovers the bits and reconstructs Ŝ. The total power constraint Σ_{k=1}^M E|X_k|^2 ≤ MP still applies.
16 Gaussian Sensor Network: Bits (1/2) Source coding part: this is the CEO problem. See Berger, Zhang, Viswanathan (1996); Viswanathan and Berger (1997); Oohama (1998). Here S ~ N_c(0, σ_S^2) and, for k = 1, ..., M, W_k ~ N_c(0, σ_W^2). For large total rate R_tot, the behavior is D_CEO(R_tot) ≈ σ_W^2 / R_tot.
17 Gaussian Sensor Network: Bits (2/2) Channel coding part: for the additive white Gaussian multi-access channel, R_sum ≤ log(1 + MP/σ_Z^2). However, the codewords may be dependent. Therefore, the sum rate may be up to R_sum ≤ log(1 + M^2 P/σ_Z^2). Hence, the distortion for a system that satisfies the rate-matching condition is at least
D_rm(M) ≳ σ_W^2 / log_2(1 + M^2 P/σ_Z^2).
Is this optimal?
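The separation floor above decays only logarithmically in M. The following sketch makes that concrete; as on the slide, constant factors are dropped, and the unit parameter values are illustrative:

```python
import math

def D_rm(M, P=1.0, var_w=1.0, var_z=1.0):
    """Rate-matching distortion floor: the CEO distortion sigma_W^2 / R_tot
    evaluated at the largest conceivable sum rate log2(1 + M^2 P / sigma_Z^2)
    (constant factors dropped, as on the slide)."""
    return var_w / math.log2(1.0 + M * M * P / var_z)

for M in (10, 100, 1000):
    print(f"M = {M:4d}: rate-matching distortion >= {D_rm(M):.4f}")
```

Multiplying the number of sensors by 100 barely halves the separation-based distortion, foreshadowing the comparison with uncoded transmission on the next slides.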
18 Gaussian Sensor Network: Uncoded transmission Consider instead the following coding strategy: sensor k simply scales its observation, X_k = α_k U_k, subject to the total power constraint Σ_{k=1}^M E|X_k|^2 ≤ MP.
19 Gaussian Sensor Network: Uncoded transmission Strategy: The sensors transmit whatever they measure, scaled to their power constraint, without any coding at all. The channel output is then
Y[n] = sqrt(P/(σ_S^2 + σ_W^2)) · ( M·S[n] + Σ_{k=1}^M W_k[n] ) + Z[n].
If the decoder forms the minimum mean-squared error estimate of S based on Y, the following distortion is incurred.
Proposition 1. Uncoded transmission achieves
D^1(MP) = σ_S^2 σ_W^2 / [ (M^2 / (M + (σ_Z^2/σ_W^2)(σ_S^2 + σ_W^2)/P)) σ_S^2 + σ_W^2 ].
This is better than separation (D_rm ~ 1/log M, whereas D^1 ~ 1/M). In this sense, uncoded transmission beats capacity. Is it optimal?
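Proposition 1 can be cross-checked by Monte-Carlo simulation of the uncoded scheme. A sketch assuming unit variances by default; the function names and trial counts are mine:

```python
import math, random

def D_uncoded(M, P=1.0, var_s=1.0, var_w=1.0, var_z=1.0):
    """Closed-form MMSE of uncoded transmission (Proposition 1)."""
    c = (var_z / var_w) * (var_s + var_w) / P
    return var_s * var_w / ((M * M / (M + c)) * var_s + var_w)

def simulate(M, trials=20000, P=1.0, var_s=1.0, var_w=1.0, var_z=1.0, seed=0):
    """Monte-Carlo estimate of the same distortion."""
    rng = random.Random(seed)
    alpha = math.sqrt(P / (var_s + var_w))
    # Linear MMSE coefficient for estimating S from Y (jointly Gaussian).
    var_y = alpha ** 2 * (M ** 2 * var_s + M * var_w) + var_z
    gain = alpha * M * var_s / var_y
    err = 0.0
    for _ in range(trials):
        s = rng.gauss(0.0, math.sqrt(var_s))
        y = alpha * (M * s + sum(rng.gauss(0.0, math.sqrt(var_w))
                                 for _ in range(M))) + rng.gauss(0.0, math.sqrt(var_z))
        err += (s - gain * y) ** 2
    return err / trials

M = 20
print(f"closed form: {D_uncoded(M):.5f}, simulated: {simulate(M):.5f}")
```

With 20,000 trials the empirical distortion lands within about one percent of the closed-form value.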
20 Gaussian Sensor Network: An outer bound Suppose the decoder has direct access to U_1, U_2, ..., U_M. The smallest distortion for our sensor network cannot be smaller than the smallest distortion for this idealization:
D_min,ideal = σ_S^2 σ_W^2 / (M σ_S^2 + σ_W^2).
21 Gaussian Sensor Network: Asymptotic optimum Rate-matching: D_rm(MP) ≳ σ_W^2 / log_2(1 + M^2 P/σ_Z^2). Uncoded transmission: D^1(MP) = σ_S^2 σ_W^2 / [ (M^2 / (M + (σ_Z^2/σ_W^2)(σ_S^2 + σ_W^2)/P)) σ_S^2 + σ_W^2 ]. Proposition 2. As the number of sensors becomes large, the optimum trade-off is
D(MP) ≈ σ_S^2 σ_W^2 / (M σ_S^2 + σ_W^2).
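Proposition 2 can be illustrated numerically: the ratio of the uncoded distortion to the outer bound tends to 1, while the separation floor stays orders of magnitude larger. A sketch with illustrative unit parameters:

```python
import math

var_s = var_w = var_z = P = 1.0   # illustrative parameters

def D_uncoded(M):
    """Proposition 1 (uncoded transmission)."""
    c = (var_z / var_w) * (var_s + var_w) / P
    return var_s * var_w / ((M * M / (M + c)) * var_s + var_w)

def D_ideal(M):
    """Outer bound: decoder observes U_1, ..., U_M directly."""
    return var_s * var_w / (M * var_s + var_w)

def D_rm(M):
    """Separation-based floor (constant factors dropped)."""
    return var_w / math.log2(1.0 + M * M * P / var_z)

for M in (10, 100, 1000):
    print(f"M = {M:4d}: uncoded/outer = {D_uncoded(M) / D_ideal(M):.4f}, "
          f"separation floor = {D_rm(M):.4f}")
```

Already at M = 1000 the uncoded scheme is within a fraction of a percent of the outer bound, while any rate-matching scheme remains stuck near the logarithmic floor.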
22 Gaussian Sensor Network: Conclusions Two conclusions from the Gaussian sensor network example: 1. Uncoded transmission is asymptotically optimal. This leads to a general measure-matching condition. 2. Even for finite M, uncoded transmission considerably outperforms the best separation-based coding strategies. This suggests an alternative coding paradigm for source-channel networks.
23 Sensor Network: Measure-matching Theorem. If the coding system F_1, F_2, ..., F_M, G satisfies the cost constraint E ρ(X_1, X_2, ..., X_M) ≤ Γ, and
d(s, ŝ) = -log_2 p(s|ŝ) and
I(S; U_1 U_2 ... U_M) = I(S; Ŝ),
then it performs optimally.
24 Proof: Cut-sets Outer bound on the capacity region of a network: partition the nodes into a cut S and its complement S^c. If the rates (R_1, R_2, ..., R_M) are achievable, they must satisfy, for every cut S:
Σ_{k ∈ S} R_k ≤ max_{p(x_1, x_2, ..., x_M)} I(X_S; Y_{S^c} | X_{S^c}).
Hence, if a scheme satisfies, for some cut S, the above with equality, then it is optimal (with respect to S). Remark. This can be sharpened: if the rates (R_1, R_2, ..., R_M) are achievable, then there exists some joint probability distribution p(x_1, x_2, ..., x_M) such that for every cut S:
Σ_{k ∈ S} R_k ≤ I(X_S; Y_{S^c} | X_{S^c}).
25 Source-channel Cut-sets (1/2) Fix the coding scheme (F_1, F_2, ..., F_M, G). Is it optimal? Place a source-channel cut through the network, here between the channel inputs (X_1, ..., X_M) and the channel output Y. Sufficient condition for optimality: R_S(Δ) = C_{(X_1, X_2, ..., X_M) → Y}(Γ). Gaussian case: D ≥ σ_S^2 σ_Z^2 / (M^2 P + σ_Z^2). Equivalently, using measure-matching conditions:
ρ(x_1, x_2, ..., x_M) = D(p_{Y|x_1, x_2, ..., x_M} || p_Y),
d(s, ŝ) = -log_2 p(s|ŝ),
I(S; Ŝ) = I(X_1 X_2 ... X_M; Y).
26 Source-channel Cut-sets (2/2) Fix the coding scheme (F_1, F_2, ..., F_M, G). Is it optimal? Place a source-channel cut through the network, here between the source S and the observations (U_1, ..., U_M). Sufficient condition for optimality: R_S(Δ) = C_{S → (U_1, U_2, ..., U_M)}(Γ). Gaussian case: D ≥ σ_S^2 σ_W^2 / (M σ_S^2 + σ_W^2). Equivalently, using measure-matching conditions:
ρ(s) = D(p_{U_1, U_2, ..., U_M|s} || p_{U_1, U_2, ..., U_M}),
d(s, ŝ) = -log_2 p(s|ŝ),
I(S; Ŝ) = I(S; U_1 U_2 ... U_M).
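The two Gaussian cut-set evaluations can be compared against the uncoded scheme's distortion: the source-side cut is the binding one, and uncoded transmission approaches it as M grows. An illustrative sketch with unit parameters:

```python
import math

var_s = var_w = var_z = P = 1.0   # illustrative parameters

def D_uncoded(M):
    c = (var_z / var_w) * (var_s + var_w) / P
    return var_s * var_w / ((M * M / (M + c)) * var_s + var_w)

def cut_channel(M):
    """Cut between (X_1..X_M) and Y: D >= var_s var_z / (M^2 P + var_z)."""
    return var_s * var_z / (M * M * P + var_z)

def cut_source(M):
    """Cut between S and (U_1..U_M): D >= var_s var_w / (M var_s + var_w)."""
    return var_s * var_w / (M * var_s + var_w)

for M in (10, 100, 1000):
    D = D_uncoded(M)
    assert D >= cut_channel(M) and D >= cut_source(M)
    print(f"M = {M:4d}: uncoded {D:.2e}, source cut {cut_source(M):.2e}, "
          f"channel cut {cut_channel(M):.2e}")
```

The channel-side cut decays like 1/M^2 and is therefore loose here; it is the source-side cut, decaying like 1/M, that the uncoded scheme meets asymptotically.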
27 Sensor Network: Measure-matching Theorem. If the coding system F_1, F_2, ..., F_M, G satisfies the cost constraint E ρ(X_1, X_2, ..., X_M) ≤ Γ, and
d(s, ŝ) = -log_2 p(s|ŝ) and
I(S; U_1 U_2 ... U_M) = I(S; Ŝ),
then it performs optimally.
28 Gaussian Example 1. The uncoded scheme satisfies the condition d(s, ŝ) = -log_2 p(s|ŝ) for any M, since p(s|ŝ) is Gaussian. More generally, this is true as soon as the sum of the measurement noises W_k, k = 1, ..., M, is Gaussian. 2. For the mutual information, for large M,
I(S; U_1 U_2 ... U_M) - I(S; Ŝ) ≤ c_1 log_2(1 + c_2 M^2) / M,
hence the second measure-matching condition is approached as M → ∞.
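The vanishing mutual-information gap can also be computed exactly in the Gaussian case, since both mutual informations take the form 0.5·log_2(σ_S^2/D) for the appropriate distortion D. A sketch with illustrative unit parameters:

```python
import math

var_s = var_w = var_z = P = 1.0   # illustrative parameters

def mi_gap(M):
    """I(S; U_1..U_M) - I(S; S_hat) in bits for the uncoded scheme.
    Each term is a Gaussian mutual information 0.5*log2(var_s / D)."""
    c = (var_z / var_w) * (var_s + var_w) / P
    D_unc = var_s * var_w / ((M * M / (M + c)) * var_s + var_w)
    D_obs = var_s * var_w / (M * var_s + var_w)   # MMSE given U_1..U_M
    return 0.5 * math.log2(D_unc / D_obs)

for M in (10, 100, 1000):
    print(f"M = {M:4d}: information gap = {mi_gap(M):.4f} bits")
```

The gap shrinks roughly like 1/M, consistent with the bound quoted on the slide.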
29 Measure-matching as a coding paradigm Second observation from the Gaussian sensor network example: 2. Even for finite M, uncoded transmission considerably outperforms the best separation-based coding strategies. Coding Paradigm. The goal of the coding scheme in the sensor network topology is to approach
d(s, ŝ) = -log_2 p(s|ŝ) and I(S; U_1 U_2 ... U_M) = I(S; Ŝ)
as closely as possible. The precise meaning of "as closely as possible" remains to be determined.
30 Slightly Extended Topologies (i) Communication between the sensors. (ii) Sensors assisted by a relay R_1 between the sensors and the destination.
31 Slightly Extended Topologies Key insight: The same outer bound applies. Hence, the same measure-matching condition applies, and, in the Gaussian scenario, uncoded transmission, ignoring the communication between the sensors and/or the relay, is asymptotically optimal. But: communication between the sensors simplifies the task of matching the measures, and relays simplify the task of matching the measures. Can this be quantified?
32 Conclusions Rate-matching: Yields some achievable cost-distortion pairs for arbitrary network topologies. Measure-matching: Yields some optimal cost-distortion pairs for certain network topologies, including single-source broadcast, the sensor network, the sensor network with communication between the sensors, and the sensor network with relays.
33 What is Information? Point-to-point: Information = Bits. Network: Information = ??? References: 1. M. Gastpar and M. Vetterli, Source-channel communication in sensor networks, IPSN 2003 and Springer Lecture Notes in Computer Science, April 2003. 2. M. Gastpar, Cut-set bounds for source-channel networks, in preparation.
Lecture 1: The Multiple Access Channel Copyright G. Caire 12 Outline Two-user MAC. The Gaussian case. The K-user case. Polymatroid structure and resource allocation problems. Copyright G. Caire 13 Two-user
More informationInterference Channels with Source Cooperation
Interference Channels with Source Cooperation arxiv:95.319v1 [cs.it] 19 May 29 Vinod Prabhakaran and Pramod Viswanath Coordinated Science Laboratory University of Illinois, Urbana-Champaign Urbana, IL
More informationarxiv: v1 [cs.it] 4 Jun 2018
State-Dependent Interference Channel with Correlated States 1 Yunhao Sun, 2 Ruchen Duan, 3 Yingbin Liang, 4 Shlomo Shamai (Shitz) 5 Abstract arxiv:180600937v1 [csit] 4 Jun 2018 This paper investigates
More informationThe Method of Types and Its Application to Information Hiding
The Method of Types and Its Application to Information Hiding Pierre Moulin University of Illinois at Urbana-Champaign www.ifp.uiuc.edu/ moulin/talks/eusipco05-slides.pdf EUSIPCO Antalya, September 7,
More informationSide Information Aware Coding Strategies for Estimation under Communication Constraints
MIT Research Laboratory of Electronics, Tech. Report # 704 Side Information Aware Coding Strategies for Estimation under Communication Constraints Stark C. Draper and Gregory W. Wornell Abstract We develop
More informationOn the Duality of Gaussian Multiple-Access and Broadcast Channels
On the Duality of Gaussian ultiple-access and Broadcast Channels Xiaowei Jin I. INTODUCTION Although T. Cover has been pointed out in [] that one would have expected a duality between the broadcast channel(bc)
More informationOn Side-Informed Coding of Noisy Sensor Observations
On Side-Informed Coding of Noisy Sensor Observations Chao Yu and Gaurav Sharma C Dept, University of Rochester, Rochester NY 14627 ABSTRACT In this paper, we consider the problem of side-informed coding
More informationMultiple-Input Multiple-Output Systems
Multiple-Input Multiple-Output Systems What is the best way to use antenna arrays? MIMO! This is a totally new approach ( paradigm ) to wireless communications, which has been discovered in 95-96. Performance
More informationSHARED INFORMATION. Prakash Narayan with. Imre Csiszár, Sirin Nitinawarat, Himanshu Tyagi, Shun Watanabe
SHARED INFORMATION Prakash Narayan with Imre Csiszár, Sirin Nitinawarat, Himanshu Tyagi, Shun Watanabe 2/41 Outline Two-terminal model: Mutual information Operational meaning in: Channel coding: channel
More informationSource-Channel Coding Techniques in the Presence of Interference and Noise
Source-Channel Coding Techniques in the Presence of Interference and Noise by Ahmad Abou Saleh A thesis submitted to the Department of Electrical and Computer Engineering in conformity with the requirements
More informationLattices for Distributed Source Coding: Jointly Gaussian Sources and Reconstruction of a Linear Function
Lattices for Distributed Source Coding: Jointly Gaussian Sources and Reconstruction of a Linear Function Dinesh Krithivasan and S. Sandeep Pradhan Department of Electrical Engineering and Computer Science,
More informationOn Multiple User Channels with State Information at the Transmitters
On Multiple User Channels with State Information at the Transmitters Styrmir Sigurjónsson and Young-Han Kim* Information Systems Laboratory Stanford University Stanford, CA 94305, USA Email: {styrmir,yhk}@stanford.edu
More informationOn the Power Allocation for Hybrid DF and CF Protocol with Auxiliary Parameter in Fading Relay Channels
On the Power Allocation for Hybrid DF and CF Protocol with Auxiliary Parameter in Fading Relay Channels arxiv:4764v [csit] 4 Dec 4 Zhengchuan Chen, Pingyi Fan, Dapeng Wu and Liquan Shen Tsinghua National
More informationCommunication Games on the Generalized Gaussian Relay Channel
Communication Games on the Generalized Gaussian Relay Channel Dileep M. alathil Department of Electrical Engineering University of Southern California manisser@usc.edu Rahul Jain EE & ISE Departments University
More informationECE Information theory Final
ECE 776 - Information theory Final Q1 (1 point) We would like to compress a Gaussian source with zero mean and variance 1 We consider two strategies In the first, we quantize with a step size so that the
More informationThe Poisson Channel with Side Information
The Poisson Channel with Side Information Shraga Bross School of Enginerring Bar-Ilan University, Israel brosss@macs.biu.ac.il Amos Lapidoth Ligong Wang Signal and Information Processing Laboratory ETH
More informationSelective Coding Strategy for Unicast Composite Networks
Selective Coding Strategy for Unicast Composite Networs Arash Behboodi Dept of elecommunications, SUPELEC 91192 Gif-sur-Yvette, France Email: arashbehboodi@supelecfr Pablo Piantanida Dept of elecommunications,
More informationSource-Channel Coding Theorems for the Multiple-Access Relay Channel
Source-Channel Coding Theorems for the Multiple-Access Relay Channel Yonathan Murin, Ron Dabora, and Deniz Gündüz Abstract We study reliable transmission of arbitrarily correlated sources over multiple-access
More informationCompressed Sensing Using Bernoulli Measurement Matrices
ITSchool 11, Austin Compressed Sensing Using Bernoulli Measurement Matrices Yuhan Zhou Advisor: Wei Yu Department of Electrical and Computer Engineering University of Toronto, Canada Motivation Motivation
More informationRECENT advances in technology have led to increased activity
IEEE TRANSACTIONS ON AUTOMATIC CONTROL, VOL 49, NO 9, SEPTEMBER 2004 1549 Stochastic Linear Control Over a Communication Channel Sekhar Tatikonda, Member, IEEE, Anant Sahai, Member, IEEE, and Sanjoy Mitter,
More informationA Novel Asynchronous Communication Paradigm: Detection, Isolation, and Coding
A Novel Asynchronous Communication Paradigm: Detection, Isolation, and Coding The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters. Citation
More informationThe Role of Directed Information in Network Capacity
The Role of Directed Information in Network Capacity Sudeep Kamath 1 and Young-Han Kim 2 1 Department of Electrical Engineering Princeton University 2 Department of Electrical and Computer Engineering
More information