Lecture #3: Distributed Lossless Compression
(Reading: NIT 10, 4.4)

- Distributed lossless source coding
- Lossless source coding via random binning
- Time sharing
- Achievability proof of the Slepian–Wolf theorem
- Extension to more than two sources

Copyright Abbas El Gamal and Young-Han Kim

Distributed lossless compression system

[Figure: encoder 1 maps $X_1^n$ to $M_1$ and encoder 2 maps $X_2^n$ to $M_2$; the decoder outputs $(\hat{X}_1^n, \hat{X}_2^n)$ from $(M_1, M_2)$.]

- Two-component DMS (2-DMS): $(\mathcal{X}_1 \times \mathcal{X}_2, p(x_1, x_2))$
- A $(2^{nR_1}, 2^{nR_2}, n)$ code consists of:
  - two encoders $m_1(x_1^n) \in [1 : 2^{nR_1}]$ and $m_2(x_2^n) \in [1 : 2^{nR_2}]$, and
  - a decoder $(\hat{x}_1^n, \hat{x}_2^n)(m_1, m_2)$
- Probability of error: $P_e^{(n)} = \mathrm{P}\{(\hat{X}_1^n, \hat{X}_2^n) \neq (X_1^n, X_2^n)\}$
- $(R_1, R_2)$ is achievable if there exists a sequence of $(2^{nR_1}, 2^{nR_2}, n)$ codes such that $\lim_{n \to \infty} P_e^{(n)} = 0$
- Optimal rate region $\mathscr{R}^*$: closure of the set of achievable $(R_1, R_2)$
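Once the joint pmf is known, all the rate bounds below reduce to entropy computations. Here is a minimal sketch (my own illustration, not part of the lecture; the helper name `entropies` is mine) that computes the quantities parametrizing the rate region:

```python
# Sketch: entropy quantities for a 2-DMS, given its joint pmf as a 2-D array.
import numpy as np

def entropies(pmf):
    """pmf[x1][x2] = p(x1, x2). Returns the Slepian-Wolf quantities in bits."""
    pmf = np.asarray(pmf, dtype=float)

    def H(q):
        q = np.asarray(q, dtype=float).ravel()
        q = q[q > 0]                       # 0 log 0 = 0 convention
        return float(-np.sum(q * np.log2(q)))

    H12 = H(pmf)                           # H(X1, X2)
    H1 = H(pmf.sum(axis=1))                # H(X1), marginal over x2
    H2 = H(pmf.sum(axis=0))                # H(X2), marginal over x1
    return {"H(X1)": H1, "H(X2)": H2, "H(X1,X2)": H12,
            "H(X1|X2)": H12 - H2, "H(X2|X1)": H12 - H1}
```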
Bounds on the optimal rate region

[Figure: inner bound (individual compression) and outer bound in the $(R_1, R_2)$ plane, with $H(X_1 \mid X_2)$ and $H(X_1)$ marked on the $R_1$ axis and $H(X_2 \mid X_1)$ and $H(X_2)$ on the $R_2$ axis.]

- Sufficient condition for individual compression: $R_1 > H(X_1)$, $R_2 > H(X_2)$
- Necessary condition for centralized compression: $R_1 + R_2 \geq H(X_1, X_2)$
- Can also show that $R_1 \geq H(X_1 \mid X_2)$ and $R_2 \geq H(X_2 \mid X_1)$ are necessary

Slepian–Wolf theorem

Theorem 10.1 (Slepian–Wolf 1973). The optimal rate region $\mathscr{R}^*$ is the set of $(R_1, R_2)$ such that
$$R_1 \geq H(X_1 \mid X_2), \quad R_2 \geq H(X_2 \mid X_1), \quad R_1 + R_2 \geq H(X_1, X_2).$$

[Figure: the Slepian–Wolf region, with corner points $(H(X_1), H(X_2 \mid X_1))$ and $(H(X_1 \mid X_2), H(X_2))$.]
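Given those entropies, membership in the theorem's region is three inequality checks. A small helper (my own, assuming the `entropies()` sketch above):

```python
# Sketch: test whether a rate pair lies in the Slepian-Wolf region of
# Theorem 10.1, reusing the entropies() helper defined above.
def in_sw_region(R1, R2, pmf, tol=1e-12):
    e = entropies(pmf)
    return (R1 >= e["H(X1|X2)"] - tol
            and R2 >= e["H(X2|X1)"] - tol
            and R1 + R2 >= e["H(X1,X2)"] - tol)
```

For instance, with the DSBS(0.01) pmf defined in the next sketch, `in_sw_region(1.0, 0.5, dsbs)` holds, while `in_sw_region(0.5, 0.5, dsbs)` fails the sum-rate constraint.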
Example: doubly symmetric binary source

A doubly symmetric binary source DSBS($p$) is a pair $(X_1, X_2)$ with joint pmf

    p(x_1, x_2)    x_2 = 0      x_2 = 1
    x_1 = 0        (1-p)/2      p/2
    x_1 = 1        p/2          (1-p)/2

Equivalently, $X_1 \sim \mathrm{Bern}(1/2)$ and $X_2 = X_1 \oplus Z$, where $Z \sim \mathrm{Bern}(p)$ is independent of $X_1$.

- Let $p = 0.01$
- Individual compression: 2 bits/symbol-pair
- Slepian–Wolf coding: $H(X_1, X_2) = 1 + H(0.01) \approx 1.0808$ bits/symbol-pair (verified numerically in the sketch after the binning description below)

Lossless source coding via random binning

- Codebook generation: randomly assign an index $m(x^n) \in [1 : 2^{nR}]$ to each sequence $x^n \in \mathcal{X}^n$. The set of sequences with the same index $m$ forms a bin $\mathcal{B}(m)$, $m \in [1 : 2^{nR}]$. Bin assignments are revealed to the encoder and decoder.
- Encoding: upon observing $x^n \in \mathcal{B}(m)$, send the bin index $m$.
- Decoding: find the unique typical sequence $\hat{x}^n \in \mathcal{B}(m)$.

[Figure: the sequence space partitioned into bins $\mathcal{B}(1), \ldots, \mathcal{B}(2^{nR})$, with the typical set $\mathcal{T}_\epsilon^{(n)}$ cutting across all bins.]
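Returning to the DSBS(0.01) numbers quoted above, a quick check with the `entropies()` sketch:

```python
# Numeric check of the DSBS(0.01) example (my own verification).
p = 0.01
dsbs = [[(1 - p) / 2, p / 2],
        [p / 2, (1 - p) / 2]]
print(entropies(dsbs))
# -> H(X1) = H(X2) = 1 and H(X1,X2) = 1 + H(0.01) ~ 1.0808, so Slepian-Wolf
#    needs ~1.08 bits/symbol-pair versus 2 for individual compression.
```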
Analysis of the probability of error

- We bound $P_e^{(n)}$ averaged over random bin assignments.
- Let $M$ denote the random bin index of $X^n$, i.e., $X^n \in \mathcal{B}(M)$. Note that $M \sim \mathrm{Unif}[1 : 2^{nR}]$, independent of $X^n$.
- Error events:
  $$\mathcal{E}_1 = \{X^n \notin \mathcal{T}_\epsilon^{(n)}\}, \quad \text{or} \quad \mathcal{E}_2 = \{\tilde{x}^n \in \mathcal{B}(M) \text{ for some } \tilde{x}^n \neq X^n,\ \tilde{x}^n \in \mathcal{T}_\epsilon^{(n)}\}$$
- Thus
  $$\mathrm{P}(\mathcal{E}) \leq \mathrm{P}(\mathcal{E}_1) + \mathrm{P}(\mathcal{E}_2) = \mathrm{P}(\mathcal{E}_1) + \mathrm{P}(\mathcal{E}_2 \mid X^n \in \mathcal{B}(1))$$
- By the LLN, $\mathrm{P}(\mathcal{E}_1) \to 0$.

Consider
$$\begin{aligned}
\mathrm{P}(\mathcal{E}_2 \mid X^n \in \mathcal{B}(1))
&= \sum_{x^n} \mathrm{P}\{X^n = x^n \mid X^n \in \mathcal{B}(1)\}\,
   \mathrm{P}\{\tilde{x}^n \in \mathcal{B}(1) \text{ for some } \tilde{x}^n \neq x^n,\ \tilde{x}^n \in \mathcal{T}_\epsilon^{(n)} \mid x^n \in \mathcal{B}(1),\ X^n = x^n\} \\
&\leq \sum_{x^n} p(x^n) \sum_{\substack{\tilde{x}^n \in \mathcal{T}_\epsilon^{(n)} \\ \tilde{x}^n \neq x^n}} \mathrm{P}\{\tilde{x}^n \in \mathcal{B}(1) \mid x^n \in \mathcal{B}(1),\ X^n = x^n\} \\
&= \sum_{x^n} p(x^n) \sum_{\substack{\tilde{x}^n \in \mathcal{T}_\epsilon^{(n)} \\ \tilde{x}^n \neq x^n}} \mathrm{P}\{\tilde{x}^n \in \mathcal{B}(1)\} \\
&\leq 2^{n(H(X) + \delta(\epsilon))}\, 2^{-nR}.
\end{aligned}$$

Hence $\mathrm{P}(\mathcal{E}_2) \to 0$ as $n \to \infty$ if $R > H(X) + \delta(\epsilon)$.
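The scheme is simple enough to simulate by brute force at toy blocklengths. A sketch for a Bern($p$) source (parameter choices are mine; at such a short $n$ the error probability is clearly visible, and it shrinks only as $n$ grows):

```python
# Sketch: brute-force random binning for a Bern(p) source at small n.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, p, eps = 14, 0.2, 0.1
R = 0.85                                   # rate above H(X) = H(0.2) ~ 0.72
num_bins = 2 ** int(np.ceil(n * R))

# Random bin assignment for all 2^n sequences (the shared codebook).
seqs = np.array(list(itertools.product([0, 1], repeat=n)), dtype=np.int8)
bins = rng.integers(num_bins, size=len(seqs))
typical = np.abs(seqs.mean(axis=1) - p) <= eps   # weakly typical sequences
weights = 1 << np.arange(n - 1, -1, -1)          # row index of a sequence in seqs

errors, trials = 0, 1000
for _ in range(trials):
    x = rng.binomial(1, p, size=n)               # source emits x^n
    idx = int(x.dot(weights))
    m = bins[idx]                                # encoder sends bin index (nR bits)
    cands = np.flatnonzero((bins == m) & typical)  # decoder: typical seqs in bin m
    if len(cands) != 1 or cands[0] != idx:       # none, many, or wrong -> error
        errors += 1
print(f"empirical error rate: {errors / trials:.3f}")
```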
Achievability via linear binning

- Let $X$ be a Bern($p$) source.
- $R = H(X)$ is achieved via linear binning (hashing).
- Let $H$ be a randomly generated $nR \times n$ binary parity-check matrix.
- The encoder sends $HX^n$.
- The decoder recovers $X^n$ with high probability if $R > H(p)$ (why?).
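A small sketch of this hashing scheme (my own, with illustrative parameters; the exhaustive decoder is only feasible at toy blocklengths):

```python
# Sketch: linear binning (syndrome coding) for a Bern(p) source. The bin
# index of x^n is the syndrome H x^n mod 2; the decoder searches for the
# unique typical sequence with that syndrome.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n, p, eps = 14, 0.2, 0.1
k = int(np.ceil(0.85 * n))           # nR rows, with R = 0.85 > H(0.2) ~ 0.72
H = rng.integers(2, size=(k, n))     # random binary parity-check matrix

x = rng.binomial(1, p, size=n)
syndrome = H @ x % 2                 # encoder sends k = nR bits

# Decoder: collect all typical sequences consistent with the syndrome.
matches = []
for cand in itertools.product([0, 1], repeat=n):
    c = np.array(cand)
    if abs(c.mean() - p) <= eps and np.array_equal(H @ c % 2, syndrome):
        matches.append(c)
# Succeeds iff x is typical and no other typical sequence shares its bin.
print("recovered:", len(matches) == 1 and np.array_equal(matches[0], x))
```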
Time sharing

Proposition 4.1. If $(R_1, R_2), (\tilde{R}_1, \tilde{R}_2) \in \mathscr{R}^*$, then $(\bar{R}_1, \bar{R}_2) = (\alpha R_1 + \bar{\alpha} \tilde{R}_1,\ \alpha R_2 + \bar{\alpha} \tilde{R}_2) \in \mathscr{R}^*$ for every $\alpha \in [0, 1]$; that is, the rate region $\mathscr{R}^*$ is convex.

[Figure: the time-shared point $(\bar{R}_1, \bar{R}_2)$ on the segment between $(R_1, R_2)$ and $(\tilde{R}_1, \tilde{R}_2)$.]

Proof (time-sharing argument):
- Let $\mathcal{C}_k$ be a sequence of $(2^{kR_1}, 2^{kR_2}, k)$ codes with $P_{e1}^{(k)} \to 0$.
- Let $\tilde{\mathcal{C}}_k$ be a sequence of $(2^{k\tilde{R}_1}, 2^{k\tilde{R}_2}, k)$ codes with $P_{e2}^{(k)} \to 0$.
- Construct a new sequence of codes by using $\mathcal{C}_{\alpha n}$ for $i \in [1 : \alpha n]$ and $\tilde{\mathcal{C}}_{\bar{\alpha} n}$ for $i \in [\alpha n + 1 : n]$.
- By the union of events bound, $P_e^{(n)} \leq P_{e1}^{(\alpha n)} + P_{e2}^{(\bar{\alpha} n)} \to 0$.

Remarks:
- Time division is a special case of time sharing (between rate pairs of the form $(R_1, 0)$ and $(0, \tilde{R}_2)$).
- The rate (capacity) region of any source (channel) coding problem is convex.

Achievability proof of the S–W theorem (Cover 1975)

- We show that the corner point $(H(X_1), H(X_2 \mid X_1))$ is achievable.
- Achievability of the other corner point $(H(X_1 \mid X_2), H(X_2))$ follows similarly.
- The rest of the region is achieved using time sharing, as the sketch below illustrates.

[Figure: the Slepian–Wolf region with the two corner points and the time-shared segment between them.]
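As a numeric illustration (my own, reusing `entropies()` and `dsbs` from the sketches above): every time-shared mixture of the two corner points meets the sum-rate bound with equality, tracing out the dominant face of the region.

```python
# Sketch: the two Slepian-Wolf corner points for DSBS(0.01) and time-shared
# mixtures between them.
e = entropies(dsbs)
corner_a = (e["H(X1)"], e["H(X2|X1)"])    # corner proved achievable below
corner_b = (e["H(X1|X2)"], e["H(X2)"])    # the symmetric corner
for alpha in (0.0, 0.25, 0.5, 0.75, 1.0):
    R1 = alpha * corner_a[0] + (1 - alpha) * corner_b[0]
    R2 = alpha * corner_a[1] + (1 - alpha) * corner_b[1]
    print(f"alpha={alpha:.2f}: R1={R1:.4f}, R2={R2:.4f}, R1+R2={R1 + R2:.4f}")
# Every mixture satisfies R1 + R2 = H(X1,X2) ~ 1.0808 with equality.
```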
Achievability of $(H(X_1), H(X_2 \mid X_1))$

Codebook generation:
- Assign a distinct index $m_1 \in [1 : 2^{nR_1}]$ to each $x_1^n \in \mathcal{T}_\epsilon^{(n)}(X_1)$, and $m_1 = 1$ otherwise.
- Randomly assign an index $m_2(x_2^n) \in [1 : 2^{nR_2}]$ to each $x_2^n \in \mathcal{X}_2^n$. The sequences with the same index $m_2$ form a bin $\mathcal{B}(m_2)$.

[Figure: the grid of $(x_1^n, x_2^n)$ pairs, with indices $m_1 \in [1 : 2^{nR_1}]$ along the $x_1^n$ axis, bins $\mathcal{B}(1), \ldots, \mathcal{B}(2^{nR_2})$ along the $x_2^n$ axis, and the jointly typical set $\mathcal{T}_\epsilon^{(n)}(X_1, X_2)$ cutting across.]

Encoding:
- Upon observing $x_1^n$, encoder 1 sends the index $m_1(x_1^n)$.
- Upon observing $x_2^n \in \mathcal{B}(m_2)$, encoder 2 sends $m_2$.
Decoding (the sources are recovered successively):
- Declare $\hat{x}_1^n = x_1^n(m_1)$ for the unique $x_1^n(m_1) \in \mathcal{T}_\epsilon^{(n)}(X_1)$.
- Find the unique $\hat{x}_2^n \in \mathcal{B}(m_2) \cap \mathcal{T}_\epsilon^{(n)}(X_2 \mid \hat{x}_1^n)$.
- If there is none or more than one, the decoder declares an error.

(An end-to-end sketch of this scheme for the DSBS appears after the error events below.)

Analysis of the probability of error:
- Let $M_1$ and $M_2$ denote the random bin indices for $X_1^n$ and $X_2^n$.
- Error events:
  $$\mathcal{E}_1 = \{(X_1^n, X_2^n) \notin \mathcal{T}_\epsilon^{(n)}\}, \quad
  \mathcal{E}_2 = \{\tilde{x}_2^n \in \mathcal{B}(M_2) \text{ for some } \tilde{x}_2^n \neq X_2^n,\ (X_1^n, \tilde{x}_2^n) \in \mathcal{T}_\epsilon^{(n)}\}$$
- Then $\mathrm{P}(\mathcal{E}) \leq \mathrm{P}(\mathcal{E}_1) + \mathrm{P}(\mathcal{E}_2)$, and $\mathrm{P}(\mathcal{E}_1) \to 0$ by the LLN.
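Putting the pieces together, an end-to-end sketch of the corner-point scheme for a DSBS (my own illustration; the crossover $q$, blocklength, and typicality threshold are toy choices, so some residual error remains at this $n$):

```python
# Sketch: the corner-point scheme for DSBS(q). Encoder 1 describes X1^n
# losslessly at R1 = 1 = H(X1) (every x1^n is typical since X1 ~ Bern(1/2));
# encoder 2 bins X2^n at R2 > H(X2|X1) = H(q); the decoder recovers X2^n as
# the unique sequence in the bin that is conditionally typical given x1^n.
import itertools
import numpy as np

rng = np.random.default_rng(2)
n, q, eps = 14, 0.1, 0.1
R2 = 0.65                                      # > H(X2|X1) = H(0.1) ~ 0.47
num_bins = 2 ** int(np.ceil(n * R2))

seqs = np.array(list(itertools.product([0, 1], repeat=n)), dtype=np.int8)
bins = rng.integers(num_bins, size=len(seqs))  # encoder 2's random binning
weights = 1 << np.arange(n - 1, -1, -1)        # row index of a sequence in seqs

errors, trials = 0, 200
for _ in range(trials):
    x1 = rng.integers(2, size=n)               # X1 ~ Bern(1/2)
    x2 = x1 ^ rng.binomial(1, q, size=n)       # X2 = X1 xor Z, Z ~ Bern(q)
    m1 = x1                                    # encoder 1: x1^n itself (R1 = 1)
    m2 = bins[int(x2.dot(weights))]            # encoder 2: bin index of x2^n
    # Decoder: unique sequence in bin m2 whose disagreement rate with x1 is ~ q.
    frac = (seqs ^ m1).mean(axis=1)
    cands = np.flatnonzero((bins == m2) & (np.abs(frac - q) <= eps))
    if len(cands) != 1 or cands[0] != int(x2.dot(weights)):
        errors += 1
print(f"empirical error rate: {errors / trials:.2f}  (vanishes as n grows)")
```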
Analysis of the probability of error (cont.)

By symmetry,
$$\begin{aligned}
\mathrm{P}(\mathcal{E}_2) &= \mathrm{P}(\mathcal{E}_2 \mid X_2^n \in \mathcal{B}(1)) \\
&= \sum_{(x_1^n, x_2^n)} \mathrm{P}\{(X_1^n, X_2^n) = (x_1^n, x_2^n) \mid X_2^n \in \mathcal{B}(1)\}\,
   \mathrm{P}\{\tilde{x}_2^n \in \mathcal{B}(1) \text{ for some } \tilde{x}_2^n \neq x_2^n,\ (x_1^n, \tilde{x}_2^n) \in \mathcal{T}_\epsilon^{(n)} \mid x_2^n \in \mathcal{B}(1),\ (X_1^n, X_2^n) = (x_1^n, x_2^n)\} \\
&\leq \sum_{(x_1^n, x_2^n)} p(x_1^n, x_2^n) \sum_{\substack{\tilde{x}_2^n \in \mathcal{T}_\epsilon^{(n)}(X_2 \mid x_1^n) \\ \tilde{x}_2^n \neq x_2^n}} \mathrm{P}\{\tilde{x}_2^n \in \mathcal{B}(1)\} \\
&\leq 2^{n(H(X_2 \mid X_1) + \delta(\epsilon))}\, 2^{-nR_2}.
\end{aligned}$$

Hence $\mathrm{P}(\mathcal{E}_2) \to 0$ as $n \to \infty$ if $R_2 > H(X_2 \mid X_1) + \delta(\epsilon)$.

Extension to more than two sources

Theorem 10.3. The optimal rate region $\mathscr{R}^*(X_1, \ldots, X_k)$ for the $k$-DMS $(X_1, \ldots, X_k)$ is the set of $(R_1, \ldots, R_k)$ such that
$$\sum_{j \in S} R_j \geq H(X(S) \mid X(S^c)) \quad \text{for all } S \subseteq [1 : k].$$

For $k = 3$, $\mathscr{R}^*(X_1, X_2, X_3)$ is the set of $(R_1, R_2, R_3)$ such that
$$\begin{aligned}
R_1 &\geq H(X_1 \mid X_2, X_3), & R_2 &\geq H(X_2 \mid X_1, X_3), & R_3 &\geq H(X_3 \mid X_1, X_2), \\
R_1 + R_2 &\geq H(X_1, X_2 \mid X_3), & R_1 + R_3 &\geq H(X_1, X_3 \mid X_2), & R_2 + R_3 &\geq H(X_2, X_3 \mid X_1), \\
R_1 + R_2 + R_3 &\geq H(X_1, X_2, X_3).
\end{aligned}$$
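The $k$-source region is just $2^k - 1$ conditional-entropy inequalities, which are easy to enumerate. A sketch (the helper name is mine; for $k = 2$ it reduces to `in_sw_region()` above):

```python
# Sketch: test membership in the k-source Slepian-Wolf region of Theorem 10.3.
# pmf is a k-dimensional array with pmf[x1, ..., xk] = p(x1, ..., xk).
import itertools
import numpy as np

def in_sw_region_k(rates, pmf, tol=1e-12):
    pmf = np.asarray(pmf, dtype=float)
    k = pmf.ndim

    def H(q):
        q = np.asarray(q, dtype=float).ravel()
        q = q[q > 0]
        return float(-np.sum(q * np.log2(q)))

    H_all = H(pmf)
    for r in range(1, k + 1):
        for S in itertools.combinations(range(k), r):
            # H(X(S) | X(S^c)) = H(X1,...,Xk) - H(X(S^c)), where the marginal
            # of X(S^c) is obtained by summing out the coordinates in S.
            h_cond = H_all - H(pmf.sum(axis=S))
            if sum(rates[j] for j in S) < h_cond - tol:
                return False
    return True
```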
Summary

- $k$-component discrete memoryless source ($k$-DMS)
- Distributed lossless source coding for a $k$-DMS: Slepian–Wolf optimal rate region
- Random binning
- Time sharing

References

- Cover, T. M. (1975). A proof of the data compression theorem of Slepian and Wolf for ergodic sources. IEEE Trans. Inf. Theory, 21(2), 226–228.
- Slepian, D. and Wolf, J. K. (1973). Noiseless coding of correlated information sources. IEEE Trans. Inf. Theory, 19(4), 471–480.