Network coding for multicast: relation to compression and generalization of Slepian-Wolf
1 Network coding for multicast: relation to compression and generalization of Slepian-Wolf
2 Overview
- Review of Slepian-Wolf
- Distributed network compression
- Error exponents
- Source-channel separation issues
- Code construction for finite field multiple access networks
3 Distributed data compression
Consider two correlated sources $(X, Y) \sim p(x, y)$ that must be separately encoded for a user who wants to reconstruct both. What information transmission rates from each source allow decoding with arbitrarily small probability of error?
(Figure: encoder for $X_1$ at rate $H(X_1)$ and encoder for $X_2$ at rate $H(X_2 | X_1)$, both feeding a joint decoder.)
4 Distributed source code
A $((2^{nR_1}, 2^{nR_2}), n)$ distributed source code for joint source $(X, Y)$ consists of encoder maps
$$f_1: \mathcal{X}^n \to \{1, 2, \ldots, 2^{nR_1}\}, \qquad f_2: \mathcal{Y}^n \to \{1, 2, \ldots, 2^{nR_2}\}$$
and a decoder map
$$g: \{1, 2, \ldots, 2^{nR_1}\} \times \{1, 2, \ldots, 2^{nR_2}\} \to \mathcal{X}^n \times \mathcal{Y}^n$$
- $X^n$ is mapped to $f_1(X^n)$
- $Y^n$ is mapped to $f_2(Y^n)$
- $(R_1, R_2)$ is the rate pair of the code
5 Probability of error
$$P_e^{(n)} = \Pr\{g(f_1(X^n), f_2(Y^n)) \neq (X^n, Y^n)\}$$
6 Slepian-Wolf
Definitions: A rate pair $(R_1, R_2)$ is achievable if there exists a sequence of $((2^{nR_1}, 2^{nR_2}), n)$ distributed source codes with probability of error $P_e^{(n)} \to 0$ as $n \to \infty$. The achievable rate region is the closure of the set of achievable rates.
Slepian-Wolf Theorem:
7 For the distributed source coding problem for source $(X, Y)$ drawn i.i.d. $\sim p(x, y)$, the achievable rate region is
$$R_1 \geq H(X|Y), \qquad R_2 \geq H(Y|X), \qquad R_1 + R_2 \geq H(X, Y)$$
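To make the region concrete, here is a small sketch (the doubly symmetric binary source with crossover 0.1 is an assumption chosen only for illustration) that evaluates the three bounds; the corner point $(H(X), H(Y|X))$ is the rate pair shown in the figure on slide 3.

```python
import numpy as np

def H(pmf):
    """Entropy in bits of a pmf given as an array of probabilities."""
    p = np.asarray(pmf, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Illustrative joint pmf: X ~ Bernoulli(1/2), Y = X xor N with N ~ Bernoulli(0.1)
q = 0.1
pxy = np.array([[0.5 * (1 - q), 0.5 * q],
                [0.5 * q, 0.5 * (1 - q)]])   # pxy[x, y]

Hxy = H(pxy)                       # H(X, Y)
Hx, Hy = H(pxy.sum(axis=1)), H(pxy.sum(axis=0))
Hx_y, Hy_x = Hxy - Hy, Hxy - Hx    # conditional entropies via the chain rule

print(f"R1 >= H(X|Y)     = {Hx_y:.3f}")
print(f"R2 >= H(Y|X)     = {Hy_x:.3f}")
print(f"R1 + R2 >= H(X,Y) = {Hxy:.3f}")
# Corner point (R1, R2) = (H(X), H(Y|X)) = (1.000, 0.469) lies in the region.
```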
8 Proof of achievability
Main idea: show that if the rate pair is in the Slepian-Wolf region, we can use a random binning encoding scheme with typical set decoding to obtain a probability of error that tends to zero.
Coding scheme: Source $X$ assigns every sourceword $x \in \mathcal{X}^n$ randomly among $2^{nR_1}$ bins, and source $Y$ independently assigns every $y \in \mathcal{Y}^n$ randomly among $2^{nR_2}$ bins. Each sends the bin index corresponding to the message.
9 The receiver decodes correctly if there is exactly one jointly typical sourceword pair corresponding to the received bin indexes; otherwise it declares an error.
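A minimal simulation of this binning scheme, as a sketch under assumed parameters (symmetric binary source with crossover $p = 0.05$, block length $n = 12$, rates $(R_1, R_2) = (1, 0.5)$ inside the region; $n$ is kept tiny so the brute-force typicality search over both bins stays feasible):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

n, eps, p = 12, 0.25, 0.05        # block length, typicality slack, correlation
R1, R2 = 1.0, 0.5                 # rates inside the Slepian-Wolf region

def h2(q):
    return 0.0 if q in (0.0, 1.0) else -q*np.log2(q) - (1-q)*np.log2(1-q)

Hxy = 1.0 + h2(p)                 # X ~ Bern(1/2), Y = X xor Bern(p)

def jointly_typical(x, y):
    # Both marginals are exactly uniform here, so only the joint condition can fail.
    flips = int(np.count_nonzero(x != y))
    logp = -n + flips*np.log2(p) + (n - flips)*np.log2(1 - p)
    return abs(-logp / n - Hxy) <= eps

# Random binning: every length-n sequence gets an independent uniform bin index.
seqs = np.array(list(itertools.product([0, 1], repeat=n)), dtype=np.uint8)
f1 = rng.integers(0, int(2**(n*R1)), size=len(seqs))
f2 = rng.integers(0, int(2**(n*R2)), size=len(seqs))
idx = lambda s: int("".join(map(str, s)), 2)   # row index of a sequence

def decode(i1, i2):
    # Unique jointly typical pair in the two bins, else declare an error.
    cands = [(x, y) for x in seqs[f1 == i1] for y in seqs[f2 == i2]
             if jointly_typical(x, y)]
    return cands[0] if len(cands) == 1 else None

errors, trials = 0, 200
for _ in range(trials):
    x = rng.integers(0, 2, n, dtype=np.uint8)
    y = (x ^ (rng.random(n) < p)).astype(np.uint8)
    out = decode(f1[idx(x)], f2[idx(y)])
    if out is None or not (np.array_equal(out[0], x) and np.array_equal(out[1], y)):
        errors += 1
print(f"empirical error rate: {errors/trials:.2f} (vanishes as n grows)")
```

At this small $n$ a noticeable fraction of errors comes from the true pair falling outside the typical set, exactly the event $E_0$ analyzed in the following slides.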
10 Random binning for single source compression
An encoder that knows the typical set can compress a source $X$ to $H(X) + \epsilon$ without loss, by employing separate codes for typical and atypical sequences.
Random binning is a way to compress a source $X$ to $H(X) + \epsilon$ with asymptotically small probability of error without the encoder knowing the typical set, as long as the decoder knows the typical set:
the encoder maps each source sequence $X^n$ uniformly at random into one of $2^{nR}$ bins
11 the bin index, which is $nR$ bits long, forms the code; the receiver decodes correctly if there is exactly one typical sequence corresponding to the received bin index
12 Error analysis
An error occurs if:
a) the transmitted sourceword is not typical, i.e. event $E_0 = \{X \notin A_\epsilon^{(n)}\}$
b) there exists another typical sourceword in the same bin, i.e. event $E_1 = \{\exists x' \neq X : f(x') = f(X),\ x' \in A_\epsilon^{(n)}\}$
Use the union of events bound:
$$P_e^{(n)} = \Pr(E_0 \cup E_1) \leq \Pr(E_0) + \Pr(E_1)$$
13 Error analysis continued
$\Pr(E_0) \to 0$ by the Asymptotic Equipartition Property (AEP)
$$\Pr(E_1) = \Pr\{\exists x' \neq x : f(x') = f(x),\ x' \in A_\epsilon^{(n)}\} \leq \sum_{x' \in A_\epsilon^{(n)},\, x' \neq x} \Pr(f(x') = f(x)) = |A_\epsilon^{(n)}|\, 2^{-nR} \leq 2^{-nR}\, 2^{n(H(X)+\epsilon)} \to 0 \text{ if } R > H(X)$$
14 For sufficiently large $n$, $\Pr(E_0), \Pr(E_1) < \epsilon$, so $P_e^{(n)} < 2\epsilon$.
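The rate threshold in this bound can also be seen numerically. A sketch (the Bernoulli(0.2) source, slack $\epsilon$, and rates are assumptions for illustration) evaluating the final bound $\Pr(E_1) \leq 2^{-n(R - H(X) - \epsilon)}$:

```python
import numpy as np

h2 = lambda q: -q*np.log2(q) - (1-q)*np.log2(1-q)

Hx, eps = h2(0.2), 0.05            # H(X) ~ 0.722 bits for a Bernoulli(0.2) source
for R in (0.85, 0.70):             # one rate above H(X)+eps, one below
    for n in (100, 400, 1600):
        bound = 2.0 ** (-n * (R - Hx - eps))   # Pr(E1) <= |A_eps| 2^{-nR}
        print(f"R={R}  n={n:5d}  bound={bound:.3e}")
# Above H(X)+eps the bound decays exponentially in n; below, it exceeds 1
# and is vacuous, matching the R > H(X) condition on the previous slide.
```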
15 Jointly typical sequences
The set $A_\epsilon^{(n)}$ of jointly typical sequences is the set of sequences $(x, y) \in \mathcal{X}^n \times \mathcal{Y}^n$ with probability:
$$2^{-n(H(X)+\epsilon)} \leq p_X(x) \leq 2^{-n(H(X)-\epsilon)}$$
$$2^{-n(H(Y)+\epsilon)} \leq p_Y(y) \leq 2^{-n(H(Y)-\epsilon)}$$
$$2^{-n(H(X,Y)+\epsilon)} \leq p_{X,Y}(x, y) \leq 2^{-n(H(X,Y)-\epsilon)}$$
for $(X, Y)$ sequences of length $n$ i.i.d. according to $p_{X,Y}(x, y) = \prod_{i=1}^{n} p_{X,Y}(x_i, y_i)$.
Size of typical set: $|A_\epsilon^{(n)}| \leq 2^{n(H(X,Y)+\epsilon)}$
16 Proof:
$$1 = \sum p(x, y) \geq \sum_{A_\epsilon^{(n)}} p(x, y) \geq |A_\epsilon^{(n)}|\, 2^{-n(H(X,Y)+\epsilon)}$$
17 Conditionally typical sequences
The conditionally typical set $A_\epsilon^{(n)}(X|y)$ for a given typical $y$ sequence is the set of $x$ sequences that are jointly typical with the given $y$ sequence.
Size of conditionally typical set: $|A_\epsilon^{(n)}(X|y)| \leq 2^{n(H(X|Y)+2\epsilon)}$
Proof:
18 For $(x, y) \in A_\epsilon^{(n)}(X, Y)$:
$$p(y) \doteq 2^{-n(H(Y)\pm\epsilon)}, \qquad p(x, y) \doteq 2^{-n(H(X,Y)\pm\epsilon)}$$
$$p(x|y) = \frac{p(x, y)}{p(y)} \doteq 2^{-n(H(X|Y)\pm 2\epsilon)}$$
Hence
$$1 \geq \sum_{x \in A_\epsilon^{(n)}(X|y)} p(x|y) \geq |A_\epsilon^{(n)}(X|y)|\, 2^{-n(H(X|Y)+2\epsilon)}$$
19 Proof of achievability: error analysis
Errors occur if:
a) the transmitted sourcewords are not jointly typical, i.e. event $E_0 = \{(X, Y) \notin A_\epsilon^{(n)}\}$
b) there exists another pair of jointly typical sourcewords in the same pair of bins, i.e. one or more of the following events:
$$E_1 = \{\exists x' \neq X : f_1(x') = f_1(X),\ (x', Y) \in A_\epsilon^{(n)}\}$$
$$E_2 = \{\exists y' \neq Y : f_2(y') = f_2(Y),\ (X, y') \in A_\epsilon^{(n)}\}$$
$$E_{12} = \{\exists (x', y') : x' \neq X,\ y' \neq Y,\ f_1(x') = f_1(X),\ f_2(y') = f_2(Y),\ (x', y') \in A_\epsilon^{(n)}\}$$
20 Use the union of events bound:
$$P_e^{(n)} = \Pr(E_0 \cup E_1 \cup E_2 \cup E_{12}) \leq \Pr(E_0) + \Pr(E_1) + \Pr(E_2) + \Pr(E_{12})$$
21 Error analysis continued
$\Pr(E_0) \to 0$ by the AEP
$$\Pr(E_1) = \Pr\{\exists x' \neq x : f_1(x') = f_1(x),\ (x', y) \in A_\epsilon^{(n)}\} \leq \sum_{(x,y)} p(x, y) \sum_{\substack{x' \neq x \\ (x', y) \in A_\epsilon^{(n)}}} \Pr(f_1(x') = f_1(x)) \leq 2^{-nR_1}\, |A_\epsilon^{(n)}(X|y)| \leq 2^{-nR_1}\, 2^{n(H(X|Y)+2\epsilon)} \to 0 \text{ if } R_1 > H(X|Y)$$
22 Similarly,
$$\Pr(E_2) \leq 2^{-nR_2}\, 2^{n(H(Y|X)+2\epsilon)} \to 0 \text{ if } R_2 > H(Y|X)$$
$$\Pr(E_{12}) \leq 2^{-n(R_1+R_2)}\, 2^{n(H(X,Y)+\epsilon)} \to 0 \text{ if } R_1 + R_2 > H(X, Y)$$
23 Error analysis continued
Thus, if we are in the Slepian-Wolf rate region, for sufficiently large $n$, $\Pr(E_0), \Pr(E_1), \Pr(E_2), \Pr(E_{12}) < \epsilon$, so $P_e^{(n)} < 4\epsilon$.
Since the average probability of error is less than $4\epsilon$, there exists at least one code $(f_1^*, f_2^*, g^*)$ with probability of error $< 4\epsilon$. Thus, there exists a sequence of codes with $P_e^{(n)} \to 0$.
24 Model for distributed network compression
- arbitrary directed graph with integer capacity links
- discrete memoryless source processes with integer bit rates
- randomized linear network coding over vectors of bits in $\mathbb{F}_2$
- coefficients of the overall combination transmitted to the receivers
- receivers perform minimum entropy or maximum a posteriori probability decoding
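The coding and inversion mechanics in this model can be sketched in a few lines (a toy illustration, not the lecture's construction: assumptions are a two-source, one-receiver network whose overall transfer matrix collapses to a single random $2 \times 2$ matrix over $\mathbb{F}_2$, and the compression aspect is not modeled). The receiver uses the transmitted coefficients to invert the overall combination:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8                                      # bits per source vector

src = rng.integers(0, 2, size=(2, n))      # rows: the two source bit-vectors

# Randomized linear network coding: the network delivers F2-linear combinations
# of the sources; A is the overall coefficient matrix accumulated along the
# paths. For the demo we redraw until A is invertible (a uniformly random 2x2
# matrix over F2 is invertible with probability 3/8).
while True:
    A = rng.integers(0, 2, size=(2, 2))
    if (A[0, 0] * A[1, 1] + A[0, 1] * A[1, 0]) % 2 == 1:  # det(A) odd over F2
        break
received = A @ src % 2

# The coefficients of the overall combination travel with the data, so the
# receiver can invert A over F2 (for 2x2, the inverse is the adjugate mod 2).
A_inv = np.array([[A[1, 1], A[0, 1]],
                  [A[1, 0], A[0, 0]]])
decoded = A_inv @ received % 2
assert np.array_equal(decoded, src)        # exact recovery from the combinations
```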
25 Distributed compression problem
Consider two sources of bit rates $r_1, r_2$, whose output values in each unit time period are drawn i.i.d. from the same joint distribution $Q$; linear network coding is done in $\mathbb{F}_2$ over vectors of $nr_1$ and $nr_2$ bits from each source respectively.
Define:
26
- $m_1$ and $m_2$: the minimum cut capacities between the receiver and each source respectively
- $m_3$: the minimum cut capacity between the receiver and both sources
- $L$: the maximum source-receiver path length
27 Theorem 1
The error probability at each receiver using minimum entropy or maximum a posteriori probability decoding is at most $\sum_{i=1}^{3} p_e^i$, where
$$p_e^1 \leq \exp\left\{-n \min_{X_1, X_2}\left(D(P_{X_1 X_2} \| Q) + \left|m_1\left(1 - \tfrac{\log L}{n}\right) - H(X_1 | X_2)\right|^+\right) + 2^{2r_1 + r_2} \log(n+1)\right\}$$
$$p_e^2 \leq \exp\left\{-n \min_{X_1, X_2}\left(D(P_{X_1 X_2} \| Q) + \left|m_2\left(1 - \tfrac{\log L}{n}\right) - H(X_2 | X_1)\right|^+\right) + 2^{r_1 + 2r_2} \log(n+1)\right\}$$
$$p_e^3 \leq \exp\left\{-n \min_{X_1, X_2}\left(D(P_{X_1 X_2} \| Q) + \left|m_3\left(1 - \tfrac{\log L}{n}\right) - H(X_1 X_2)\right|^+\right) + 2^{2r_1 + 2r_2} \log(n+1)\right\}$$
28 Distributed compression
- Redundancy is removed or added in different parts of the network depending on available capacity
- Achieved without knowledge of source entropy rates at interior network nodes
- For the special case of a Slepian-Wolf source network consisting of a link from each source to the receiver, the network coding error exponents reduce to the known error exponents for linear Slepian-Wolf coding [Csi82]
29 Proof outline
Error probability $\leq \sum_{i=1}^{3} p_e^i$, where
- $p_e^1$ is the probability of correctly decoding $X_2$ but not $X_1$
- $p_e^2$ is the probability of correctly decoding $X_1$ but not $X_2$
- $p_e^3$ is the probability of wrongly decoding both $X_1$ and $X_2$
Proof approach uses the method of types, similar to that in [Csi82]. Types $P_{\mathbf{x}_i}$ and joint types $P_{\mathbf{x}\mathbf{y}}$ are the empirical distributions of elements in the vectors $\mathbf{x}_i$.
30 Proof outline (cont'd)
Bound the error probabilities by summing over the sets of joint types
$$\mathcal{P}_n^i = \begin{cases} \{P_{X_1 \tilde{X}_1 X_2 \tilde{X}_2} : \tilde{X}_1 \neq X_1,\ \tilde{X}_2 = X_2\} & i = 1 \\ \{P_{X_1 \tilde{X}_1 X_2 \tilde{X}_2} : \tilde{X}_1 = X_1,\ \tilde{X}_2 \neq X_2\} & i = 2 \\ \{P_{X_1 \tilde{X}_1 X_2 \tilde{X}_2} : \tilde{X}_1 \neq X_1,\ \tilde{X}_2 \neq X_2\} & i = 3 \end{cases}$$
where $X_i, \tilde{X}_i \in \mathbb{F}_2^{n r_i}$
31 and the sequences of each type
$$T_{X_1 X_2} = \left\{ [\mathbf{x}\ \mathbf{y}] \in \mathbb{F}_2^{n(r_1 + r_2)} : P_{\mathbf{x}\mathbf{y}} = P_{X_1 X_2} \right\}$$
$$T_{\tilde{X}_1 \tilde{X}_2}(\mathbf{x}\mathbf{y}) = \left\{ [\tilde{\mathbf{x}}\ \tilde{\mathbf{y}}] \in \mathbb{F}_2^{n(r_1 + r_2)} : P_{\tilde{\mathbf{x}}\tilde{\mathbf{y}}\mathbf{x}\mathbf{y}} = P_{\tilde{X}_1 \tilde{X}_2 X_1 X_2} \right\}$$
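The method-of-types machinery works because the number of distinct types grows only polynomially in $n$, while each type class is exponentially large. A quick check of the counting fact used in the cardinality bounds below (the alphabet sizes and $n$ are illustrative; for $\mathcal{P}_n^1$ the effective alphabet size is $2^{2r_1 + r_2}$):

```python
from math import comb

# Number of types (empirical distributions) of n symbols from an alphabet of
# size k: compositions of n into k nonnegative parts = C(n+k-1, k-1) <= (n+1)^k.
for k, n in [(2, 4), (4, 6), (8, 5)]:
    exact = comb(n + k - 1, k - 1)
    print(f"k={k} n={n}: {exact} types <= {(n + 1) ** k} = (n+1)^k")
```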
32 Proof outline (cont'd)
Define
- $P_i$, $i = 1, 2$: the probability that distinct $(\mathbf{x}, \mathbf{y})$ and $(\tilde{\mathbf{x}}, \tilde{\mathbf{y}})$, where $\tilde{\mathbf{x}} \neq \mathbf{x}$ (for $i = 1$) or $\tilde{\mathbf{y}} \neq \mathbf{y}$ (for $i = 2$), are mapped to the same output at the receiver
- $P_3$: the probability that $(\mathbf{x}, \mathbf{y})$ and $(\tilde{\mathbf{x}}, \tilde{\mathbf{y}})$, where $\tilde{\mathbf{x}} \neq \mathbf{x}$ and $\tilde{\mathbf{y}} \neq \mathbf{y}$, are mapped to the same output at the receiver
These probabilities can be calculated for a given network, or bounded in terms of the block length $n$ and network parameters.
33 Proof outline (cont'd)
A link with at least one nonzero incoming signal carries the zero signal with probability $2^{-nc}$, where $c$ is the link capacity; this is equal to the probability that a pair of distinct input values are mapped to the same output on the link.
We can show by induction on the minimum cut capacities $m_i$ that
$$P_i \leq \left(1 - \left(1 - \frac{1}{2^n}\right)^L\right)^{m_i} \leq \left(\frac{L}{2^n}\right)^{m_i}$$
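The $2^{-nc}$ figure can be checked empirically. A sketch (the link capacity $c = 1$, block length $n = 4$, and input width are assumed values) applying a uniformly random $\mathbb{F}_2$-linear map to a fixed nonzero input difference: the output is zero exactly when the difference lies in the map's kernel.

```python
import numpy as np

rng = np.random.default_rng(2)
n, c, k_in = 4, 1, 8            # block length, link capacity, incoming bits

v = rng.integers(0, 2, k_in)
v[0] = 1                        # a fixed nonzero incoming difference

trials, zeros = 100_000, 0
for _ in range(trials):
    M = rng.integers(0, 2, size=(n * c, k_in))  # random F2 map onto the link's nc bits
    if not np.any(M @ v % 2):                   # difference maps to the zero signal
        zeros += 1
print(f"empirical {zeros / trials:.4f}  vs  2^-nc = {2 ** (-n * c):.4f}")
```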
34 Proof outline (cont'd)
We substitute in the cardinality bounds
$$|\mathcal{P}_n^1| < (n+1)^{2^{2r_1 + r_2}}, \qquad |\mathcal{P}_n^2| < (n+1)^{2^{r_1 + 2r_2}}, \qquad |\mathcal{P}_n^3| < (n+1)^{2^{2r_1 + 2r_2}}$$
$$|T_{X_1 X_2}| \leq \exp\{n H(X_1 X_2)\}, \qquad |T_{\tilde{X}_1 \tilde{X}_2}(\mathbf{x}\mathbf{y})| \leq \exp\{n H(\tilde{X}_1 \tilde{X}_2 | X_1 X_2)\}$$
and the probability of a source vector of type $(\mathbf{x}, \mathbf{y}) \in T_{X_1 X_2}$:
$$Q^n(\mathbf{x}\mathbf{y}) = \exp\{-n(D(P_{X_1 X_2} \| Q) + H(X_1 X_2))\}$$
35 Proof outline (cont'd)
and the decoding conditions
- minimum entropy decoder: $H(\tilde{X}_1 \tilde{X}_2) \leq H(X_1 X_2)$
- maximum a posteriori probability decoder: $D(P_{\tilde{X}_1 \tilde{X}_2} \| Q) + H(\tilde{X}_1 \tilde{X}_2) \leq D(P_{X_1 X_2} \| Q) + H(X_1 X_2)$
to obtain the result.
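For concreteness, a sketch of the minimum entropy decoding rule itself (the candidate list and binary alphabet are assumptions; a real decoder would enumerate the source pairs consistent with the received network output): it selects the pair whose empirical joint type has the smallest entropy, preferring the most correlated explanation.

```python
import numpy as np
from collections import Counter

def empirical_joint_entropy(x, y):
    """Entropy in bits of the joint type (empirical distribution) of (x, y)."""
    n = len(x)
    return -sum((c / n) * np.log2(c / n) for c in Counter(zip(x, y)).values())

def min_entropy_decode(candidates):
    """Among source pairs consistent with the receiver's observations,
    return the one minimizing the empirical joint entropy."""
    return min(candidates, key=lambda xy: empirical_joint_entropy(*xy))

# Toy usage: a near-copy of x beats an unrelated-looking candidate.
x = (0, 0, 1, 1, 0, 1, 0, 1)
cands = [(x, (0, 0, 1, 1, 0, 1, 0, 0)),   # one bit flipped: low joint entropy
         (x, (1, 0, 0, 1, 1, 0, 1, 0))]   # scrambled: higher joint entropy
print(min_entropy_decode(cands)[1])       # -> the near-copy
```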
36 Conclusions
- Distributed randomized network coding can achieve distributed compression of correlated sources
- Error exponents generalize results for linear Slepian-Wolf coding
- Further work: investigation of non-uniform code distributions, other types of codes, and other decoding schemes
37 MIT OpenCourseWare, Information Theory, Spring 2010. For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms