Advanced Topics in Information Theory


Lecture Notes

Stefan M. Moser

© Copyright Stefan M. Moser

Signal and Information Processing Laboratory, ETH Zürich, Zurich, Switzerland
Institute of Communications Engineering, National Chiao Tung University (NCTU), Hsinchu, Taiwan

You are welcome to use these lecture notes for yourself, for teaching, or for any other noncommercial purpose. If you use extracts from these lecture notes, please make sure to show their origin. The author assumes no liability or responsibility for any errors or omissions.

3rd Edition, Version 3.3. Compiled on 12 December [year not recovered]. For the latest version see [URL not recovered].

[Figure: three events E_1, E_2, E_3 and the atoms B_1, ..., B_7 they generate]

[Figure: a graph on the vertices v_1, ..., v_6]

[Figures: sketches of probability distributions (p, p_a, p_b) and of curves y versus x]

[Figure: a convex set F and a distribution Q in the probability simplex P(X)]

[Figures: the Pythagorean picture for I-projections: the angle at the projection of Q onto the convex set F is larger than 90 degrees]

[Figure: regions I, II, III in the simplex, with distributions Q_1, Q_2 and the decision sets M and M^c]

[Figures: a distribution Q and sets A and F in the simplex P(X)]

[Figure: random variables V, U, W]
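The ">90 degrees" annotations in these figures refer to the Pythagorean inequality for I-projections, a standard fact recalled here for orientation (not a formula recovered from the figures): if Q* minimizes D(P||Q) over all P in a convex set F, then

\[
D(P \| Q) \;\geq\; D(P \| Q^*) + D(Q^* \| Q) \qquad \text{for all } P \in F,
\]

the divergence analogue of an obtuse angle at the projection point.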

[Figure: error exponents E_G(R) and E_B(R) versus the rate R, with R_crit, R_0, and C marked on the rate axis]

[Figure: Gallager's function: the straight lines E_0(ρ,Q) − ρR for ρ = 1, 1/2, 1/3, 1/4, 1/8 and their upper envelope max_{0 ≤ ρ ≤ 1} {E_0(ρ,Q) − ρR}]

[Figures: sphere-packing exponent E_SP(R), Gallager exponent E_G(R), and the exponent E(R) versus R, with R_crit and C marked]

[Figure: the Gallager exponent E_G(R) with R_crit, R_0, and C marked]

[Figure: capacity–cost function C_I(E_s) versus the cost E_s; at the marked point a discontinuity is theoretically possible (a line above C_I(E_s))]
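To make the error-exponent figures concrete, the following minimal sketch (not code from the notes; the channel and rates are arbitrary illustrative choices) evaluates Gallager's function E_0(ρ,Q) and the random-coding exponent E_G(R) = max_{0≤ρ≤1} {E_0(ρ,Q) − ρR} for a BSC with uniform input:

```python
import numpy as np

def gallager_E0(rho, Q, W):
    """Gallager's function E_0(rho,Q) = -ln sum_y (sum_x Q(x) W(y|x)^(1/(1+rho)))^(1+rho)."""
    inner = (Q[:, None] * W ** (1.0 / (1.0 + rho))).sum(axis=0)
    return -np.log((inner ** (1.0 + rho)).sum())

p = 0.1                              # illustrative BSC crossover probability
W = np.array([[1 - p, p],            # W[x, y] = W(y | x)
              [p, 1 - p]])
Q = np.array([0.5, 0.5])             # uniform input distribution

rhos = np.linspace(0.0, 1.0, 1001)
for R in [0.1, 0.3, 0.5]:            # rates in nats
    EG = max(gallager_E0(rho, Q, W) - rho * R for rho in rhos)
    print(f"R = {R:.1f} nats: E_G(R) = {EG:.4f}")
```

The maximization over ρ is exactly what the E_0(ρ,Q) figure depicts: each ρ contributes a straight line of slope −ρ in R, and E_G(R) is their upper envelope.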

[Figure: channel with feedback: Uniform Source M → Encoder φ_k → X_k → DMC Q_{Y|X} → Y_k → Decoder ψ_n → M̂ at the destination; a feedback map θ_k produces F_{k+1} from the channel output, which reaches the encoder as F_k through a delay]

[Figure: binary erasure channel: X ∈ {0,1}, Y ∈ {0, ?, 1}; each input is erased with probability δ and received unchanged with probability 1 − δ]
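For the erasure diagram above, recall the textbook closed form (stated for orientation, not derived here):

\[
C_{\mathrm{BEC}} \;=\; \max_{Q_X} I(X;Y) \;=\; 1 - \delta \quad \text{bits per channel use},
\]

achieved by a uniform input; being a DMC, its capacity is not increased by feedback.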

[Figure: independent description (49 points) versus dependent description (45 points)]

[Figure: reconstruction points x̂_1, x̂_2 and a parameter θ]

[Figure: binary backward test channel for a binary source with Hamming distortion: X̂ → X with crossover probability D]

[Figure: the information rate–distortion function R_I(D); at the marked point a discontinuity is theoretically possible (a line below R_I(D))]

[Figure: an information rate–distortion function with a discontinuity: R_I(D) versus R_I(D+ε) at distortions D and D+ε]

[Figure: the rate–distortion plane: the point (E[d(X,X̂)], I(X;X̂)) and a line of slope λ through it]

[Figures: R_0(q,λ) as a supporting line of R(·), touching where λ E_q[d(X,X̂)] = λD and I_q(X;X̂) = R(D); the portions of the distortion axis achievable by a given q]

[Figure: R(D) with slope discontinuities: two different tangents with slopes λ_1 and λ_2 at the same point, giving R_0(q,λ_1) and R_0(q,λ_2)]

[Figure: a rate–distortion function R(D)]
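For the binary figures above, the rate–distortion function of a memoryless binary source with bias p under Hamming distortion has the well-known closed form R(D) = h_b(p) − h_b(D) for 0 ≤ D ≤ min{p, 1 − p}, and R(D) = 0 beyond. A minimal sketch (the choice p = 0.5, i.e. a BSS, is illustrative):

```python
import numpy as np

def h_b(x):
    """Binary entropy in bits, with h_b(0) = h_b(1) = 0."""
    x = np.clip(x, 1e-15, 1 - 1e-15)
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

def rate_distortion_binary(D, p=0.5):
    """R(D) = h_b(p) - h_b(D) for 0 <= D <= min(p, 1-p), else 0 (Hamming distortion)."""
    D = np.asarray(D, dtype=float)
    return np.where(D < min(p, 1 - p), h_b(p) - h_b(D), 0.0)

for D in [0.0, 0.05, 0.1, 0.25]:
    print(f"D = {D:.2f}: R(D) = {float(rate_distortion_binary(D)):.4f} bits")
```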

[Figure: transmission of a DMS over a DMC: DMS → U_1,...,U_K → Encoder → X_1,...,X_n → DMC → Y_1,...,Y_n → Decoder → Û_1,...,Û_K → Destination]

[Figure: the same with a lossy compressor in front: Binary DMS → U_1,...,U_K → Lossy Compressor → V_1,...,V_K → Encoder → DMC → Decoder → V̂_1,...,V̂_K → Destination]

[Figure: full separation: Binary DMS → RD Encoder → W → Channel Encoder → X_1,...,X_n → DMC → Y_1,...,Y_n → Channel Decoder → Ŵ → RD Decoder → V_1,...,V_K → Destination]

[Figure: reverse water-filling for parallel Gaussian sources X_1,...,X_7: variances σ_1², ..., σ_7², water level λ, and resulting distortions D_1, ..., D_7]

[Figure: reconstruction x̂ versus source x]
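The water-filling picture corresponds to the standard reverse water-filling solution for independent Gaussian sources: each component gets distortion D_i = min{λ, σ_i²}, with the water level λ chosen so that the D_i sum to the target distortion. A minimal sketch with invented variances (not the ones in the figure):

```python
import numpy as np

def reverse_waterfill(variances, D_total, tol=1e-12):
    """Find lam with sum_i min(lam, sigma_i^2) = D_total, then
    R = sum_i 0.5 * log2(sigma_i^2 / D_i); terms with D_i = sigma_i^2 contribute 0."""
    var = np.asarray(variances, dtype=float)
    lo, hi = 0.0, var.max()
    while hi - lo > tol:                       # bisection on the water level
        lam = 0.5 * (lo + hi)
        if np.minimum(lam, var).sum() < D_total:
            lo = lam
        else:
            hi = lam
    D = np.minimum(lam, var)
    R = 0.5 * np.log2(var / D).clip(min=0.0).sum()
    return lam, D, R

lam, D, R = reverse_waterfill([1.0, 0.5, 2.0, 0.1], D_total=1.0)
print(f"water level = {lam:.4f}, distortions = {np.round(D, 4)}, rate = {R:.4f} bits")
```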

[Figure: in the simplex P(X): the sources Q̃ that do not work for the given R and D, because R(Q̃, D) > R; the exponent is inf D(Q̃ ‖ Q) over this set]

[Figure: multiple descriptions: source X_1,...,X_n ~ Q feeds two encoders φ_n^(1), φ_n^(2) producing W^(1), W^(2); a decoder ψ_n^(i) outputs X̂_1,...,X̂_n at the destination]

[Figures: conditional distributions Q_{X̂^(1),X̂^(2)|X}(·,·|0) and Q_{X̂^(1),X̂^(2)|X}(·,·|1) on {0,1}×{0,1} with their marginals Q_{X̂^(1)|X}, Q_{X̂^(2)|X}, and the unconditional Q_{X̂^(1),X̂^(2)}(·,·) with its marginals (numerical entries not recovered)]

[Figure: source coding with side information at the decoder: X_1,...,X_n from Q_{X,Y} → Encoder φ_n → W → Decoder ψ_n → X̂_1,...,X̂_n, with Y_1,...,Y_n available only at the decoder]

[Figure: random binning: codewords sorted into bins 1, 2, 3, ..., e^{nR}]

[Figure: the same setup for a BSS where X and Y are connected by a BSC with crossover probability p]

[Figure: Slepian–Wolf coding: X_1,...,X_n and Y_1,...,Y_n from Q_{X,Y}, separate encoders φ_n^(1), φ_n^(2) producing W^(1), W^(2), joint decoder ψ_n reconstructing X̂_1,...,X̂_n and Ŷ_1,...,Ŷ_n]

[Figure: the Slepian–Wolf rate region in the (R^(1), R^(2))-plane: R^(1) ≥ H(X|Y), R^(2) ≥ H(Y|X), R^(1) + R^(2) ≥ H(X,Y); separate compression and decompression needs H(X) + H(Y), joint encoding reaches H(X,Y)]

[Table: joint distribution Q_{X,Y}(·,·) of the weather (rain/sun) in Hsinchu (X) and Taichung (Y), with row and column totals (numerical entries not recovered)]
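Since the numerical entries of the weather table did not survive extraction, here is a hedged stand-in: a short script that takes an arbitrary joint pmf for (X, Y) (the matrix below is invented for illustration) and computes the quantities delimiting the Slepian–Wolf region of the preceding figure:

```python
import numpy as np

def H(p):
    """Entropy in bits of a pmf given as an array (zero entries allowed)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Invented joint pmf Q_{X,Y} (rows: X = rain/sun in Hsinchu, cols: Y = rain/sun in Taichung).
Q = np.array([[0.4, 0.1],
              [0.1, 0.4]])

HXY = H(Q)
HX, HY = H(Q.sum(axis=1)), H(Q.sum(axis=0))
print(f"H(X) = {HX:.3f}, H(Y) = {HY:.3f}, H(X,Y) = {HXY:.3f} bits")
# Corner points (H(X), H(Y|X)) and (H(X|Y), H(Y)) of the Slepian-Wolf region:
print(f"corners: ({HX:.3f}, {HXY - HX:.3f}) and ({HXY - HY:.3f}, {HY:.3f})")
```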

[Figure: two-user MAC: Uniform Sources 1 and 2 → M^(1), M^(2) → Encoders φ_n^(1), φ_n^(2) → X^(1), X^(2) → Channel Q_{Y|X^(1),X^(2)} → Y → Decoder ψ_n → M̂^(1), M̂^(2) at the destination]

[Figure: a binary-input MAC example with erasure-type transitions of parameters ε_1 and ε_2 and output alphabet {0, 1, 2, 3}]

[Figures: rate regions in the (R^(1), R^(2))-plane, with the single-user capacities C^(1) and C^(2), respectively the point (1, 1), marked]

[Figure: a further small example channel with output alphabet {0, 1, 2}]

[Figure: MAC rate-region pentagon with corner points A, B, C, D, E and bounds I_1, I_2, I_3]

[Figure: the pentagon with its bounds written out: R^(1) ≤ I(X^(1); Y | X^(2)), R^(2) ≤ I(X^(2); Y | X^(1)), and the points I(X^(1); Y) and I(X^(2); Y) marked on the axes]

[Figure: the convex hull of the union of two pentagons C_a and C_b]

[Figure: the pentagon C^[ϑ] for a fixed parameter ϑ]

[Figures: pentagons with an inactive constraint, for C_a and C_b and for C^[1 2]]

[Figure: pentagon corner coordinates: A = (I_1^[ϑ], 0), B = (I_1^[ϑ], I_3^[ϑ] − I_1^[ϑ]), C = (I_3^[ϑ] − I_2^[ϑ], I_2^[ϑ]), D = (0, I_2^[ϑ]), E = (0, 0)]

[Figure: the pentagon with corners A, B, C, D, E]

[Figure: capacity region of the two-user Gaussian MAC: R^(1) ≤ C(E^(1)/σ²), R^(2) ≤ C(E^(2)/σ²), R^(1) + R^(2) ≤ C((E^(1) + E^(2))/σ²); the corner points involve C(E^(1)/(E^(2) + σ²)) and C(E^(2)/(E^(1) + σ²))]

[Figure: the same region for equal energies E^(1) = E^(2) = E]

[Figure: TDMA in the Gaussian MAC region: the TDMA curve for α from 0 to 1 touches the sum-rate boundary at α = E^(1)/(E^(1) + E^(2))]
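A small numerical sketch of the Gaussian MAC figures (energies and noise variance are invented; C(x) = ½ log₂(1 + x)), including the TDMA curve with power reallocation, which touches the sum-rate boundary exactly at α = E^(1)/(E^(1) + E^(2)):

```python
import numpy as np

C = lambda snr: 0.5 * np.log2(1.0 + snr)    # Gaussian capacity function, in bits

E1, E2, sigma2 = 10.0, 5.0, 1.0             # invented energies and noise variance

# Pentagon bounds and one corner (successive decoding: user 2 decoded first).
print("R1 bound        :", C(E1 / sigma2))
print("R2 bound        :", C(E2 / sigma2))
print("sum-rate bound  :", C((E1 + E2) / sigma2))
print("corner (R1, R2) :", (C(E1 / sigma2), C(E2 / (E1 + sigma2))))

# TDMA with power reallocation: user 1 transmits a fraction alpha of the time
# with boosted power E1/alpha; user 2 uses the remaining fraction.
for alpha in [0.25, E1 / (E1 + E2), 0.75]:
    Rsum = alpha * C(E1 / (alpha * sigma2)) + (1 - alpha) * C(E2 / ((1 - alpha) * sigma2))
    print(f"alpha = {alpha:.3f}: TDMA sum rate = {Rsum:.4f} bits")
```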

[Figure: transmission of correlated sources over a MAC: (U_1^n, V_1^n) from Q_{U,V} → Encoders φ^(1), φ^(2) → X^(1), X^(2) → MAC Q_{Y|X^(1),X^(2)} → Y_1^n → Decoder ψ → (Û_1^n, V̂_1^n) at the destination]

[Figure: a separate-coding architecture for the same problem: Slepian–Wolf encoders 1 and 2 produce W^(1), W^(2), which MAC encoders 1 and 2 transmit; the MAC decoder recovers Ŵ^(1), Ŵ^(2) and a Slepian–Wolf decoder outputs Û, V̂]

[Figure: Venn diagram of U and V]

[Figure: channel with a random state: S ~ Q_S is known to the encoder φ; channel Q_{Y|X,S}; decoder ψ delivers M̂ to the destination]

[Figures: binary example channels from X to Y with transition parameters q and 1 − q]

[Figure: a BSC with crossover probability p]

[Figure: construction of the auxiliary variable U from the state S, with parameters λ and α]

[Figure: two-user broadcast channel with a common message: Uniform Sources 1, 0, 2 deliver M^(1), M^(0), M^(2) to one encoder φ; input X, channel Q_{Y^(1),Y^(2)|X}; decoder ψ^(1) outputs (M̂^(1), M̂^(0)) to destination 1 and decoder ψ^(2) outputs (M̂^(0), M̂^(2)) to destination 2]

[Figure: a physically degraded BC: X → Q_{Y^(1)|X} → Y^(1) → Q_{Y^(2)|Y^(1)} → Y^(2)]

[Figure: the degraded Gaussian BC: Y^(1) = X + Z^(1), and Y^(2) obtained from Y^(1) by adding further noise (auxiliary V)]

[Figure: superposition coding: cloud centers U, each surrounded by satellite codewords X]

[Figure: the superposition-coding region in the (R^(0), R^(1))-plane, with I(U; Y^(2)) and I(X; Y^(1) | U) marked and the sum-rate line R^(0) + R^(1) = I(U; Y^(2)) + I(X; Y^(1) | U); the vertical axis also shows I(X; Y^(2)) + I(X; Y^(1) | U)]

[Figure: Marton's region in the (R^(1), R^(2))-plane: corners A and B, individual bounds I(U^(1); Y^(1)) and I(U^(2); Y^(2)), and sum rate R^(1) + R^(2) = I(U^(1); Y^(1)) + I(U^(2); Y^(2)) − I(U^(1); U^(2))]

[Figure: a Marton-type encoder using Gelfand–Pinsker ideas: codewords U^(1) and U^(2) for M^(1) and M^(2), combined by a deterministic map f: U^(1) × U^(2) → X]

[Figure: rate region of the Gaussian BC: R^(0) + R^(2) up to ½ log(1 + E/σ_(2)²) and R^(1) up to ½ log(1 + E/σ_(1)²); the time-sharing line and the superposition boundary for α from 0 to 1]
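For the Gaussian BC figure, the standard superposition power split traces the region boundary: with receiver 1 the stronger one (σ_(1)² < σ_(2)²) and a fraction α of the energy E spent on its message, R^(1) = ½ log₂(1 + αE/σ_(1)²) and R^(2) = ½ log₂(1 + (1 − α)E/(αE + σ_(2)²)). A sketch with invented numbers:

```python
import numpy as np

E, sigma1, sigma2 = 10.0, 1.0, 4.0    # invented: total energy and noise variances

def boundary(alpha):
    """Superposition split: fraction alpha of the energy carries receiver 1's message;
    receiver 2 treats receiver 1's signal as noise, receiver 1 cancels receiver 2's."""
    R1 = 0.5 * np.log2(1.0 + alpha * E / sigma1)
    R2 = 0.5 * np.log2(1.0 + (1.0 - alpha) * E / (alpha * E + sigma2))
    return R1, R2

for alpha in [0.0, 0.25, 0.5, 0.75, 1.0]:
    R1, R2 = boundary(alpha)
    print(f"alpha = {alpha:.2f}: R1 = {R1:.4f}, R2 = {R2:.4f} bits")
```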

[Figure: MAC with a common message: Uniform Sources 1, 0, 2; encoder φ^(1) sees (M^(1), M^(0)) and encoder φ^(2) sees (M^(0), M^(2)); channel Q_{Y|X^(1),X^(2)}; the decoder ψ outputs M̂^(1), M̂^(0), M̂^(2)]

[Figure: the corresponding three-dimensional rate region in (R^(0), R^(1), R^(2))]

[Figure: a general discrete memoryless network (DMN) with five terminals; terminal i has channel input X^(i), output Y^(i), its own messages M^(i), and estimates M̂^(j)(i) of other terminals' messages]

[Figure: a broadcast channel Q_{Y^(1),Y^(2)|X^(3)} embedded in a three-terminal network with side links S_1, S_2, S_3]

[Figure: a MAC Q_{Y^(3)|X^(1),X^(2)} embedded in a three-terminal network with side links S_1, S_2, S_3]

[Figure: the relay channel: Terminal 1 (encoder, input X^(1), message M), Terminal 2 (relay, output Y^(2), input X^(2)), Terminal 3 (destination, output Y^(3), estimate M̂); channel Q_{Y^(2),Y^(3)|X^(1),X^(2)}]
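For the single-relay figure, the classical cut-set upper bound is worth recalling in this notation (a standard result stated for orientation, not a formula recovered from the figure):

\[
C \;\leq\; \max_{Q_{X^{(1)}, X^{(2)}}} \min\Bigl\{ I\bigl(X^{(1)}; Y^{(2)}, Y^{(3)} \,\big|\, X^{(2)}\bigr),\; I\bigl(X^{(1)}, X^{(2)}; Y^{(3)}\bigr) \Bigr\},
\]

where the first term is the broadcast cut around the source and the second the multiple-access cut into the destination.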

[Figure: a network with two relays (Terminals 2 and 3) between the source Terminal 1 and the destination Terminal 4; channel Q_{Y^(2),Y^(3),Y^(4)|X^(1),X^(2),X^(3)}]

[Figure: the two-user interference channel: Uniform Sources 1 and 2 → Encoders φ^(1), φ^(2) → X^(1), X^(2) → IC Q_{Y^(1),Y^(2)|X^(1),X^(2)} → Decoders ψ^(1), ψ^(2) → M̂^(1), M̂^(2) at destinations 1 and 2]

[Figure: a binary interference channel example with crossover parameters ε_1 and ε_2]

[Figures: interference-channel rate regions in the (R^(1), R^(2))-plane with the single-user capacities C^(1) and C^(2) marked]

[Figure: the interference channel drawn as four terminals: Terminals 1 and 2 transmit M^(1), M^(2); Terminals 3 and 4 decode M̂^(1), M̂^(2)]

[Figure: six panels of rate regions R^(2) versus R^(1) (in bits) for interference coefficients a_12 = 0.15, 0.35, 0.55, 0.85, 1.15, 2.15 (the corresponding a_21 values were not recovered)]

[Figure: a polyhedral rate region: R^(1) ≤ 3.26 bits, R^(2) ≤ 3.26 bits, R^(1) + R^(2) ≤ 4.19 bits, 2R^(1) + R^(2) ≤ 7.09 bits, R^(1) + 2R^(2) ≤ 7.09 bits]

[Figure: the symmetric degrees of freedom d_sym versus the interference parameter a, with the weak, medium, strong, and very strong interference regimes marked]
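The last figure is presumably the familiar "W-curve" of the symmetric Gaussian interference channel (Etkin–Tse–Wang). As a hedged reminder of its shape, and not a claim about the exact curve plotted in the notes, its piecewise-linear form is:

```python
def d_sym(alpha):
    """Symmetric generalized degrees of freedom of the two-user Gaussian IC
    (the 'W-curve') as a function of the interference exponent alpha."""
    if alpha <= 0.5:          # weak interference regime
        return 1.0 - alpha
    if alpha <= 2.0 / 3.0:    # moderately weak
        return alpha
    if alpha <= 1.0:          # medium interference
        return 1.0 - alpha / 2.0
    if alpha <= 2.0:          # strong interference
        return alpha / 2.0
    return 1.0                # very strong: interference can be decoded and removed

for a in [0.0, 0.5, 2.0 / 3.0, 1.0, 1.5, 2.0, 3.0]:
    print(f"alpha = {a:.3f}: d_sym = {d_sym(a):.3f}")
```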
