Information Dimension
Transcription
1 Information Dimension. Mina Karzand, Massachusetts Institute of Technology. November 16.
3 Let $X$ be a real-valued random variable. For $m \in \mathbb{N}$, the $m$-point uniformly quantized version of $X$ is
$$\langle X \rangle_m = \frac{\lfloor mX \rfloor}{m}$$
Thus $\langle X \rangle_m \in \mathbb{Z}/m$.
Lower information dimension:
$$\underline{d}(X) = \liminf_{m \to \infty} \frac{H(\langle X \rangle_m)}{\log m}$$
Upper information dimension:
$$\overline{d}(X) = \limsup_{m \to \infty} \frac{H(\langle X \rangle_m)}{\log m}$$
4 If $\underline{d}(X) = \overline{d}(X)$, then
$$d(X) = \lim_{m \to \infty} \frac{H(\langle X \rangle_m)}{\log m}$$
Entropy of dimension $d(X)$:
$$\hat{H}(X) = \lim_{m \to \infty} \left[ H(\langle X \rangle_m) - d(X) \log m \right]$$
5 If $H(\lfloor X^n \rfloor) < \infty$, then $0 \le \underline{d}(X^n) \le \overline{d}(X^n) \le n$.
If $E[\log(1 + |X|)] < \infty$, then $\overline{d}(X) < \infty$.
It is sufficient to restrict to the exponential subsequence $m = 2^l$. Defining $[X]_l \triangleq \langle X \rangle_{2^l}$,
$$d(X) = \lim_{l \to \infty} \frac{H([X]_l)}{l}$$
with entropy measured in bits. A numerical sketch of this dyadic estimator follows below.
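A minimal numerical sketch of the definition (not from the talk; the function names and the plug-in entropy estimator are illustrative assumptions): quantize samples of $X$ at resolution $2^{-l}$, estimate the entropy of the cell indices, and divide by $l$.

```python
import numpy as np

def entropy_bits(samples):
    """Plug-in estimate (in bits) of the entropy of a discrete sample."""
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def info_dim_estimate(x, l=12):
    """Crude estimate of d(X) = lim_l H([X]_l) / l along m = 2^l."""
    cells = np.floor(x * 2.0**l)       # cell index of the quantization <X>_{2^l}
    return entropy_bits(cells) / l

rng = np.random.default_rng(0)
n = 1_000_000
print(info_dim_estimate(rng.uniform(size=n)))        # absolutely continuous: ~1
print(info_dim_estimate(rng.integers(0, 8, n) / 8))  # discrete: H stays at 3 bits,
                                                     # so the ratio ~ 3/l -> 0
```

The raw ratio converges only at rate $O(1/l)$; some of the later snippets therefore use the slope of $H([X]_l)$ in $l$, which cancels the constant term $\hat{H}(X)$.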
9 Translation invariance: $\forall x^n \in \mathbb{R}^n$, $d(x^n + X^n) = d(X^n)$.
Scale invariance: $\forall \alpha \neq 0$, $d(\alpha X^n) = d(X^n)$.
If $X^n$ and $Y^n$ are independent,
$$\max\{d(X^n), d(Y^n)\} \le d(X^n + Y^n) \le d(X^n) + d(Y^n)$$
If the $\{X_i\}$ are independent and $d(X_i)$ exists for all $i$,
$$d(X^n) = \sum_{i=1}^{n} d(X_i)$$
If $X^n$, $Y^n$ and $Z^n$ are independent, then
$$d(X^n + Y^n + Z^n) + d(Z^n) \le d(X^n + Z^n) + d(Y^n + Z^n)$$
These properties can be checked numerically, as in the snippet below.
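Continuing the sketch above (same caveats, and reusing `info_dim_estimate`, `rng`, and `n`), the invariance and sum properties can be eyeballed numerically:

```python
x = rng.uniform(size=n)            # d(X) = 1
y = rng.integers(0, 8, n) / 8      # d(Y) = 0
print(info_dim_estimate(x + 0.3))  # translation invariance: ~1
print(info_dim_estimate(np.pi * x))  # scale invariance: ~1, up to a
                                     # finite-l offset of log2(pi)/l
print(info_dim_estimate(x + y))      # bounds force d(X+Y) = 1 here
print(info_dim_estimate(y + rng.integers(0, 8, n) / 8))
# discrete + discrete stays discrete: bounded entropy, ratio -> 0 as l grows
```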
14 A probability distribution can be uniquely represented as the mixture
$$\nu = p\,\nu_d + q\,\nu_c + r\,\nu_s, \qquad p + q + r = 1$$
$\nu_d$: purely atomic probability measure (discrete part)
$\nu_c$: absolutely continuous probability measure
$\nu_s$: probability measure singular with respect to Lebesgue measure
15 Theorem: Let $X$ be a random variable such that $H(\lfloor X \rfloor) < \infty$, whose distribution can be represented as
$$\nu = (1 - \rho)\,\nu_d + \rho\,\nu_c$$
Then $d(X) = \rho$ and
$$\hat{H}(X) = (1 - \rho) H(\nu_d) + \rho\, h(\nu_c) + h_b(\rho)$$
where $H(\nu_d)$ is the entropy of the discrete part, $h(\nu_c)$ the differential entropy of the continuous part, and $h_b$ the binary entropy function.
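A numerical illustration of the theorem (continuing the sketch above; the slope of $H([X]_l)$ in $l$ is used instead of the raw ratio because it cancels the $O(1)$ term $\hat{H}(X)$):

```python
rho = 0.3
mask = rng.random(n) < rho
# With prob. rho draw from Unif[0,1], else from a 4-atom discrete part:
x = np.where(mask, rng.uniform(size=n), rng.integers(0, 4, n) / 4)
h_lo = entropy_bits(np.floor(x * 2.0**8))
h_hi = entropy_bits(np.floor(x * 2.0**14))
print((h_hi - h_lo) / (14 - 8))   # slope ~ d(X) = rho = 0.3
```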
16 Rényi entropy of order $\alpha$ of a discrete random variable $Y$:
$$H_\alpha(Y) = \begin{cases} \sum_y p_y \log \frac{1}{p_y}, & \alpha = 1 \\ \log \frac{1}{\max_y p_y}, & \alpha = \infty \\ \frac{1}{1-\alpha} \log \left( \sum_y p_y^\alpha \right), & \alpha \neq 1, \infty \end{cases}$$
$$\underline{d}_\alpha(X) = \liminf_{m \to \infty} \frac{H_\alpha(\langle X \rangle_m)}{\log m}, \qquad \overline{d}_\alpha(X) = \limsup_{m \to \infty} \frac{H_\alpha(\langle X \rangle_m)}{\log m}$$
$$\hat{H}_\alpha(X) = \lim_{m \to \infty} \left[ H_\alpha(\langle X \rangle_m) - d_\alpha(X) \log m \right]$$
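The same plug-in idea applies for orders $\alpha \neq 1$ (an illustrative sketch, continuing the code above). With any atom present, $H_2(\langle X \rangle_m)$ stays bounded, so $d_2(X) = 0$:

```python
def renyi_entropy_bits(samples, alpha):
    """Plug-in Rényi entropy (bits) of order alpha != 1 for a discrete sample."""
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return float(np.log2(np.sum(p**alpha)) / (1 - alpha))

# Atom of mass p = 0.1 at 1/2, mixed with a uniform component:
x = np.where(rng.random(n) < 0.1, 0.5, rng.uniform(size=n))
for l in (6, 10, 14):
    print(l, renyi_entropy_bits(np.floor(x * 2.0**l), 2.0) / l)
# H_2 saturates near -2*log2(0.1) ~ 6.6 bits (the value H_alpha(v_d) +
# (alpha/(1-alpha)) log p for a single atom), so H_2 / l -> 0.
```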
19 Theorem: Let $X$ be a real random variable satisfying $H_\alpha(\lfloor X \rfloor) < \infty$, with distribution represented as
$$\nu = p\,\nu_d + q\,\nu_c + r\,\nu_s$$
Then:
For $\alpha > 1$: if $p > 0$ ($X$ has a discrete component), then $d_\alpha(X) = 0$ and
$$\hat{H}_\alpha(X) = H_\alpha(\nu_d) + \frac{\alpha}{1 - \alpha} \log p$$
For $\alpha < 1$: if $q > 0$ ($X$ has an absolutely continuous part), then $d_\alpha(X) = 1$ and
$$\hat{H}_\alpha(X) = h_\alpha(\nu_c) + \frac{\alpha}{1 - \alpha} \log q$$
21 The dyadic expansion of $X \in [0, 1)$ can be written as
$$X = \sum_{j=1}^{\infty} (X)_j\, 2^{-j}$$
There is a one-to-one correspondence between $X$ and the binary random process $\{(X)_j,\ j \in \mathbb{N}\}$, and (with entropies in bits)
$$\underline{d}(X) = \liminf_{i \to \infty} \frac{H((X)_1, (X)_2, \ldots, (X)_i)}{i}, \qquad \overline{d}(X) = \limsup_{i \to \infty} \frac{H((X)_1, (X)_2, \ldots, (X)_i)}{i}$$
Random variables whose lower and upper information dimensions differ can therefore be constructed from processes with different lower and upper entropy rates.
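An example of this correspondence (a sketch under the same assumptions as the earlier snippets, reusing `entropy_bits` and `rng`): if the binary digits are i.i.d. Bernoulli($p$), the entropy rate is $h_b(p)$, so $d(X) = h_b(p)$.

```python
p, depth, nb = 0.2, 30, 200_000
bits = (rng.random((nb, depth)) < p).astype(float)   # i.i.d. Bernoulli(p) digits
x = bits @ (0.5 ** np.arange(1, depth + 1))          # X = sum_j (X)_j 2^{-j}
hb = -p * np.log2(p) - (1 - p) * np.log2(1 - p)      # ~0.722 bits for p = 0.2
h_lo = entropy_bits(np.floor(x * 2.0**8))
h_hi = entropy_bits(np.floor(x * 2.0**14))
print((h_hi - h_lo) / 6, "vs", hb)                   # slope ~ h_b(p)
```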
22 Cantor Distribution
$$C_0 = [0, 1]$$
$$C_1 = [0, 1/3] \cup [2/3, 1]$$
$$C_2 = [0, 1/9] \cup [2/9, 1/3] \cup [2/3, 7/9] \cup [8/9, 1]$$
$$C_3 = \cdots$$
The support of the Cantor distribution is the Cantor set $\bigcap_{i=1}^{\infty} C_i$; its information dimension is $\log 2 / \log 3 \approx 0.631$.
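A sampling sketch (same caveats as above): drawing each ternary digit uniformly from $\{0, 2\}$ produces the Cantor distribution, and the entropy slope recovers $\log 2 / \log 3$.

```python
depth, nb = 25, 200_000
digits = 2.0 * (rng.random((nb, depth)) < 0.5)     # ternary digits in {0, 2}
x = digits @ (3.0 ** -np.arange(1, depth + 1))     # Cantor-distributed samples
h_lo = entropy_bits(np.floor(x * 2.0**8))
h_hi = entropy_bits(np.floor(x * 2.0**14))
print((h_hi - h_lo) / 6, "vs", np.log(2) / np.log(3))   # ~0.631
```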
24 Degrees of freedom of the interference channel.
Channel model: $K$-user real-valued memoryless Gaussian interference channel with a fixed deterministic channel matrix $H = [h_{ij}]$ (known at encoders and decoders), where at each symbol epoch the $i$-th user transmits $X_i$ and the $i$-th decoder receives
$$Y_i = \sqrt{\mathsf{snr}} \sum_{j=1}^{K} h_{ij} X_j + N_i$$
where $\{X_i, N_i\}_{i=1}^{K}$ are independent with $E[X_i^2] \le 1$ and $N_i \sim \mathcal{N}(0, 1)$.
25 Sum-rate capacity:
$$C(H, \mathsf{snr}) \triangleq \max \left\{ \sum_{i=1}^{K} R_i : R^K \in \mathcal{C}(H, \mathsf{snr}) \right\}$$
Degrees of freedom (multiplexing gain):
$$\mathsf{DoF}(H) = \lim_{\mathsf{snr} \to \infty} \frac{C(H, \mathsf{snr})}{\frac{1}{2} \log \mathsf{snr}}$$
26 Theorem: Let $X$ be independent of $N$, a standard normal random variable, and denote
$$I(X, \mathsf{snr}) = I(X; \sqrt{\mathsf{snr}}\, X + N)$$
Then
$$\lim_{\mathsf{snr} \to \infty} \frac{I(X, \mathsf{snr})}{\frac{1}{2} \log \mathsf{snr}} = d(X)$$
The mutual information is therefore maximized asymptotically by any absolutely continuous input distribution, for which $d(X) = 1$.
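A closed-form sanity check (not from the talk): for a standard Gaussian input, $I(X, \mathsf{snr}) = \frac{1}{2}\log(1 + \mathsf{snr})$, so the ratio tends to $1 = d(X)$, consistent with the theorem for absolutely continuous inputs.

```python
import numpy as np

for snr in (1e2, 1e4, 1e8):
    ratio = 0.5 * np.log(1 + snr) / (0.5 * np.log(snr))
    print(snr, ratio)   # -> 1 as snr grows
```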
27 Information dimension under projection.
Almost every projection preserves the dimension, but computing the dimension of an individual projection is in general difficult.
Theorem: Let $A \in \mathbb{R}^{m \times n}$ with $m \le n$. Then for any $X^n$,
$$d(AX^n) \le \min\{d(X^n), \mathrm{rank}(A)\}$$
Theorem: Let $\alpha \in (1, 2]$ and $m \le n$. Then for almost every $A \in \mathbb{R}^{m \times n}$,
$$d_\alpha(AX^n) = \min\{d_\alpha(X^n), m\}$$
28 Theorem: Let
$$\mathsf{dof}(X^K, H) \triangleq \sum_{i=1}^{K} \left[ d\Big(\sum_{j=1}^{K} h_{ij} X_j\Big) - d\Big(\sum_{j \neq i} h_{ij} X_j\Big) \right]$$
Then
$$\mathsf{DoF}(H) = \sup_{X^K} \mathsf{dof}(X^K, H)$$
where the supremum is over independent $X_1, X_2, \ldots, X_K$ such that $H(\lfloor X_i \rfloor) \le C$ for some fixed $C > 0$.
The result also applies to non-Gaussian noise, as long as the noise has finite non-Gaussianness: $D(N \,\|\, N_G) < \infty$.
30 $$\mathsf{dof}(X^K, H) = \sum_{i=1}^{K} \Bigg[ \underbrace{d\Big(\sum_{j=1}^{K} h_{ij} X_j\Big)}_{\text{info. dim. of the } i\text{-th user}} - \underbrace{d\Big(\sum_{j \neq i} h_{ij} X_j\Big)}_{\text{info. dim. of the interference}} \Bigg]$$
31 Let $X_i^n = [X_{i,1}, X_{i,2}, \ldots, X_{i,n}]$ denote the $i$-th user's input. Then
$$C(H, \mathsf{snr}) = \lim_{n \to \infty} \frac{1}{n} \sup_{X_1^n, \ldots, X_K^n} \sum_{i=1}^{K} I(X_i^n; Y_i^n)$$
where the sup is over independent $X_1^n, \ldots, X_K^n$, and
$$I(X_i^n; Y_i^n) = I(X_1^n, \ldots, X_K^n; Y_i^n) - I(X_1^n, \ldots, X_K^n; Y_i^n \mid X_i^n) = I\Big(\sum_{j=1}^{K} h_{ij} X_j^n, \mathsf{snr}\Big) - I\Big(\sum_{j \neq i} h_{ij} X_j^n, \mathsf{snr}\Big)$$
32 $$\mathsf{DoF}(H) = \lim_{\mathsf{snr} \to \infty} \lim_{n \to \infty} \sup_{X_1^n, \ldots, X_K^n} \frac{1}{n \cdot \frac{1}{2} \log \mathsf{snr}} \sum_{i=1}^{K} \left[ I\Big(\sum_{j=1}^{K} h_{ij} X_j^n, \mathsf{snr}\Big) - I\Big(\sum_{j \neq i} h_{ij} X_j^n, \mathsf{snr}\Big) \right]$$
$$= \lim_{n \to \infty} \sup_{X_1^n, \ldots, X_K^n} \lim_{\mathsf{snr} \to \infty} \frac{1}{n \cdot \frac{1}{2} \log \mathsf{snr}} \sum_{i=1}^{K} \left[ I\Big(\sum_{j=1}^{K} h_{ij} X_j^n, \mathsf{snr}\Big) - I\Big(\sum_{j \neq i} h_{ij} X_j^n, \mathsf{snr}\Big) \right]$$
using
$$I(\cdot, \mathsf{snr}) = \frac{d(\cdot)}{2} \log \mathsf{snr} + o(\log \mathsf{snr})$$
34 $$\mathsf{DoF}(H) = \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{K} \left[ d\Big(\sum_{j=1}^{K} h_{ij} X_j^n\Big) - d\Big(\sum_{j \neq i} h_{ij} X_j^n\Big) \right]$$
SINGLE-LETTERIZATION AND EXAMPLES
35 Two-user IC:
$$\mathsf{DoF}\left(\begin{bmatrix} a & b \\ c & d \end{bmatrix}\right) = \sup_{X_1 \perp X_2} \left[ d(aX_1 + bX_2) + d(cX_1 + dX_2) - d(bX_2) - d(cX_1) \right] = \begin{cases} 0, & a = d = 0 \\ 2, & a \neq 0,\ d \neq 0,\ b = c = 0 \\ 1, & \text{otherwise} \end{cases}$$
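The case analysis translates directly into code (a hypothetical helper mirroring the piecewise formula above, not an implementation from the talk):

```python
def dof_two_user(a, b, c, d):
    """Piecewise DoF of the 2x2 deterministic IC [[a, b], [c, d]]."""
    if a == 0 and d == 0:
        return 0   # both direct links absent
    if a != 0 and d != 0 and b == 0 and c == 0:
        return 2   # two interference-free point-to-point links
    return 1       # any cross link, or one missing direct link

print(dof_two_user(1, 0, 0, 1))      # 2
print(dof_two_user(1, 0.5, 0.5, 1))  # 1
```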
36 Many-to-one IC:
$$\mathsf{DoF}\left(\begin{bmatrix} h_{11} & h_{12} & h_{13} & \cdots & h_{1K} \\ 0 & h_{22} & & & \\ \vdots & & \ddots & & \\ 0 & & & & h_{KK} \end{bmatrix}\right) = K - 1$$
Achieved by choosing $X_1$ discrete and the rest absolutely continuous.
37 One-to-many IC:
$$\mathsf{DoF}\left(\begin{bmatrix} h_{11} & 0 & \cdots & 0 \\ h_{21} & h_{22} & & \\ \vdots & & \ddots & \\ h_{K1} & 0 & \cdots & h_{KK} \end{bmatrix}\right) = K - 1$$
Achieved by choosing $X_1$ discrete and the rest absolutely continuous.
38 MAC:
$$\mathsf{DoF}\left(\begin{bmatrix} 1 & \cdots & 1 \\ \vdots & & \vdots \\ 1 & \cdots & 1 \end{bmatrix}\right) = 1$$
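These examples can be reproduced from the $\mathsf{dof}(X^K, H)$ formula with a small evaluator (a sketch under stated assumptions: each input is either discrete with finite entropy, so $d = 0$, or absolutely continuous, so $d = 1$, and a sum of independent components has dimension 1 exactly when it includes a continuous component with a nonzero coefficient):

```python
def dof_value(H, kinds):
    """Evaluate dof(X^K, H) when kinds[j] is 'd' (discrete, d=0) or
    'c' (absolutely continuous, d=1)."""
    K = len(kinds)
    def dim(active):  # info. dim. of sum over j in active of h_ij * X_j
        return 1.0 if any(kinds[j] == 'c' for j in active) else 0.0
    total = 0.0
    for i in range(K):
        full = [j for j in range(K) if H[i][j] != 0]
        total += dim(full) - dim([j for j in full if j != i])
    return total

K = 4
many_to_one = [[1] * K] + [[int(i == j) for j in range(K)] for i in range(1, K)]
one_to_many = [[int(j == 0 or i == j) for j in range(K)] for i in range(K)]
mac = [[1] * K for _ in range(K)]
print(dof_value(many_to_one, ['d'] + ['c'] * (K - 1)))  # K - 1 = 3
print(dof_value(one_to_many, ['d'] + ['c'] * (K - 1)))  # K - 1 = 3
print(dof_value(mac, ['c'] + ['d'] * (K - 1)))          # 1
```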
39 Information Dimension and Rate-Distortion Theory
For a scalar source and MSE distortion, whenever $d(X)$ exists and is finite, as $D \to 0$,
$$R_X(D) = \frac{d(X)}{2} \log \frac{1}{D} + o(\log D)$$
If $X$ is discrete and $H(X) < \infty$: $R_X(D) = H(X) + o(1)$
If $X$ is continuous and $h(X) > -\infty$: $R_X(D) = \frac{1}{2} \log \frac{1}{2\pi e D} + h(X) + o(1)$
If $X$ is a discrete-continuous mixture with weight $\rho$ on the continuous part: $R_X(D) = \frac{\rho}{2} \log \frac{1}{D} + \hat{H}(X) + o(1)$
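A consistency check on the continuous case (not from the talk): for a Gaussian source the expansion is exact at every $D$, since $h(X) = \frac{1}{2}\log(2\pi e \sigma^2)$ makes $\frac{1}{2}\log\frac{1}{2\pi e D} + h(X) = \frac{1}{2}\log\frac{\sigma^2}{D}$, the exact Gaussian rate-distortion function.

```python
import numpy as np

s2 = 4.0   # source variance
for D in (0.1, 0.01, 0.001):
    exact = 0.5 * np.log(s2 / D)   # exact Gaussian R(D), nats
    expansion = 0.5 * np.log(1 / (2 * np.pi * np.e * D)) \
        + 0.5 * np.log(2 * np.pi * np.e * s2)
    print(D, exact, expansion)     # identical for all D
```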