ECE 534 Information Theory - Midterm 2

ECE 534 Information Theory - Midterm 2. Nov. 4, 2009, 3:30-4:45 in LH03. You will be given the full class time: 75 minutes. Use it wisely! Many of the problems have short answers; try to find shortcuts. You may bring and use two 8.5x11 double-sided crib sheets. No other notes or books are permitted. No calculators are permitted. Talking, passing notes, copying (and all other forms of cheating) are forbidden. Make sure you explain your answers in a way that illustrates your understanding of the problem. Ideas are important, not just the calculation. Partial marks will be given. Write all answers directly on this exam.

Your name:
Your UIN:
Your signature:

The exam has 4 questions, for a total of 65 points.

Question:  1    2    3    4    Total
Points:    18   17   12   18   65
Score:

1. A sum channel. Let X = Y = {A, B, C, D} be the input and output alphabets of a discrete memoryless channel with transition probability matrix p(y|x), for 0 ≤ ε, δ ≤ 1, given by

p(y|x) = [ 1−ε   ε     0     0
            ε   1−ε    0     0
            0    0    1−δ    δ
            0    0     δ    1−δ ]

Notice that this channel with 4 inputs and outputs looks like the sum or union of two parallel sub-channels with transition probability matrices

p1(y|x) = [ 1−ε   ε        p2(y|x) = [ 1−δ   δ
             ε   1−ε ],                 δ   1−δ ],

with alphabets X1 = Y1 = {A, B} and X2 = Y2 = {C, D} respectively.

(a) (2 points) Draw the transition probability diagram of this channel.

Solution: [Diagram: inputs A, B, C, D on the left, outputs A, B, C, D on the right; A and B are connected with probabilities 1−ε (direct) and ε (crossed), and C and D with probabilities 1−δ (direct) and δ (crossed).]

(b) (3 points) Find the capacity of this channel if ε = δ = 1/2.

Solution: If ε = δ = 1/2 we have a symmetric channel, whose capacity we know is achieved by a uniform input distribution and equals

C = log |Y| − H(a row of the transition probability matrix) = log2(4) − H(1/2, 1/2, 0, 0) = 2 − 1 = 1 (bit per channel use).

(c) (5 points) Let p(x) be the probability mass function on X and let p(A) + p(B) = α, p(C) + p(D) = 1 − α. Show that the mutual information between the input X and the output Y may be expressed as

I(X; Y) = H(α) + α I(X; Y | X ∈ {A, B}) + (1 − α) I(X; Y | X ∈ {C, D}).
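As a quick numerical cross-check of part (b) (not part of the original exam), the minimal Python sketch below, assuming NumPy is available, computes I(X;Y) for the 4x4 transition matrix with ε = δ = 1/2 under a uniform input; it returns 1 bit per channel use, matching the symmetric-channel formula.

    import numpy as np

    def mutual_information(p_x, P):
        # I(X;Y) in bits for input pmf p_x and channel transition matrix P (row x: p(y|x))
        p_xy = p_x[:, None] * P                 # joint distribution p(x, y)
        p_y = p_xy.sum(axis=0)                  # output marginal p(y)
        prod = p_x[:, None] * p_y[None, :]      # product of marginals
        mask = p_xy > 0
        return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask])))

    eps = delta = 0.5
    P = np.array([[1 - eps, eps, 0, 0],
                  [eps, 1 - eps, 0, 0],
                  [0, 0, 1 - delta, delta],
                  [0, 0, delta, 1 - delta]])
    print(mutual_information(np.ones(4) / 4, P))   # -> 1.0 bit per channel use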

Solution: Let θ be a random variable with the following probability mass function:

θ = 1 if X ∈ {A, B}, with p(θ = 1) = α;   θ = 0 if X ∈ {C, D}, with p(θ = 0) = 1 − α.

Since the two sub-channels have disjoint output alphabets, θ is also determined by Y, so I(X; Y, θ) = I(X; Y). We can then express the mutual information between X and Y as

I(X; Y) = I(X; Y, θ)
        = I(X; θ) + I(X; Y | θ)
        = H(θ) − H(θ | X) + I(X; Y | θ)
        = H(α) − 0 + p(θ = 1) I(X; Y | θ = 1) + p(θ = 0) I(X; Y | θ = 0)
        = H(α) + α I(X; Y | X ∈ {A, B}) + (1 − α) I(X; Y | X ∈ {C, D}).

(d) (2 points) Let C1 and C2 be the capacities of the sub-channels described by p1(y|x) and p2(y|x). Argue why

max_{p(x)} I(X; Y) = max_α [H(α) + α C1 + (1 − α) C2].

Solution: Notice that

C1 = max_{p(x): x ∈ {A,B}} I(X; Y),   C2 = max_{p(x): x ∈ {C,D}} I(X; Y).

Then since x ∈ {A, B} implies y ∈ {A, B} and x ∈ {C, D} implies y ∈ {C, D}, we see that

max_{p(x) on {A,B,C,D}} I(X; Y)
  = max_{p(x), 0 ≤ α ≤ 1} [H(α) + α I(X; Y | X ∈ {A, B}) + (1 − α) I(X; Y | X ∈ {C, D})]
  = max_{0 ≤ α ≤ 1} [H(α) + α max_{p(x): x ∈ {A,B}} I(X; Y | X ∈ {A, B}) + (1 − α) max_{p(x): x ∈ {C,D}} I(X; Y | X ∈ {C, D})]
  = max_{0 ≤ α ≤ 1} [H(α) + α C1 + (1 − α) C2].

(e) (6 points) Find the capacity C of the sum channel in terms of the capacities C1 and C2 of the sub-channels and NO other parameters.

Solution: Part (d) makes obtaining the capacity significantly easier, since we know the capacities of the two binary symmetric sub-channels are C1 = 1 − H(ε) and C2 = 1 − H(δ). Finding the capacity of the sum channel then amounts to a 1-D optimization over α, which may be carried out by setting the derivative of f(α) := H(α) + α C1 + (1 − α) C2 to zero and solving for α:

df(α)/dα = H'(α) + C1 − C2 = log2((1 − α)/α) + C1 − C2 = 0
  ⇒ (1 − α)/α = 2^(C2 − C1)
  ⇒ α = 2^C1 / (2^C1 + 2^C2),   1 − α = 2^C2 / (2^C1 + 2^C2).

Substituting this optimal value of α back into f(α), we obtain, after simplification,

C = log2(2^C1 + 2^C2).
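As a sanity check on part (e) (not part of the original exam), the Python sketch below, assuming illustrative crossover probabilities ε = 0.1 and δ = 0.25, maximizes f(α) = H(α) + αC1 + (1−α)C2 on a grid and compares the result with the closed form log2(2^C1 + 2^C2).

    import numpy as np

    def h2(a):
        # binary entropy in bits
        return -a * np.log2(a) - (1 - a) * np.log2(1 - a)

    eps, delta = 0.1, 0.25                      # illustrative crossover probabilities (not from the exam)
    C1, C2 = 1 - h2(eps), 1 - h2(delta)         # capacities of the two BSC sub-channels
    alpha = np.linspace(1e-6, 1 - 1e-6, 100001)
    f = h2(alpha) + alpha * C1 + (1 - alpha) * C2
    print(f.max())                              # numerical maximum over alpha
    print(np.log2(2**C1 + 2**C2))               # closed form C = log2(2^C1 + 2^C2); should agree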

2. Gaussian channels with interference. Consider a channel with 2 independent transmitters and a single two-antenna receiver: user 1 transmits X1, which is independent of the signal X2 transmitted by user 2. The signals of the two users are received at antennas 1 and 2 as Y1 and Y2 respectively:

Y1 = X1 + 2X2 + Z1,   Y2 = X1 + Z2,

where Z1 and Z2 are i.i.d. N(0, σ²) additive white Gaussian noise. The signals of the two users X1 and X2 are independent and distributed as X1 ~ N(0, P1), X2 ~ N(0, P2). Our goal will be to determine how much X1 can reliably communicate with the 2-antenna receiver while treating X2 as noise / interference - which depends on how the receiver processes the signals from the two receive antennas.

Solution: All these questions rely on being able to calculate the different covariance matrices - all of which will be denoted by K with the appropriate subscripts - and on the fact that for a Gaussian random vector X with covariance matrix K_X,

h(X) = (1/2) log2((2πe)^n |K_X|).

(a) (5 points) Compute I(X1; Y1, Y2).

Solution: Here we need K_{Y1,Y2} and K_{Y1,Y2|X1}, which can be obtained as

K_{Y1,Y2} = [ E[Y1²]     E[Y1 Y2]
              E[Y2 Y1]   E[Y2²]   ]
          = [ E[(X1 + 2X2 + Z1)²]            E[(X1 + 2X2 + Z1)(X1 + Z2)]
              E[(X1 + Z2)(X1 + 2X2 + Z1)]    E[(X1 + Z2)²]              ]
          = [ P1 + 4P2 + σ²    P1
              P1               P1 + σ² ]

K_{Y1,Y2|X1} = [ E[(2X2 + Z1)²]    E[(2X2 + Z1) Z2]
                 E[Z2 (2X2 + Z1)]  E[Z2²]          ]
             = [ 4P2 + σ²    0
                 0           σ² ]

Then

I(X1; Y1, Y2) = h(Y1, Y2) − h(Y1, Y2 | X1)
             = (1/2) log2((2πe)² |K_{Y1,Y2}|) − (1/2) log2((2πe)² |K_{Y1,Y2|X1}|)
             = (1/2) log2( ((P1 + 4P2 + σ²)(P1 + σ²) − P1²) / ((4P2 + σ²) σ²) )
             = (1/2) log2( 1 + (P1/σ²) · (4P2 + 2σ²)/(4P2 + σ²) ).
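The short Python sketch below (not part of the original exam), assuming illustrative values P1 = 4, P2 = 2, σ² = 1, checks that the determinant-ratio form and the simplified closed form of I(X1; Y1, Y2) agree.

    import numpy as np

    P1, P2, s2 = 4.0, 2.0, 1.0                      # illustrative values (not from the exam); s2 = sigma^2
    K_Y = np.array([[P1 + 4*P2 + s2, P1],
                    [P1,             P1 + s2]])     # Cov(Y1, Y2) for Y1 = X1 + 2*X2 + Z1, Y2 = X1 + Z2
    K_Yc = np.array([[4*P2 + s2, 0.0],
                     [0.0,       s2]])              # Cov(Y1, Y2 | X1)
    I_det = 0.5 * np.log2(np.linalg.det(K_Y) / np.linalg.det(K_Yc))
    I_closed = 0.5 * np.log2(1 + (P1/s2) * (4*P2 + 2*s2) / (4*P2 + s2))
    print(I_det, I_closed)                          # the two expressions should agree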

(b) (5 points) The receiver now decides to process the received signals Y1 and Y2 and see if/how it affects the optimal communication rate with X1. Let Yb = Y1 + Y2 (the receiver sums the received signals). Compute I(X1; Yb).

Solution: Here we need K_{Y1+Y2} and K_{Y1+Y2|X1}, which can be obtained as

K_{Y1+Y2} = E[(Y1 + Y2)²] = E[(2X1 + 2X2 + Z1 + Z2)²] = 4P1 + 4P2 + 2σ²
K_{Y1+Y2|X1} = E[(2X2 + Z1 + Z2)²] = 4P2 + 2σ²

Then

I(X1; Y1 + Y2) = h(Y1 + Y2) − h(Y1 + Y2 | X1)
             = (1/2) log2(2πe K_{Y1+Y2}) − (1/2) log2(2πe K_{Y1+Y2|X1})
             = (1/2) log2( (4P1 + 4P2 + 2σ²) / (4P2 + 2σ²) )
             = (1/2) log2( 1 + (P1/σ²) · 2σ²/(2P2 + σ²) ).

(c) (4 points) Is Yb a sufficient statistic for decoding X1?

Solution: Yb would be a sufficient statistic if I(X1; Y1, Y2) = I(X1; Yb) (and we know by the data processing inequality that I(X1; Yb) ≤ I(X1; Y1, Y2) - sufficient statistics lose no information!). However, it is not a sufficient statistic, since comparing the factors multiplying P1/σ² in parts (a) and (b),

(4P2 + 2σ²)/(4P2 + σ²) − 2σ²/(2P2 + σ²) = 8P2² / ((4P2 + σ²)(2P2 + σ²)) > 0 for all P2 > 0.

Hence I(X1; Yb) < I(X1; Y1, Y2) strictly, and Yb is NOT a sufficient statistic.

(d) (2 points) The receiver now decides to try and decode X1 using only Y2, ignoring Y1. Compute I(X1; Y2).

Solution:

I(X1; Y2) = h(Y2) − h(Y2 | X1)
          = (1/2) log2(2πe K_{Y2}) − (1/2) log2(2πe K_{Y2|X1})
          = (1/2) log2( (P1 + σ²) / σ² )
          = (1/2) log2( 1 + P1/σ² ).
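The Python sketch below (not part of the original exam), assuming the same illustrative values as above, evaluates the three rates from parts (a), (b) and (d) and illustrates the ordering discussed in part (c).

    import numpy as np

    P1, P2, s2 = 4.0, 2.0, 1.0                                          # illustrative values; s2 = sigma^2
    I_both = 0.5 * np.log2(1 + (P1/s2) * (4*P2 + 2*s2) / (4*P2 + s2))   # part (a): I(X1; Y1, Y2)
    I_sum  = 0.5 * np.log2(1 + 2*P1 / (2*P2 + s2))                      # part (b): I(X1; Y1 + Y2)
    I_y2   = 0.5 * np.log2(1 + P1/s2)                                   # part (d): I(X1; Y2)
    print(I_both, I_sum, I_y2)
    # I_sum < I_both whenever P2 > 0, so Y1 + Y2 is not a sufficient statistic;
    # here P2 > s2/2, so I_y2 > I_sum, anticipating part (e)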

(e) (1 point) Find an example of powers P1 and P2 for which I(X1; Y2) > I(X1; Yb).

Solution: For the rate in part (d) to be larger than that in part (b) we need

2σ²/(2P2 + σ²) < 1, i.e., P2 > σ²/2.

So, for example, any P1 > 0 together with P2 = σ² works.

3. True or false (T/F) and short answer.

(a) (2 points) Find the 4-ary Huffman code (D = 4) for the source with probability mass function (8/36, 7/36, 6/36, 5/36, 4/36, 3/36, 2/36, 1/36).

Solution: For this one, the key is to realize that the optimal Huffman code will have 2 dummy (0 probability) extra symbols, so that the number of symbols is congruent to 1 mod (D − 1) = 3. The resulting Huffman tree merges the four smallest weights 2, 1, 0, 0 into a node of weight 3; then 5, 4, 3 and the new 3 into a node of weight 15; and finally 15, 8, 7, 6 into the root of weight 36. One resulting codeword assignment is:

probability (×1/36):  8    7    6    5     4     3     2      1
codeword:             (1)  (2)  (3)  (00)  (01)  (02)  (030)  (031)

(b) (2 points) Describe the meaning and use of the rate-distortion function in two sentences - I'm looking for meaning and intuition rather than formulas.

Solution: Each rate-distortion pair (R, D) on the rate-distortion function R(D) describes the minimal achievable rate (number of bits per source symbol) needed to represent the source under consideration to within an expected distortion of D. The R(D) function is useful in lossy (non-perfect) compression of sources.

(c) (2 points) Compare the capacities C1 and C2 of the channels where, for the first channel, Y1 = (X1 mod 10) for X1 ∈ {1, 2, ..., 100}, and for the second channel, Y2 = (X2 mod 9) for X2 ∈ {1, 2, ..., 90}.

Solution: By symmetry, uniform inputs achieve uniform outputs, and so the capacity C1 = log |Y1| = log(10), which is greater than the capacity C2 = log |Y2| = log(9).

(d) (1 point) T/F: perfect feedback can increase the capacity of a discrete channel with memory.

Solution: True.

(e) (2 points) T/F: the differential entropy h(X1) of a continuous random variable X1 which is uniform on [2a, 2b] is twice the differential entropy h(X2) of a continuous random variable X2 which is uniform on [a, b].

Solution: False. h(X1) = log(2b − 2a) = log(2(b − a)) = 1 + log(b − a), while h(X2) = log(b − a). This statement is only true if (b − a) = 2, but not in general.

(f) (3 points) Outline, in about 3-4 sentences, the main points/techniques used in the achievability proof of the channel coding theorem.

Solution: The main proof techniques are random coding, joint typicality decoding, and bounding the probability of error using what we know about how likely it is for two independently chosen sequences to be jointly typical. The achievability proof follows these main lines:

- Generate a random codebook, which consists of 2^(nR) sequences of length n, where each element is generated i.i.d. according to p(x). Encoding of message w takes place by looking up the w-th sequence in the codebook and sending it.
- The receiver decodes using a joint typicality decoder, whereby it declares that ŵ was sent if there exists one and only one ŵ such that x^n(ŵ) and the received y^n are jointly typical. Otherwise it declares an error.
- Analyze the probability of error in this scheme - we need to show that the maximal probability of error decays to 0 as the blocklength n tends to infinity. This follows from arguing that an independently generated pair x^n, y^n has probability about 2^(−n(I(X;Y) − ε)) of being jointly typical - and there are at most 2^(nR) independently generated typical x^n which are not the correct one.
- The last part of the proof involves showing that if the *average* probability of error, over all randomly chosen codebooks, decays to 0, then there exists a codebook whose maximal probability of error also decays to zero.
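Relating to part 3(a), the Python sketch below (not part of the original exam) implements D-ary Huffman coding by padding with dummy zero-probability symbols until the symbol count is congruent to 1 mod (D − 1) and then repeatedly merging the D least probable nodes; for the pmf above it reproduces the codeword lengths 1, 1, 1, 2, 2, 2, 3, 3.

    import heapq
    from itertools import count

    def dary_huffman(probs, D=4):
        # D-ary Huffman codewords (digit strings) for the pmf `probs`,
        # padding with dummy zero-probability symbols so the count is 1 mod (D - 1)
        n = len(probs)
        pad = (1 - n) % (D - 1)
        tie = count()                                       # unique tie-breaker for the heap
        heap = [(p, next(tie), {i: ''}) for i, p in enumerate(probs)]
        heap += [(0.0, next(tie), {}) for _ in range(pad)]  # dummies carry no real symbols
        heapq.heapify(heap)
        while len(heap) > 1:
            merged, group = 0.0, {}
            for digit in range(D):                          # merge the D least probable nodes
                p, _, codes = heapq.heappop(heap)
                merged += p
                for sym, suffix in codes.items():
                    group[sym] = str(digit) + suffix
            heapq.heappush(heap, (merged, next(tie), group))
        return [heap[0][2][i] for i in range(n)]

    print(dary_huffman([8/36, 7/36, 6/36, 5/36, 4/36, 3/36, 2/36, 1/36], D=4))
    # expect codeword lengths 1, 1, 1, 2, 2, 2, 3, 3 (the exam's labels 1, 2, 3, 00, 01, 02,
    # 030, 031, up to relabeling of the digits at each level)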

4. Capacity of several simple channels. Find the capacity of the following channels AND the input distribution which achieves this capacity.

(a) (4 points) Consider 2 parallel Gaussian channels Y1 = X1 + Z1 and Y2 = X2 + Z2, where Z1 and Z2 are independent, zero-mean additive white Gaussian noise, Z1 ~ N(0, σ1² = 3) and Z2 ~ N(0, σ2² = 7), subject to a total power constraint of P = 10.

Solution: This is the classical, simplest waterfilling solution. Draw it out and see that the water level is 10, so we will fill 7 units of power into channel 1 and 3 units of power into channel 2, thereby obtaining a capacity of

C = (1/2) log2(1 + 7/3) + (1/2) log2(1 + 3/7),

which is achieved by an input distribution X1 ~ N(0, 7), X2 ~ N(0, 3), with X1 and X2 independent.

[Figure: waterfilling diagram with noise variances 3 (channel 1) and 7 (channel 2) and water level 10.]

(b) (6 points) Again consider 2 parallel Gaussian channels, but now you're constrained to send the same signal X on both channels, i.e. Y1 = X + Z1 and Y2 = X + Z2, where Z1 and Z2 are independent, zero-mean additive white Gaussian noise, Z1 ~ N(0, σ1² = 3) and Z2 ~ N(0, σ2² = 7), subject to a total power constraint of P = 10.

Solution: You could proceed with waterfilling, but it's easier to go directly from the capacity: due to our power constraint and the fact that we are forced to send the same signal on both channels, we know that P1 = P2 = P/2 = 5. Then, keeping things symbolic as long as possible for illustration, and noting that we are inputting Gaussian random variables (so the outputs are Gaussian as well),

K_{Y1,Y2} = [ P/2 + σ1²   P/2
              P/2         P/2 + σ2² ],   K_{Y1,Y2|X} = [ σ1²   0
                                                          0    σ2² ].

So,

C = max_{p(x)} I(X; Y1, Y2) = h(Y1, Y2) − h(Y1, Y2 | X)
  = (1/2) log2( ((P/2 + σ1²)(P/2 + σ2²) − P²/4) / (σ1² σ2²) )
  = (1/2) log2( 1 + (P/2)(σ1² + σ2²) / (σ1² σ2²) )
  = (1/2) log2( 1 + 5·(3 + 7)/(3·7) ),

achieved by X ~ N(0, 5).
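The Python sketch below (not part of the original exam) is a minimal waterfilling routine, assuming a simple bisection on the water level; for noise variances (3, 7) and P = 10 it recovers the power split (7, 3) of part (a), and the last line evaluates the part (b) capacity with P/2 = 5 on each branch.

    import numpy as np

    def waterfill(noise_vars, P, iters=100):
        # bisect on the water level nu so that sum(max(nu - N_i, 0)) = P
        noise = np.asarray(noise_vars, dtype=float)
        lo, hi = noise.min(), noise.max() + P
        for _ in range(iters):
            nu = 0.5 * (lo + hi)
            if np.maximum(nu - noise, 0).sum() > P:
                hi = nu
            else:
                lo = nu
        powers = np.maximum(nu - noise, 0)
        return powers, 0.5 * np.sum(np.log2(1 + powers / noise))

    powers, C_a = waterfill([3.0, 7.0], P=10.0)
    print(powers, C_a)   # expect powers ~ [7, 3]; C_a = 0.5*log2(1 + 7/3) + 0.5*log2(1 + 3/7)
    print(0.5 * np.log2(1 + 5 * (3 + 7) / (3 * 7)))   # part (b): same signal on both channels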

(c) (4 points) Cascade of two binary symmetric channels, each with crossover probability p, without encoding between stages:

[Diagram: X → BSC(p) → BSC(p) → Y]

Solution: A cascade of 2 binary symmetric channels is again a binary symmetric channel, with a new crossover probability p2 = p(1 − p) + (1 − p)p = 2p(1 − p). The optimal input distribution is then uniform, p(x) = {1/2, 1/2}, and the capacity is C = 1 − H(2p(1 − p)).

(d) (4 points) Cascade of two binary symmetric channels, each with crossover probability p, with encoding between stages:

[Diagram: X → BSC(p) → ENCODING → BSC(p) → Y]

Solution: When we can re-encode the symbols after the first BSC, the capacity becomes C = min(C1, C2), where C1 and C2 are the capacities of the first and second BSCs, respectively. So we see that, by symmetry, C = 1 − H(p), achieved for p(x) = {1/2, 1/2}, uniform.
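For parts (c) and (d), the short Python sketch below (not part of the original exam), assuming an illustrative crossover probability p = 0.1, evaluates the effective crossover of the cascade and both capacities; the no-encoding capacity is always the smaller of the two.

    import numpy as np

    def h2(a):
        # binary entropy in bits
        return -a * np.log2(a) - (1 - a) * np.log2(1 - a)

    p = 0.1                               # illustrative crossover probability (not from the exam)
    p_cascade = 2 * p * (1 - p)           # part (c): effective crossover of two BSC(p) in cascade
    print(p_cascade, 1 - h2(p_cascade))   # capacity without re-encoding: 1 - H(2p(1-p))
    print(1 - h2(p))                      # part (d): with re-encoding, C = min(C1, C2) = 1 - H(p)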