National University of Singapore
Department of Electrical & Computer Engineering

Examination for EE5139R Information Theory for Communication Systems
(Semester I, 2014/15)
November/December 2014

Time Allowed: 3.0 hours

INSTRUCTIONS FOR CANDIDATES:
- This paper contains FOUR (4) questions, printed on FIVE (5) pages.
- Answer all questions.
- Programmable calculators are NOT allowed.
- Electronic communicating devices MUST be turned off and inaccessible throughout the examination. They CANNOT be used as calculators, timers or clocks.
- You are allowed to bring SEVEN (7) HANDWRITTEN A4 size sheets. You may write on both sides of each sheet. No other material is allowed.

Problem 1

1(a) (7 points) State whether the following statement is TRUE or FALSE: There exists a discrete memoryless channel (DMC) with a binary (i.e., |X| = 2 symbols) input alphabet and a ternary (i.e., |Y| = 3 symbols) output alphabet such that its capacity is equal to C = 1.5 bits/channel use. Please justify your answer carefully.

1(b) (8 points) Prove a list-decoding version of Fano's inequality. In particular, show that if (W, Y) taking values in W × Y are jointly distributed as P_{W,Y}, and L(Y) is a subset of W of size l ≥ 1, the probability of error P_e := Pr(W ∉ L(Y)) can be lower bounded as

    P_e ≥ [H(W | L(Y)) − 1 − log l] / log(|W| / l).

Hint: Define an error random variable E which takes the value 1 when W ∉ L(Y) and 0 otherwise. Then expand H(W, E | L(Y)) in two different ways using the chain rule for joint entropy.

1(c) (3 points) For a DMC P_{Y|X}(y|x) over n channel uses, find an upper bound for (1/n) I(X^n; Y^n) in terms of the channel capacity C = max_{P_X} I(X; Y). State where you used the assumption of memorylessness in your derivations.

1(d) (7 points) A (2^{nR}, 2^{nL}, n)-list code for a DMC P_{Y|X}(y|x) with capacity C consists of an encoder that assigns a codeword x^n(w) to each message w ∈ W := {1, ..., 2^{nR}} and a decoder that, upon receiving y^n, tries to find a list of messages L(y^n) ⊆ W of size |L(y^n)| ≤ 2^{nL} that contains the transmitted message. An error occurs if the list does not contain the transmitted message W, i.e., P_e^{(n)} := Pr(W ∉ L(Y^n)). A rate-list exponent pair (R, L) is said to be achievable if there exists a sequence of (2^{nR}, 2^{nL}, n)-list codes with P_e^{(n)} → 0 as n → ∞. Show, using parts (b) and (c), that there exists an upper bound on R, called R^+, such that every sequence of (2^{nR}, 2^{nL}, n)-list codes with P_e^{(n)} → 0 (i.e., every achievable (R, L) pair) must satisfy R ≤ R^+. Find this upper bound R^+ in terms of C and L.
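As a sanity check on the kind of claim in 1(a), here is a minimal numerical sketch, assuming an arbitrary illustrative 2 × 3 channel matrix W (not one specified in the paper): it brute-forces I(X; Y) over binary input distributions, and the maximum never exceeds log2 |X| = 1 bit per channel use.

```python
import numpy as np

# Hypothetical binary-input, ternary-output channel matrix W[y|x] (rows sum to 1).
W = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.2, 0.7]])

def mutual_information(p0, W):
    """I(X;Y) in bits for input distribution (p0, 1 - p0) and channel matrix W."""
    px = np.array([p0, 1.0 - p0])
    pxy = px[:, None] * W                      # joint distribution p(x, y)
    py = pxy.sum(axis=0)                       # output marginal p(y)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = pxy * np.log2(pxy / (px[:, None] * py[None, :]))
    return np.nansum(terms)                    # 0 log 0 terms contribute zero

# Brute-force over input distributions: for a binary input alphabet the
# mutual information stays below log2 |X| = 1 bit per channel use.
grid = np.linspace(0.0, 1.0, 1001)
print(max(mutual_information(p, W) for p in grid))
```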

Problem 2

2(a) (10 points) Consider the following Gaussian channel, in which the transmitted signal X with E[X^2] = P is received by two antennas as

    Y_1 = X + Z_1,    Y_2 = X + Z_2,

where Z_1 and Z_2 are independent and E[Z_i^2] = σ_i^2 with σ_1^2 < σ_2^2. Moreover, the signals at the two antennas are combined as Y = αY_1 + (1 − α)Y_2 before decoding, where 0 ≤ α ≤ 1. Find the capacity of the channel for a given α. Please provide the units of capacity.

2(b) (10 points) The following parts are separate from part (a). Now assume that you have two parallel Gaussian channels with inputs X_1 and X_2 and noises Z_1 and Z_2, respectively. Assume that the noise powers are E[Z_1^2] = 0.3 and E[Z_2^2] = 0.6, while the total available power is P = E[X_1^2] + E[X_2^2] = 0.1. Find the optimal power allocation (waterfilling) and the corresponding capacity. Please provide the units of capacity. Hint: You may assume the waterfilling conditions derived in class (without proving them again). Consider separately the case where both P_1, P_2 > 0 and the case where one of the powers is 0.

2(c) (5 points) Now let S^n be a binary memoryless source (BMS) with entropy 0.15 bits per source symbol. Is it possible to transmit S^n over the channel in part (b)? If it is possible, please explain how to send the BMS over the Gaussian channel. If it is not possible, explain clearly why not. Hint: Even if you did not solve part (b), you can still do (c) by assuming that the capacity of the channel in part (b) is some value C > 0.
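For part 2(b), the waterfilling allocation can be sanity-checked numerically. Below is a small sketch of the generic waterfilling computation (a bisection on the water level, assuming the standard parallel-Gaussian-channel capacity formula; the function name is illustrative), applied to the noise powers and power budget stated in 2(b).

```python
import numpy as np

def waterfill(noise_powers, total_power, tol=1e-12):
    """Generic waterfilling for parallel Gaussian channels: bisect on the water
    level nu so that sum_i max(nu - N_i, 0) = P, then report the power
    allocation and C = sum_i 0.5 * log2(1 + P_i / N_i) in bits."""
    noise = np.asarray(noise_powers, dtype=float)
    lo, hi = noise.min(), noise.max() + total_power
    while hi - lo > tol:
        nu = 0.5 * (lo + hi)
        if np.maximum(nu - noise, 0.0).sum() > total_power:
            hi = nu                            # water level too high
        else:
            lo = nu                            # water level too low (or exact)
    powers = np.maximum(0.5 * (lo + hi) - noise, 0.0)
    capacity = 0.5 * np.log2(1.0 + powers / noise).sum()
    return powers, capacity

# Noise powers 0.3 and 0.6 with total power 0.1, as in part 2(b).
print(waterfill([0.3, 0.6], 0.1))
```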

Problem 3

In this problem we use minimum-distance decoding to establish achievability of the capacity of a binary symmetric channel (BSC) with crossover probability p < 1/2. Define the Hamming distance d(x^n, y^n) between two binary sequences x^n, y^n ∈ {0, 1}^n as the number of positions in which they differ, i.e.,

    d(x^n, y^n) = |{ i ∈ {1, 2, ..., n} : x_i ≠ y_i }| = Σ_{i=1}^{n} 1{x_i ≠ y_i} = Σ_{i=1}^{n} x_i ⊕ y_i.

Here x ⊕ y = 0 if x = y and x ⊕ y = 1 if x ≠ y. We fix a rate R and generate 2^{nR} codewords X^n(w), each drawn from the uniform distribution on {0, 1}^n, so each X_i(w) is uniform on {0, 1}, i.e., Pr(X_i(w) = 0) = Pr(X_i(w) = 1) = 1/2. The random codebook is C = {X^n(1), X^n(2), ..., X^n(2^{nR})}. For any codebook, the decoder chooses the message ŵ such that the Hamming distance of X^n(ŵ) to the channel output Y^n is minimized, i.e.,

    d(X^n(ŵ), Y^n) < d(X^n(w̃), Y^n)  for all w̃ ≠ ŵ.

Advice: Each part below is designed so that it may be done independently of earlier parts. So if you get stuck, move on.

3(a) (2 points) Assume that message W = 1 was sent. Argue that the probability of error averaged over the random code is

    P_e^{(n)} := Pr(A | W = 1),  where  A := ∪_{w=2}^{2^{nR}} { d(X^n(w), Y^n) ≤ d(X^n(1), Y^n) }.

Hint: How does an error occur under the minimum-distance decoding rule above?

3(b) (4 points) Recall the law of total probability Pr(A) = Pr(A, E) + Pr(A, E^c). Let

    E := { d(X^n(1), Y^n) > n(p + ε) },  for some 0 < ε < 1/2 − p.

Now show the upper bound

    P_e^{(n)} ≤ Pr( d(X^n(1), Y^n) > n(p + ε) | W = 1 ) + 2^{nR} Pr( d(X^n(2), Y^n) ≤ n(p + ε) | W = 1 ).    (1)

Hint: Use Bayes' rule and the fact that probabilities are no larger than one.

3(c) (2 points) Let Z_i = Y_i ⊕ X_i(1). What is the distribution of Z_i?

3(d) (4 points) Write the first term on the right-hand side of (1), namely Pr(d(X^n(1), Y^n) > n(p + ε) | W = 1), in terms of the Z_i random variables of part (c), and argue using Chebyshev's inequality that this first term tends to zero as n tends to infinity.

3(e) (2 points) Let U_i = Y_i ⊕ X_i(2). What is the distribution of U_i?

3(f) (8 points) Write the second probability in (1), namely Pr(d(X^n(2), Y^n) ≤ n(p + ε) | W = 1), in terms of the U_i random variables of part (e) and hence show, using Sanov's theorem, that

    Pr( d(X^n(2), Y^n) ≤ n(p + ε) | W = 1 ) ≤ (n + 1)^2 2^{−n D(p+ε ‖ 1/2)},

where D(p ‖ q) is the relative entropy between a Bernoulli(p) and a Bernoulli(q) random variable.

3(g) (3 points) Using the above, show that the capacity of the BSC with crossover probability p is at least 1 − H_b(p), where H_b(p) = −p log p − (1 − p) log(1 − p) is the entropy of a Bernoulli(p) random variable. Hint: Use parts (d) and (f) to show that P_e^{(n)} goes to zero under some condition on R.
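The random-coding argument of Problem 3 can also be explored empirically. The sketch below, with arbitrary illustrative values of n, R and p, draws a uniform random codebook, transmits codeword X^n(1) over a BSC(p), and applies the minimum-distance rule, counting ties as errors exactly as in the event A of part 3(a).

```python
import numpy as np

rng = np.random.default_rng(0)

def min_distance_error_rate(n=40, R=0.25, p=0.05, trials=200):
    """Estimate Pr(A | W = 1): the probability that some other codeword of a
    uniform random codebook is at least as close (in Hamming distance) to the
    BSC output as the transmitted codeword X^n(1)."""
    M = int(2 ** (n * R))                          # 2^{nR} codewords
    errors = 0
    for _ in range(trials):
        code = rng.integers(0, 2, size=(M, n))     # fresh uniform random codebook
        noise = (rng.random(n) < p).astype(int)    # BSC(p) bit flips
        y = code[0] ^ noise                        # channel output when W = 1 is sent
        dists = (code ^ y).sum(axis=1)             # Hamming distances d(X^n(w), y^n)
        if (dists[1:] <= dists[0]).any():          # event A: ties count as errors
            errors += 1
    return errors / trials

# With R below 1 - H_b(p), the estimated error rate should shrink as n grows.
print(min_distance_error_rate())
```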

Problem 4

Let X^n := (X_1, X_2, ..., X_n) be independent (but not identically distributed) discrete random variables on a finite set X, drawn according to the following rule:

    Pr(X_i = x) = P_1(x) if i is odd,  P_2(x) if i is even.    (2)

Let Q be any other distribution on X (not the same as P_1 or P_2). Let r(n) be the remainder when n is divided by 2, i.e., r(n) = 0 if n is even and r(n) = 1 otherwise.

4(a) (7 points) Fix ε > 0. Define the relative entropy typical set

    A_ε^{(n)}(P_1, P_2 ‖ Q) := { x^n : D − ε < (1/n) log [ P_1(x_1, x_3, ..., x_{n−1+r(n)}) P_2(x_2, x_4, ..., x_{n−r(n)}) / Q(x_1, x_2, ..., x_n) ] < D + ε }

for some D > 0. Note that 1, 3, ..., n − 1 + r(n) is simply the collection of odd numbers up to and including n. Similarly, 2, 4, ..., n − r(n) is the collection of even numbers up to and including n. Under hypothesis H_0, let X^n be distributed as in (2). Find D in terms of P_1, P_2, Q such that

    Pr( X^n ∈ A_ε^{(n)}(P_1, P_2 ‖ Q) | H_0 ) → 1.

Hint: By independence, P_1(X_1, X_3, ..., X_{n−1+r(n)}) = P_1(X_1) P_1(X_3) ⋯ P_1(X_{n−1+r(n)}), and similarly for the other probabilities, where X^n follows (2). Apply the weak law of large numbers or Chebyshev's inequality.

4(b) (10 points) Under hypothesis H_1, let X^n be distributed i.i.d. according to Q. Find the largest E such that

    Pr( X^n ∈ A_ε^{(n)}(P_1, P_2 ‖ Q) | H_1 ) ≤ 2^{−nE}

for every integer n ≥ 1. The number E should be stated in terms of D and ε.

4(c) (8 points) Now consider the binary hypothesis test

    H_0 : X^n distributed according to (2)    versus    H_1 : X^n i.i.d. according to Q.

Let A_n ⊆ X^n be an acceptance region for hypothesis H_0 and let the probabilities of error be

    α_n(A_n) := Pr(A_n^c | H_0),    β_n(A_n) := Pr(A_n | H_1).

Define

    β_n^ε := min_{ A_n ⊆ X^n : α_n(A_n) < ε } β_n(A_n).

Find an upper bound for

    lim_{n→∞} (1/n) log β_n^ε

in terms of P_1, P_2, Q and ε.
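The hint in 4(a) points to concentration of the normalized log-likelihood ratio under H_0. The sketch below, using hypothetical strictly positive distributions P_1, P_2, Q on a three-letter alphabet, samples X^n according to (2) and evaluates the quantity inside the typical-set definition; as n grows the values concentrate around the constant D that the question asks you to identify.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical strictly positive distributions on a 3-letter alphabet.
P1 = np.array([0.5, 0.3, 0.2])
P2 = np.array([0.2, 0.5, 0.3])
Q  = np.array([1/3, 1/3, 1/3])

def normalized_llr(n):
    """Draw X^n under H_0 (odd positions ~ P1, even positions ~ P2) and return
    (1/n) * log2 of P1(odd part) * P2(even part) / Q(x^n)."""
    odd = (np.arange(1, n + 1) % 2 == 1)               # 1-based odd positions
    x = np.empty(n, dtype=int)
    x[odd] = rng.choice(3, size=odd.sum(), p=P1)
    x[~odd] = rng.choice(3, size=(~odd).sum(), p=P2)
    log_num = np.where(odd, np.log2(P1[x]), np.log2(P2[x]))
    return (log_num - np.log2(Q[x])).sum() / n

# The values concentrate as n grows (weak law of large numbers), which is the
# behaviour part 4(a) asks you to pin down as the constant D.
print([round(normalized_llr(n), 3) for n in (100, 1_000, 10_000)])
```

END OF PAPER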