ECE 776 Information Theory Final

Q1 (1 point) We would like to compress a Gaussian source with zero mean and variance 1. We consider two strategies. In the first, we quantize with a step size Δ chosen so that the distortion D = Δ²/12 equals 0.1, and then use entropy coding on the quantized samples. In the second, we use optimal rate-distortion coding with distortion D = 0.1. Compare the number of bits per symbol needed by the two strategies (for the first, use the results valid in the limit of small Δ).

Sol: For the first strategy, D = Δ²/12 = 0.1, so that Δ = √1.2 ≈ 1.1 and

R = H(X_Δ) ≈ -log2(Δ) + h(X) = -log2(1.1) + (1/2) log2(2πe) ≈ 1.91 bits/symbol,

while for the second

R(0.1) = (1/2) log2(1/0.1) ≈ 1.66 bits/symbol.

Clearly, the optimal code according to rate-distortion theory requires fewer bits per symbol.

Q2 (1 point) Find the rate-distortion function R(D) for a Bernoulli source X ~ Ber(0.6) with distortion

d(x, x̂) = 0 if x = x̂,  ∞ if x = 1 and x̂ = 0,  1 if x = 0 and x̂ = 1.

Sol: To obtain a finite average distortion D, we must have p(1, 0) = 0, so that

D = Σ_{x,x̂} p(x, x̂) d(x, x̂) = p(0, 1).

It is easy to see that there exists only one joint pmf p(x, x̂) that satisfies this condition together with the constraint on the marginal pmf of X stated in the text:

X \ X̂     0         1
0          0.4 - D   D
1          0         0.6

Indeed, with this choice D = p(0, 1) and p_X(1) = 0.6, as required. The rate-distortion function for D < 0.4 is then

R(D) = I(X; X̂) = H(X) - H(X|X̂) = H(0.6) - p_X̂(1) H(X|X̂ = 1) = H(0.6) - (D + 0.6) H(D/(D + 0.6)).

Notice that for D ≥ 0.4, R(D) = 0.
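To make the Q1 comparison concrete, here is a minimal numerical check (not part of the original exam) of both figures: the high-rate quantizer rate -log2(Δ) + h(X) and the Gaussian rate-distortion bound (1/2) log2(σ²/D).

```python
import math

sigma2 = 1.0   # source variance
D = 0.1        # target mean-squared distortion

# Strategy 1: uniform quantizer with step Delta, D = Delta^2 / 12,
# followed by entropy coding; high-rate approximation R ≈ h(X) - log2(Delta).
Delta = math.sqrt(12 * D)
h_X = 0.5 * math.log2(2 * math.pi * math.e * sigma2)   # differential entropy of N(0, sigma2)
R_quant = h_X - math.log2(Delta)

# Strategy 2: optimal rate-distortion coding, R(D) = (1/2) log2(sigma2 / D).
R_rd = 0.5 * math.log2(sigma2 / D)

print(f"quantize + entropy code: {R_quant:.2f} bits/symbol")  # ≈ 1.91
print(f"rate-distortion bound:   {R_rd:.2f} bits/symbol")     # ≈ 1.66
```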

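Similarly, a short sketch for Q2 (illustrative only; the helper names Hb and R are not from the exam) evaluates R(D) = H(0.6) - (D + 0.6) H(D/(D + 0.6)) over 0 ≤ D ≤ 0.4.

```python
import math

def Hb(p):
    """Binary entropy in bits; Hb(0) = Hb(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def R(D):
    """Rate-distortion function of Q2 for 0 <= D <= 0.4 (zero beyond)."""
    if D >= 0.4:
        return 0.0
    return Hb(0.6) - (D + 0.6) * Hb(D / (D + 0.6))

for D in [0.0, 0.1, 0.2, 0.3, 0.4]:
    print(f"R({D:.1f}) = {R(D):.3f} bits/symbol")
```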
Q3 (1 point) Assume that you have two parallel Gaussian channels with inputs X1 and X2 and noises Z1 and Z2, respectively. Assume that the noise powers are E[Z1²] = 0.3 and E[Z2²] = 0.6, while the total available power is P = E[X1²] + E[X2²] = 0.1. Find the optimal power allocation (waterfilling) and the corresponding capacity.

Sol: The waterfilling conditions are

P1 = E[X1²] = (μ - 0.3)+
P2 = E[X2²] = (μ - 0.6)+

with P1 + P2 = 0.1, i.e., (μ - 0.3)+ + (μ - 0.6)+ = 0.1. If you assume both P1 > 0 and P2 > 0, it is easy to see that there is no solution to this equation. Instead, if we set P2 = 0, we get μ - 0.3 = 0.1, from which we obtain μ = 0.4 and

P1 = (0.4 - 0.3)+ = 0.1
P2 = (0.4 - 0.6)+ = 0,

i.e., all the power is used on the better channel. The corresponding capacity is then

C = (1/2) log2(1 + 0.1/0.3) ≈ 0.21 bits/symbol.

Q4 (1 point) Two sensors collect information about the temperature of a given process in different locations. At each time instant, they both need to communicate whether the temperature of the process is above a given threshold (alarm) or not (normal). Since the sensors are in adjacent areas, their measurements U and V are correlated according to the joint pmf

U \ V    normal   alarm
normal   0.8      0.05
alarm    0.05     0.1

How many bits R1 and R2 are needed to deliver this information if U and V are encoded independently? What if we use Slepian-Wolf source coding?

Sol: If U and V are encoded independently, we need

R1 ≥ H(U) = H(0.85) ≈ 0.61
R2 ≥ H(V) ≈ 0.61,

and a total number of bits R1 + R2 ≥ 0.61 + 0.61 = 1.22. On the contrary, if we use Slepian-Wolf coding, we only need

R1 ≥ H(U|V) = 0.85 H(0.8/0.85) + 0.15 H(0.05/0.15) ≈ 0.41
R2 ≥ H(V|U) ≈ 0.41
R1 + R2 ≥ H(U, V) = -2 · 0.05 log2(0.05) - 0.1 log2(0.1) - 0.8 log2(0.8) ≈ 1.02,

so that a total of only R1 + R2 ≈ 1.02 bits suffices, provided that R1 ≥ H(U|V) ≈ 0.41 and R2 ≥ H(V|U) ≈ 0.41.
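For Q3, a small waterfilling sketch (an illustrative check under the same P = 0.1, N1 = 0.3, N2 = 0.6; the bisection routine is an assumption of this sketch, not the exam's method) reproduces the power allocation and capacity.

```python
import math

def waterfill(noise, P, iters=100):
    """Return per-channel powers (mu - N_i)^+ summing to P, found by bisection on mu."""
    lo, hi = min(noise), max(noise) + P
    for _ in range(iters):
        mu = 0.5 * (lo + hi)
        if sum(max(mu - n, 0.0) for n in noise) > P:
            hi = mu
        else:
            lo = mu
    mu = 0.5 * (lo + hi)
    return [max(mu - n, 0.0) for n in noise]

noise = [0.3, 0.6]          # noise powers N1, N2
P = 0.1                     # total transmit power
powers = waterfill(noise, P)
C = sum(0.5 * math.log2(1 + p / n) for p, n in zip(powers, noise))

print("power allocation:", [round(p, 3) for p in powers])   # ≈ [0.1, 0.0]
print(f"capacity ≈ {C:.2f} bits/symbol")                    # ≈ 0.21
```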

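The entropies in Q4 can also be checked directly from the joint pmf; the sketch below (not from the exam) computes H(U), H(V), H(U, V) and the conditional entropies that define the Slepian-Wolf region.

```python
import math

# Joint pmf of (U, V): rows indexed by U, columns by V, order (normal, alarm).
p = [[0.80, 0.05],
     [0.05, 0.10]]

def H(probs):
    """Entropy in bits of a list of probabilities (zeros are skipped)."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

pU = [sum(row) for row in p]                              # marginal of U: [0.85, 0.15]
pV = [sum(p[u][v] for u in range(2)) for v in range(2)]   # marginal of V
H_U, H_V = H(pU), H(pV)
H_UV = H([p[u][v] for u in range(2) for v in range(2)])

print(f"independent encoding: R1 + R2 >= {H_U + H_V:.2f}")   # ≈ 1.22
print(f"Slepian-Wolf: R1 >= H(U|V) = {H_UV - H_V:.2f}, "
      f"R2 >= H(V|U) = {H_UV - H_U:.2f}, "
      f"R1 + R2 >= H(U,V) = {H_UV:.2f}")                     # ≈ 0.41, 0.41, 1.02
```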
P1 (2 points) We would like to transmit a Bernoulli source Xi ~ Ber(0.7) with Hamming distortion D over a binary symmetric channel (BSC) with crossover probability p.

(a) Assuming that we send one source symbol for each channel symbol, can the source Xi be transmitted losslessly (D = 0) over the BSC if p = 0.3?

Sol: The condition to be verified is R(0) = H(X) ≤ C = 1 - H(p). However, since H(X) = H(0.7) ≈ 0.88 bits/source symbol and C = 1 - H(0.3) ≈ 0.12 bits/channel symbol, the condition does not hold, and, as a result, the source cannot be transmitted losslessly over the channel.

(b) Under the same conditions as in point (a), can we send the source Xi with distortion D = 0.4?

Sol: Since D = 0.4 > min(0.7, 0.3) = 0.3, the corresponding rate is R(0.4) = 0 bits/source symbol, so the source can definitely be transmitted over the channel.

(c) Going back to the case D = 0, what is the largest rate r (in source symbols per channel symbol) that we can transmit over the channel?

Sol: The condition reads r H(0.7) ≤ C, so

r ≤ C / H(0.7) = 0.12 / 0.88 ≈ 0.14 source symbols/channel symbol.

(d) Repeat (c) with D = 0.1.

Sol: We have r R(0.1) ≤ C, so

r ≤ C / R(0.1) = C / (H(0.7) - H(0.1)) = 0.12 / (0.88 - 0.47) ≈ 0.29 source symbols/channel symbol.
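A short sketch of the source-channel matching computation in P1 (illustrative only; small differences from the rounded figures above come from using exact entropies): it evaluates R(D) = H(0.7) - H(D) for the Bernoulli source, C = 1 - H(p) for the BSC, and the resulting limit r ≤ C/R(D).

```python
import math

def Hb(p):
    """Binary entropy function in bits."""
    if p <= 0 or p >= 1:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

q = 0.7            # source: Ber(0.7)
p_bsc = 0.3        # BSC crossover probability
C = 1 - Hb(p_bsc)  # channel capacity in bits/channel symbol, ≈ 0.12

def R(D, q=q):
    """Rate-distortion function of a Ber(q) source with Hamming distortion."""
    Dmax = min(q, 1 - q)
    return Hb(q) - Hb(D) if D < Dmax else 0.0

for D in [0.0, 0.1, 0.4]:
    if R(D) > 0:
        print(f"D = {D}: r <= C / R(D) = {C / R(D):.2f} source symbols/channel symbol")
    else:
        print(f"D = {D}: R(D) = 0, no channel rate is needed (r can be arbitrarily large)")
```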

P2 (2 points) Consider the Gaussian channel in Figure 1, in which the transmitted signal X (with E[X²] = P) is received by two antennas:

Y1 = X + Z1
Y2 = X + Z2,

where Z1 and Z2 are independent and E[Zi²] = Ni with N1 < N2. (Figure 1 shows X reaching the two antennas through the additive noises Z1 and Z2, with the two branches weighted by α and 1 - α and summed.) The signal at the two antennas is combined as Y = αY1 + (1 - α)Y2 before detection, with 0 ≤ α ≤ 1.

a) Find the capacity of the channel for a given α.

Sol: Since

Y = αY1 + (1 - α)Y2 = X + αZ1 + (1 - α)Z2,

the channel at hand is a Gaussian channel with signal power P and noise power α²N1 + (1 - α)²N2, and the corresponding capacity is

C(α) = (1/2) log2(1 + P / (α²N1 + (1 - α)²N2)).

b) Using your result from the previous point, find the optimal α that maximizes the capacity and write down the corresponding maximum capacity.

Sol: Maximizing the capacity C(α) amounts to minimizing the noise power α²N1 + (1 - α)²N2, which is a convex function of α, so the optimality condition is

d/dα (α²N1 + (1 - α)²N2) = 0.

It follows that the optimal α is

α* = N2 / (N1 + N2),

that is, proportionally more weight is given to the less noisy received signal. The resulting noise power is N1N2/(N1 + N2), and the corresponding capacity is

C(α*) = (1/2) log2(1 + P(N1 + N2)/(N1N2)) = (1/2) log2(1 + P/N1 + P/N2).
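The optimal-combining result in P2 can be verified numerically; the sketch below (illustrative, with arbitrary example values for P, N1 and N2 that are not part of the exam statement) sweeps α, compares the best value found against α* = N2/(N1 + N2), and checks the closed-form capacity.

```python
import math

P, N1, N2 = 1.0, 0.3, 0.6   # example values (not from the exam statement)

def capacity(alpha):
    """Capacity of the combined channel Y = X + alpha*Z1 + (1 - alpha)*Z2."""
    noise = alpha**2 * N1 + (1 - alpha)**2 * N2
    return 0.5 * math.log2(1 + P / noise)

# Brute-force sweep over alpha in [0, 1].
alphas = [k / 10000 for k in range(10001)]
alpha_best = max(alphas, key=capacity)

alpha_star = N2 / (N1 + N2)                     # closed-form optimum
C_star = 0.5 * math.log2(1 + P / N1 + P / N2)   # closed-form maximum capacity

print(f"sweep optimum: alpha ≈ {alpha_best:.3f}, C ≈ {capacity(alpha_best):.3f}")
print(f"closed form:   alpha* = {alpha_star:.3f}, C = {C_star:.3f}")
```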

P3 (2 points) You are given a discrete memoryless multiple access channel (MAC) with inputs X1 ∈ {0, 1, 2} and X2 ∈ {0, 1, 2} and output Y = X1 + X2.

a) Assuming that both X1 and X2 are chosen uniformly and independently in their range, find the achievable rate region of the MAC.

Sol: For the fixed uniform input pmfs p(x1) = u(x1) and p(x2) = u(x2), we have

R1 ≤ I(X1; Y|X2) = H(Y|X2) - H(Y|X1, X2) = H(Y|X2) = H(X1) = log2 3 ≈ 1.58
R2 ≤ I(X2; Y|X1) = H(X2) = log2 3 ≈ 1.58
R1 + R2 ≤ I(X1, X2; Y) = H(Y) - H(Y|X1, X2) = H(Y).

In order to evaluate H(Y) we need the pmf of Y:

p(y) = 1/9 for y = 0,  2/9 for y = 1,  3/9 for y = 2,  2/9 for y = 3,  1/9 for y = 4,

so that

H(Y) = -2 · (1/9) log2(1/9) - 2 · (2/9) log2(2/9) - (1/3) log2(1/3) ≈ 2.2.

b) Assume now that the two transmitters can cooperate for transmission over this MAC. What is the capacity region? (Notice that in this case maximization over the input distribution is required.)

Sol: In this case we can achieve R1 + R2 ≤ I(X1, X2; Y) = H(Y) for any joint pmf p(x1, x2). In particular, we can optimize p(x1, x2) in order to maximize H(Y). We notice that H(Y) ≤ log2 |Y| = log2 5 ≈ 2.32, with equality if p(y) = u(y). Can this value be achieved with the right choice of p(x1, x2)? Observing that we must have

p(y = 0) = p(0, 0) = 1/5
p(y = 1) = p(0, 1) + p(1, 0) = 1/5
p(y = 2) = p(1, 1) + p(2, 0) + p(0, 2) = 1/5
p(y = 3) = p(2, 1) + p(1, 2) = 1/5
p(y = 4) = p(2, 2) = 1/5,

we can conclude that it is enough to choose p(x1, x2) as

X1 \ X2   0      1      2
0         1/5    1/10   1/15
1         1/10   1/15   1/10
2         1/15   1/10   1/5

We can then conclude that the capacity region with cooperation is characterized by R1 + R2 ≤ log2 5 ≈ 2.32.
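Finally, a check for P3 (illustrative sketch only): the code below computes H(Y) for independent uniform inputs and confirms that the cooperative joint pmf in the table makes Y uniform over {0, ..., 4}, so that H(Y) = log2 5.

```python
import math
from fractions import Fraction as F

def H(probs):
    """Entropy in bits of a probability vector (zeros skipped)."""
    return -sum(float(q) * math.log2(float(q)) for q in probs if q > 0)

# (a) Independent uniform inputs on {0, 1, 2}: Y = X1 + X2.
pY = [F(0)] * 5
for x1 in range(3):
    for x2 in range(3):
        pY[x1 + x2] += F(1, 9)
print("independent uniform inputs: H(Y) =", round(H(pY), 2))   # ≈ 2.2

# (b) Cooperative joint pmf from the solution table.
p = [[F(1, 5), F(1, 10), F(1, 15)],
     [F(1, 10), F(1, 15), F(1, 10)],
     [F(1, 15), F(1, 10), F(1, 5)]]
pY_coop = [F(0)] * 5
for x1 in range(3):
    for x2 in range(3):
        pY_coop[x1 + x2] += p[x1][x2]
print("cooperative pmf: p(y) =", pY_coop)                       # each entry 1/5
print("H(Y) =", round(H(pY_coop), 2), "= log2(5) ≈", round(math.log2(5), 2))
```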