Information Theory (Fall 2016) Midterm Exam. Time: 09:10-12:10, 11/23/2016

Midterm Exam
Time: 09:10-12:10, 11/23, 2016
Name: ____________________    Student ID: ____________________

Policy (read before you start to work):
- The exam is closed book. However, you are allowed to bring TWO A4-size cheat sheets (each a single sheet, two-sided). Accessing any other books, computing devices, internet-connected devices, etc., is regarded as cheating; the exam will not be graded, and we will report the case to the University Office.
- No discussion is allowed during the exam. Everyone has to work on his/her own.
- Please turn in this copy (the exam sheets) when you submit your solution sheets.
- Please follow the seat assignment when you are seated.
- Only what is written on the solution sheets will be graded. Anything written on the exam sheets will not be graded.
- You may write your solutions in Mandarin.

Note (read before you start to work):
- Partial credit will be given even if you cannot solve a problem completely. Write down your derivations and partial solutions in a clear and systematic way.
- You may make any additional reasonable assumptions that you think are necessary to answer the questions. Write down your assumptions clearly.
- Express your answers as explicitly and analytically as possible.
- You may reuse any known results from our lectures (not including exercises in the lecture slides and materials in the referenced books) and homework problems without re-proving them. Other than those, you need to provide rigorous arguments, unless the problem specifically states otherwise.

Total Points: 100. Good luck!

1. (Joint Source-Channel Coding) [12]

Consider a stationary and ergodic Bernoulli-q source {S_i : i ∈ N}, i.e., P{S_i = 1} = q. We would like to use a memoryless binary symmetric channel (X, P_{Y|X}, Y), where X = Y = {0, 1} and

    P_{Y|X} = \begin{bmatrix} 1-p & p \\ p & 1-p \end{bmatrix},

to send the source sequence to the destination.

a) Find the maximum value of the entropy rate over all possible stationary and ergodic Bernoulli-q sources. [3] Which stationary and ergodic Bernoulli-q source attains the maximum? [2]

b) Suppose {S_i} is a Markov process satisfying S_{i-1} - S_i - S_{i+1} for all i and P_{S_2|S_1}(1|0) = α. What is the maximum number of source symbols per channel use that can be losslessly reconstructed at the destination? [7]

2. (Binary Channel with Input Cost) [16]

Consider a cost function over the binary alphabet defined as follows: b(x) = x, for x = 0, 1. Find the capacity-cost function C(B) of the binary channels with the input cost function b(·) defined above and the following channel transition matrices, respectively.

a)  P_{Y|X} = \begin{bmatrix} 1-p & p \\ p & 1-p \end{bmatrix}  [8]

b)  P_{Y|X} = \begin{bmatrix} 1-p & p & 0 \\ 0 & p & 1-p \end{bmatrix}  [8]
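
Numerical sanity check (illustrative only, not required for the exam): the sketch below brute-forces max I(X;Y) over binary input distributions P{X = 1} = q subject to the cost constraint E[b(X)] ≤ B, for the two transition matrices above. It assumes numpy is available, and the helper names (mutual_information, capacity_cost_numeric) are made up for this illustration.

```python
import numpy as np

def mutual_information(p_x, P_y_given_x):
    """I(X;Y) in bits for input pmf p_x and row-stochastic transition matrix."""
    p_xy = p_x[:, None] * P_y_given_x            # joint pmf p(x, y)
    p_y = p_xy.sum(axis=0)                       # output marginal
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask])).sum())

def capacity_cost_numeric(P_y_given_x, cost, B, grid=2001):
    """Brute-force C(B) for a binary-input channel: maximize I(X;Y) over
    P{X=1} = q subject to E[b(X)] = (1-q)*cost[0] + q*cost[1] <= B."""
    best = 0.0
    for q in np.linspace(0.0, 1.0, grid):
        if (1 - q) * cost[0] + q * cost[1] <= B:
            best = max(best, mutual_information(np.array([1 - q, q]), P_y_given_x))
    return best

if __name__ == "__main__":
    p = 0.1
    channel_a = np.array([[1 - p, p], [p, 1 - p]])            # part a)
    channel_b = np.array([[1 - p, p, 0.0], [0.0, p, 1 - p]])  # part b)
    for B in (0.1, 0.3, 0.5):
        print(B,
              capacity_cost_numeric(channel_a, cost=[0, 1], B=B),
              capacity_cost_numeric(channel_b, cost=[0, 1], B=B))
```

A grid search is enough here because the input alphabet is binary, so the feasible set is an interval of q values; the same mutual_information helper works for any finite channel matrix.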

3. (Message + List Decoding) [8]

Consider a single-transmitter, two-receiver memoryless channel with a single input X and two outputs (Y_1, Y_2), where

    Y_l = X + Z_l,   Z_l ~ N(0, σ_l^2),   l = 1, 2,

and {X, Z_1, Z_2} are mutually independent. Furthermore, σ_1^2 < σ_2^2, and the input X is subject to an average input power constraint P.

(Block diagram: W -> ENC -> X^N; Y_1^N = X^N + Z_1^N -> DEC 1 -> Ŵ; Y_2^N = X^N + Z_2^N -> DEC 2 -> list L.)

Let N denote the blocklength and R the code rate. The transmitter has a message W ∈ {1, 2, ..., 2^{NR}} to send. The goal of Receiver 1 is to decode the message W. The goal of Receiver 2 is to find a list L of size 2^{NL} so that the message W ∈ L. Hence, the error probability is defined as

    P_e^{(N)} ≜ P{W ≠ Ŵ or W ∉ L}.

The pair (R, L) is said to be achievable if there exists a sequence of codes as described above such that lim_{N→∞} P_e^{(N)} = 0. Define

    C(L) ≜ sup{R : (R, L) is achievable}.

Extend the results in Homework 3 to find C(L). (Rigorous proofs are not required; however, please provide sufficient justification for your arguments.)
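
For reference, a standard point-to-point result from the lectures that may be useful as a building block (stated here only for convenience, not as part of the problem): the capacity of an AWGN channel Y = X + Z with Z ~ N(0, σ^2) under average power constraint P is

    C = \frac{1}{2} \log_2\!\left(1 + \frac{P}{σ^2}\right)  bits per channel use.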

4. (Information Measures) [18]

In this problem we consider probability distributions over the positive integers N ≜ {1, 2, ...} with finite support, that is, each of them only takes non-zero probability at a finite number of integers. Let us define the convolution of two such distributions P and Q, (P ∗ Q) : N → [0, 1], in the usual way:

    (P ∗ Q)(x) ≜ \sum_{u=1}^{∞} P(u) Q(x - u),   x ∈ N.

a) Show that

    max{H(P), H(Q)} ≤ H(P ∗ Q) ≤ H(P) + H(Q),

where the first inequality is referred to as (1) and the second as (2). [8]
When does (1) hold with equality? [2]
When does (2) hold with equality? [2]

b) Show that D(P_1 ‖ P_2) ≥ D(P_1 ∗ Q ‖ P_2 ∗ Q). [6]
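
The inequalities in part (a) are easy to check numerically before proving them. A minimal sketch (assuming numpy; the helper names are made up) that draws random finite-support pmfs and verifies both bounds:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a pmf given as a numpy array."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def convolve_pmfs(P, Q):
    """pmf of the sum of two independent integer-valued variables with pmfs P and Q;
    the integer labeling of the support does not affect the entropies below."""
    return np.convolve(P, Q)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for _ in range(5):
        P = rng.random(4); P /= P.sum()
        Q = rng.random(6); Q /= Q.sum()
        R = convolve_pmfs(P, Q)
        print(max(entropy(P), entropy(Q)) <= entropy(R) + 1e-12,
              entropy(R) <= entropy(P) + entropy(Q) + 1e-12)
```

Since X + Y has distribution P ∗ Q when X ~ P and Y ~ Q are independent, the check only involves entropies, so shifting the support (e.g., starting the arrays at index 0 instead of 1) does not change the outcome.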

5. (Source Coding) [23]

Consider a discrete source {S_i : i ∈ N} taking values in a finite alphabet S. Consider a lossy source coding problem with the following setup:
- Reconstruction alphabet: Ŝ = S.
- Distortion measure: Hamming distance d(s, ŝ) = 1{s ≠ ŝ}.

a) Suppose the source {S_i} is memoryless, that is, S_i i.i.d. ~ P_S. By directly solving

    R(D = 0) = min_{P_{Ŝ|S} : E[d(S, Ŝ)] ≤ 0} I(S; Ŝ),

show that R(0) = H(S), where S ~ P_S. [5]

b) Suppose the source {S_i} is stationary and ergodic with entropy rate H({S_i}). Let R(D) be the minimum compression ratio that can achieve

    lim sup_{N→∞} E[d(S^N, Ŝ^N)] ≤ D.

In the lectures we did not introduce the lossy source coding theorem for sources with memory, and hence we do not know how to compute R(D) in general. However, we are still able to show that R(0) ≥ H({S_i}).

1) Show that E[d(S^N, Ŝ^N)] ≥ P{S^N ≠ Ŝ^N}. [5]
2) Show that R(0) ≥ H({S_i}). [5]

c) Suppose the source {S_i} is memoryless and uniformly distributed over S. Find the rate-distortion function R(D) by leveraging Fano's inequality. [8]
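
Although part (c) asks for an analytic R(D), the rate-distortion curve of a memoryless source can also be traced numerically with the Blahut-Arimoto algorithm, which is handy for checking a derived formula. A minimal sketch under the stated setup (uniform source, Hamming distortion), assuming numpy; the function name and the β sweep are illustrative only:

```python
import numpy as np

def blahut_arimoto_rd(p_x, dist, beta, iters=500):
    """One point on the rate-distortion curve via Blahut-Arimoto.
    p_x: source pmf, dist[x, xhat]: distortion matrix, beta > 0: Lagrange slope."""
    n_x, n_xhat = dist.shape
    q = np.full(n_xhat, 1.0 / n_xhat)            # reconstruction marginal
    for _ in range(iters):
        w = q[None, :] * np.exp(-beta * dist)    # unnormalized Q(xhat | x)
        Q = w / w.sum(axis=1, keepdims=True)     # test channel
        q = p_x @ Q                              # updated marginal
    D = float((p_x[:, None] * Q * dist).sum())
    R = float((p_x[:, None] * Q * np.log2(Q / q[None, :])).sum())
    return D, R

if __name__ == "__main__":
    m = 4                                        # |S|
    p_x = np.full(m, 1.0 / m)                    # uniform source
    dist = 1.0 - np.eye(m)                       # Hamming distortion
    for beta in (0.5, 1.0, 2.0, 4.0, 8.0):
        D, R = blahut_arimoto_rd(p_x, dist, beta)
        print(f"beta={beta:4.1f}  D={D:.4f}  R={R:.4f} bits")
```

Sweeping β traces (D, R) pairs along the curve; small β gives large distortion and low rate, large β approaches the lossless end.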

6. (Permutation Channel) [23]

A channel model in neural communication is the following:
- Input alphabet: X = {0, 1}^d
- Output alphabet: Y = {0, 1}^d
- Channel law:

    P_{Y|X}(y|x) = \begin{cases} \binom{d}{\|x\|_1}^{-1}, & \text{if } \|y\|_1 = \|x\|_1 \\ 0, & \text{otherwise} \end{cases}

(For a d-dimensional binary vector x, its ℓ_1-norm ‖x‖_1 is the number of 1's in x.) In words, the channel permutes the length-d binary vector uniformly at random.

a) Compute the channel capacity C of this channel. [4] What is the capacity-achieving input distribution? [2]

b) Suppose the input cost function is b(x) = ‖x‖_1. Compute the capacity-cost function C(B) of this channel. [5] What is the capacity-achieving input distribution? [4]

c) Let α be a constant between 0 and 1, that is, 0 < α < 1. Now suppose the channel delivers x noiselessly with probability 1 - α, and permutes x uniformly at random with probability α. (Note: keeping x the same is also one possible permutation.) Compute the channel capacity C of this channel (note: no input cost constraint). [5] What is the capacity-achieving input distribution? [3]
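
For small d the transition matrix has only 2^d rows, so I(X;Y) can be evaluated exactly for any candidate input distribution, which is useful for sanity-checking answers to parts (a) and (b). A minimal sketch (assuming numpy; the helper names are made up for this illustration):

```python
import numpy as np
from itertools import product
from math import comb

def permutation_channel(d):
    """Exact transition matrix P(y|x) on {0,1}^d: uniform over outputs with
    the same Hamming weight as x, zero otherwise."""
    vecs = list(product((0, 1), repeat=d))
    P = np.zeros((2 ** d, 2 ** d))
    for i, x in enumerate(vecs):
        w = sum(x)
        for j, y in enumerate(vecs):
            if sum(y) == w:
                P[i, j] = 1.0 / comb(d, w)
    return P

def mutual_information(p_x, P):
    """I(X;Y) in bits for input pmf p_x and transition matrix P."""
    p_xy = p_x[:, None] * P
    p_y = p_xy.sum(axis=0)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask])).sum())

if __name__ == "__main__":
    d = 3
    P = permutation_channel(d)
    uniform = np.full(2 ** d, 2.0 ** -d)          # uniform over all input vectors
    rng = np.random.default_rng(1)
    random_pmf = rng.random(2 ** d); random_pmf /= random_pmf.sum()
    print("I(X;Y), uniform input :", mutual_information(uniform, P))
    print("I(X;Y), random input  :", mutual_information(random_pmf, P))
```

Trying a few input pmfs this way (for instance, ones that place all mass of a given Hamming weight on a single vector) gives quick evidence for or against a conjectured capacity-achieving distribution before writing the proof.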