Homework Set #3: Rate Definitions, Channel Coding, Source-Channel Coding

1. Rates.
(a) Channel coding rate: Assume you are sending 4 different messages using 2 usages of a channel. What is the rate (in bits per channel use) at which you send?
(b) Source coding rate: Assume you have a file with 10^6 ASCII characters, where the alphabet of ASCII characters has size 256. Assume that each ASCII character is represented by 8 bits (binary alphabet before compression). After compressing the file we get 4 · 10^6 bits. What is the compression rate?

2. Preprocessing the output. One is given a communication channel with transition probabilities p(y|x) and channel capacity C = max_{p(x)} I(X; Y). A helpful statistician preprocesses the output by forming Ỹ = g(Y), yielding a channel p(ỹ|x). He claims that this will strictly improve the capacity.
(a) Show that he is wrong.
(b) Under what conditions does he not strictly decrease the capacity?

3. The Z-channel. The Z-channel has binary input and output alphabets and transition probabilities p(y|x) given by the following matrix:

    Q = [  1    0
          1/2  1/2 ],   x, y ∈ {0, 1}

Find the capacity of the Z-channel and the maximizing input probability distribution.

4. Using two channels at once. Consider two discrete memoryless channels (X_1, p(y_1|x_1), Y_1) and (X_2, p(y_2|x_2), Y_2) with capacities C_1 and C_2 respectively. A new channel (X_1 × X_2, p(y_1|x_1) p(y_2|x_2), Y_1 × Y_2) is formed in which x_1 ∈ X_1 and x_2 ∈ X_2 are sent simultaneously, resulting in y_1, y_2. Find the capacity of this channel.
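As a numerical cross-check for the Z-channel in problem 3: H(Y|X = 0) = 0 and H(Y|X = 1) = 1 bit, so I(X; Y) = H_b(p/2) − p with p = Pr(X = 1), which a brute-force sweep can maximize. A minimal sketch (the grid size and function names are my own):

```python
import math

def hb(q):
    """Binary entropy in bits."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def z_channel_mi(p):
    # Z-channel with Q = [[1, 0], [1/2, 1/2]]:
    # P(Y = 1) = p/2 and H(Y|X) = p * H_b(1/2) = p.
    return hb(p / 2) - p

# Sweep the input distribution p = Pr(X = 1) on a fine grid.
best_p, best_i = max(((k / 10000, z_channel_mi(k / 10000))
                      for k in range(10001)), key=lambda t: t[1])
print(best_p, best_i)
```

The sweep recovers the closed-form answer for this matrix, C = log2(5/4) ≈ 0.322 bits, achieved at p* = 2/5.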

5. A channel with two independent looks at Y. Let Y_1 and Y_2 be conditionally independent and conditionally identically distributed given X, so that p(y_1, y_2|x) = p(y_1|x) p(y_2|x).
(a) Show I(X; Y_1, Y_2) = 2 I(X; Y_1) − I(Y_1; Y_2).
(b) Conclude that the capacity of the channel X → (Y_1, Y_2) is less than twice the capacity of the channel X → Y_1.

6. Choice of channels. Find the capacity C of the union of the channels (X_1, p_1(y_1|x_1), Y_1) and (X_2, p_2(y_2|x_2), Y_2), where at each time one can send a symbol over channel 1 or over channel 2, but not both. Assume the output alphabets are distinct and do not intersect.
(a) Show 2^C = 2^{C_1} + 2^{C_2}.
(b) What is the capacity of this channel?

7. Cascaded BSCs. Consider the two discrete memoryless channels (X, p_1(y|x), Y) and (Y, p_2(z|y), Z). Let p_1(y|x) and p_2(z|y) be binary symmetric channels with crossover probabilities λ_1 and λ_2 respectively.

[Figure: the cascade X → BSC(λ_1) → Y → BSC(λ_2) → Z.]

(a) What is the capacity C_1 of p_1(y|x)?
(b) What is the capacity C_2 of p_2(z|y)?
(c) We now cascade these channels, so that p(z|x) = Σ_y p_1(y|x) p_2(z|y). What is the capacity C of p(z|x)? Show C ≤ min{C_1, C_2}.
(d) Now let us actively intervene between channels 1 and 2, rather than passively transmitting y^n. What is the capacity of channel 1 followed by channel 2 if you are allowed to decode the output y^n of channel 1 and then re-encode it as ỹ^n for transmission over channel 2? (Think W → x^n(W) → y^n → ỹ^n(y^n) → z^n → Ŵ.)
(e) What is the capacity of the cascade in part (c) if the receiver can view both Y and Z?

8. Channel capacity.
(a) What is the capacity of the channel in Fig. 1 (it appears on the next page)?
(b) Provide a simple scheme that can transmit at rate R = log bits through this channel.
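For problem 7(c), a cascade of two BSCs is itself a BSC whose crossover probability is the chance that exactly one of the two links flips the bit. A short sketch checking C ≤ min{C_1, C_2} for one arbitrarily chosen pair of crossover probabilities:

```python
import math

def hb(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def cascade(l1, l2):
    """Effective crossover of two cascaded BSCs: exactly one link flips."""
    return l1 * (1 - l2) + (1 - l1) * l2

l1, l2 = 0.1, 0.2                 # arbitrary example crossover probabilities
c1, c2 = 1 - hb(l1), 1 - hb(l2)   # capacities of the individual BSCs
c = 1 - hb(cascade(l1, l2))       # capacity of the cascade p(z|x)
print(c, min(c1, c2))             # the cascade is never better than either link
```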

[Figure 1: the channel, with input X and output Y.]

9. A channel with a switch. Consider the channel depicted in Fig. 2: there are two channels with conditional probabilities p_1(y_1|x) and p_2(y_2|x). The two channels have a common input alphabet X and two disjoint output alphabets Y_1 and Y_2 (a symbol that appears in Y_1 cannot appear in Y_2). The position of the switch is determined by a random variable Z that is independent of X, with Pr(Z = 1) = λ.

[Figure 2: the channel with a switch — the input X feeds p_1(y_1|x) when Z = 1 and p_2(y_2|x) when Z = 2.]

(a) Show that

    I(X; Y) = λ I(X; Y_1) + (1 − λ) I(X; Y_2).    (1)

(b) The capacity of this system is given by C = max_{p(x)} I(X; Y). Show that

    C ≤ λ C_1 + (1 − λ) C_2,    (2)

where C_i = max_{p(x)} I(X; Y_i). When is equality achieved?
(c) The sub-channels defined by p_1(y_1|x) and p_2(y_2|x) are now given in Fig. 3, where p = . Find the input probability p(x) that maximizes I(X; Y). For this case, does the equality hold in Eq. (2)? Explain!

[Figure 3: (a) describes channel 1 — a BSC with transition probability p. (b) describes channel 2 — a Z-channel with transition probability p.]

10. Channel with state. A discrete memoryless (DM) state-dependent channel with state space S is defined by an input alphabet X, an output alphabet Y, and a set of channel transition matrices {p(y|x, s)}, s ∈ S. Namely, for each s ∈ S the transmitter sees a different channel. The capacity of such a channel, where the state is known causally to both encoder and decoder, is given by:

    C = max_{p(x|s)} I(X; Y | S).    (3)

Let S = {1, 2, 3}; the three different channels (one for each state s ∈ S) are illustrated in Fig. 4. The state process is i.i.d. according to the distribution p(s).
(a) Find an expression for the capacity of the S-channel (the channel the transmitter sees given S = 1) as a function of ǫ.
(b) Find an expression for the capacity of the BSC (the channel the transmitter sees given S = 2) as a function of δ.

[Figure 4: the three state-dependent channels — for S = 1 an S-channel with parameter ǫ, for S = 2 a BSC with crossover probability δ, and for S = 3 a Z-channel with parameter ǫ.]

(c) Find an expression for the capacity of the Z-channel (the channel the transmitter sees given S = 3) as a function of ǫ.
(d) Find an expression for the capacity of the DM state-dependent channel (using formula (3)) for p(s) = [ ] as a function of ǫ and δ.
(e) Let us define a conditional probability matrix P_{X|S} for two random variables X and S with X = {0, 1} and S = {1, 2, 3} by:

    P_{X|S} = [ p(x = j | s = i) ],  i ∈ S, j ∈ X.    (4)

What is the input conditional probability matrix P_{X|S} that achieves the capacity you have found in (d)?

11. Modulo channel.
(a) Consider the DMC defined as follows: the output is Y = X ⊕ Z, where X, taking values in {0, 1}, is the channel input, ⊕ is the modulo-2 summation operation, and Z is binary channel noise, uniform over {0, 1} and independent of X. What is the capacity of this channel?
(b) Consider the channel of the previous part, but suppose that instead of modulo-2 addition Y = X ⊕ Z we perform modulo-3 addition Y = X ⊕_3 Z. Now what is the capacity?
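A brute-force check of problem 11 (the helper names are mine): under modulo-2 addition, Z acts as a one-time pad, while under modulo-3 addition Y = 0 reveals X = 0, Y = 2 reveals X = 1, and Y = 1 behaves like an erasure of probability 1/2:

```python
import math

def mi(joint):
    """Mutual information (bits) from a joint pmf {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

def channel_mi(q, mod):
    """I(X;Y) for Y = (X + Z) mod `mod`, X ~ Bern(q), Z uniform on {0, 1}."""
    joint = {}
    for x, px in ((0, 1 - q), (1, q)):
        for z in (0, 1):
            y = (x + z) % mod
            joint[(x, y)] = joint.get((x, y), 0) + px * 0.5
    return mi(joint)

# Modulo-2: Z masks X completely, so I(X;Y) = 0.
mi_mod2 = channel_mi(0.5, 2)
# Modulo-3: effectively a BEC(1/2), so the maximum over inputs should be 1/2.
mi_mod3 = max(channel_mi(k / 1000, 3) for k in range(1001))
print(mi_mod2, mi_mod3)
```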

12. Cascaded BSCs. Given is a cascade of k identical and independent binary symmetric channels, each with crossover probability α.
(a) In the case where no encoding or decoding is allowed at the intermediate terminals, what is the capacity of this cascaded channel as a function of k and α?
(b) Now assume that encoding and decoding are allowed at the intermediate points. What is the capacity as a function of k and α?
(c) What is the capacity of each of the above settings in the case where the number of cascaded channels, k, goes to infinity?
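For problem 12(a), k cascaded BSC(α) links form a single BSC whose crossover probability is the probability of an odd number of flips, α_k = (1 − (1 − 2α)^k)/2, so the capacity 1 − H_b(α_k) decays to zero as k grows. A quick sketch (α = 0.1 is an arbitrary example):

```python
import math

def hb(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def crossover_k(alpha, k):
    """Effective crossover of k cascaded BSC(alpha): odd number of flips."""
    return (1 - (1 - 2 * alpha) ** k) / 2

alpha = 0.1
caps = [1 - hb(crossover_k(alpha, k)) for k in (1, 2, 5, 20, 100)]
print(caps)   # decreasing toward 0: without relaying, information dies out
```

With intermediate decoding and re-encoding (part (b)), by contrast, each hop can be protected separately, which is the point of the comparison.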