LECTURE 15

Last time:
- Feedback channel: setting up the problem
- Perfect feedback
- Feedback capacity
- Data compression

Lecture outline:
- Joint source and channel coding theorem
- Converse
- Robustness
- Brain teaser

Reading: Sct. 8.13.

Data compression

Consider coding several symbols together with a code $C : \mathcal{X}^n \to \mathcal{D}^*$. The expected codeword length is
$$L = \sum_{x^n \in \mathcal{X}^n} P_{X^n}(x^n)\, l(x^n)$$
and the optimum satisfies
$$H_D(X^n) \le L < H_D(X^n) + 1,$$
so the per-symbol codeword length $L_n = L/n$ satisfies
$$\frac{H_D(X^n)}{n} \le L_n < \frac{H_D(X^n)}{n} + \frac{1}{n}.$$
Thus, we have that the rate $R$ is lower bounded by the entropy.
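As a quick numerical sanity check of these bounds, here is a small Python sketch (my own illustration, not part of the lecture): it builds an optimal binary prefix code ($D = 2$) for blocks of $n$ symbols from an illustrative i.i.d. source with pmf (0.7, 0.3), and verifies that the per-symbol expected length $L_n$ lies in $[H(X),\, H(X) + 1/n)$, which is the i.i.d. case of the bound above.

    import heapq
    from itertools import product
    from math import log2, prod

    p = [0.7, 0.3]                                   # illustrative i.i.d. source pmf (assumption)
    H = -sum(q * log2(q) for q in p)                 # entropy per symbol, H(X), in bits

    def huffman_expected_length(probs):
        # Expected codeword length of an optimal binary prefix (Huffman) code.
        # Uses the fact that it equals the sum of the probabilities of all merged nodes.
        heap = list(probs)
        heapq.heapify(heap)
        total = 0.0
        while len(heap) > 1:
            a, b = heapq.heappop(heap), heapq.heappop(heap)
            total += a + b
            heapq.heappush(heap, a + b)
        return total

    for n in (1, 2, 4, 8):
        block_probs = [prod(x) for x in product(p, repeat=n)]   # P_{X^n}(x^n) for all blocks
        L_n = huffman_expected_length(block_probs) / n          # bits per source symbol
        print(f"n={n}: H(X)={H:.4f} <= L_n={L_n:.4f} < H(X)+1/n={H + 1/n:.4f}")

The slack above $H(X)$ shrinks like $1/n$, which is the sense in which the achievable rate approaches the entropy.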

Channel coding

Coding theorem: for any source of messages, any rate $R$ below $C$ is feasible, i.e. achievable with arbitrarily low probability of error:
$$E_{\text{codebooks, messages}}[P_e] \le 2^{-N E_r(R, P_X)}.$$
For the $P_X$ that yields capacity, $E_r(R, P_X)$ is positive for all $R < C$; moreover, the maximum probability of error is also upper bounded.

Converse: for any source of IID messages, transmission at any rate $R$ above $C$ has probability of error bounded away from 0, since
$$R \le \frac{1}{n} + P_e R + C,$$
i.e. $P_e \ge 1 - \frac{C}{R} - \frac{1}{nR}$.
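The exponent $E_r(R, P_X)$ in the bound above is commonly taken to be Gallager's random-coding exponent, $E_r(R, P_X) = \max_{0 \le \rho \le 1} [E_0(\rho, P_X) - \rho R]$. Assuming that form, the sketch below (channel, crossover probability, and uniform input pmf are my own illustrative choices) evaluates it for a binary symmetric channel and shows that it is strictly positive for $R < C$ and drops to 0 at and above capacity.

    import numpy as np

    eps = 0.1                                   # BSC crossover probability (assumption)
    W = np.array([[1 - eps, eps],               # W[x, y] = P(y | x)
                  [eps, 1 - eps]])
    P = np.array([0.5, 0.5])                    # capacity-achieving input pmf for the BSC

    def E0(rho):
        # Gallager's E_0(rho, P) in bits.
        inner = P @ W ** (1.0 / (1.0 + rho))    # sum_x P(x) W(y|x)^{1/(1+rho)}, one entry per y
        return -np.log2(np.sum(inner ** (1.0 + rho)))

    def Er(R):
        # Random-coding exponent at rate R (bits per channel use), via a grid over rho.
        rhos = np.linspace(0.0, 1.0, 1001)
        return max(E0(rho) - rho * R for rho in rhos)

    C = 1 + eps * np.log2(eps) + (1 - eps) * np.log2(1 - eps)   # BSC capacity, 1 - H_b(eps)
    for R in (0.2, 0.4, C - 0.01, C + 0.01):
        print(f"R = {R:.3f}  (C = {C:.3f}):  E_r(R, P_X) = {Er(R):.4f}")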

Joint source and channel coding

Is $H < C$ enough? Should source and channel coding be done jointly or separately?

Joint source channel coding

Consider a sequence of source symbols $V^n = (V_1, \ldots, V_n)$ and assume that $V^n$ satisfies the AEP. We map the sequence onto a codeword $X(V^n)$. The receiver receives $Y$ (a vector) and forms an estimate $\hat{V}^n = g(Y)$ of $V^n$. We consider the end-to-end probability of error
$$P_e^n = P(\hat{V}^n \ne V^n).$$
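To make the definition of $P_e^n$ concrete, here is a toy Monte Carlo sketch (entirely my own construction; the lecture does not specify a scheme): i.i.d. source bits are sent uncoded over a binary symmetric channel and the block error probability $P(\hat{V}^n \ne V^n)$ is estimated. It shows why the block-level metric is demanding: without coding, $P_e^n$ tends to 1 as $n$ grows.

    import numpy as np

    rng = np.random.default_rng(0)
    eps = 0.05                       # BSC crossover probability (assumption)
    trials = 10_000

    for n in (1, 10, 100, 1000):
        V = rng.integers(0, 2, size=(trials, n))        # i.i.d. source blocks V^n
        flips = rng.random(size=(trials, n)) < eps      # channel bit flips
        V_hat = V ^ flips                               # "decoded" blocks (no coding at all)
        P_e = np.mean(np.any(V_hat != V, axis=1))       # estimate of P(V_hat^n != V^n)
        print(f"n = {n:5d}: estimated P_e^n = {P_e:.3f}   (exact: {1 - (1 - eps) ** n:.3f})")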

Joint source channel coding

$A_\epsilon^{(n)}$ is the typical set: the set of sequences $v^n \in \mathcal{V}^n$ whose probability satisfies
$$2^{-n(H(V)+\epsilon)} \le P_{V^n}(v^n) \le 2^{-n(H(V)-\epsilon)}.$$
Since we assume that our $V$ sequence satisfies the AEP, for every $\epsilon > 0$ and $n$ large enough the typical set has probability
$$\Pr(A_\epsilon^{(n)}) \ge 1 - \epsilon$$
and cardinality upper bounded by $2^{n(H(V)+\epsilon)}$. We encode only the elements of the typical set; the sequences outside it contribute at most $\epsilon$ to the probability of error.
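The following sketch (illustrative parameters of my own choosing: a Bernoulli(0.2) source and $\epsilon = 0.05$) checks these two properties numerically. It groups sequences by their number of ones, since $P_{V^n}(v^n)$ depends only on that count: the probability of $A_\epsilon^{(n)}$ approaches 1 as $n$ grows, while $\log_2 |A_\epsilon^{(n)}|$ stays below $n(H(V)+\epsilon)$.

    from math import comb, log2

    p, eps = 0.2, 0.05                                  # illustrative source bias and epsilon
    H = -p * log2(p) - (1 - p) * log2(1 - p)            # H(V) for a Bernoulli(p) source

    for n in (200, 1000, 4000):
        prob_typical, size_typical = 0.0, 0
        for k in range(n + 1):                          # k = number of ones in v^n
            logP = k * log2(p) + (n - k) * log2(1 - p)  # log2 P_{V^n}(v^n) for such sequences
            if -n * (H + eps) <= logP <= -n * (H - eps):
                count = comb(n, k)                      # number of sequences with k ones
                prob_typical += 2.0 ** (log2(count) + logP)
                size_typical += count
        print(f"n = {n}: Pr(A_eps) = {prob_typical:.4f}, "
              f"log2|A_eps| = {log2(size_typical):.1f} <= n(H(V)+eps) = {n * (H + eps):.1f}")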

Joint source channel coding

We require at most $n(H(V) + \epsilon)$ bits to describe the elements of the typical set. Let us create a set of messages from these sequences. The maximum probability of any message being decoded in error is arbitrarily close to 0 as long as $R = H(V) + \epsilon < C$, for all $n$ large enough. Note: the length of the codeword $X$ may not be $n$, but it grows with $n$. The receiver decodes to attempt to recover an element of the typical set. Because we have assumed that the AEP is satisfied, $\epsilon$ can be chosen to be as small as we want.
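As a concrete, purely illustrative instance of the rate condition (the numbers are my own, not the lecture's): take a Bernoulli(0.1) source and a BSC with crossover 0.05. Then $H(V) \approx 0.469$ bits/symbol and $C \approx 0.714$ bits/use, so any $\epsilon$ with $H(V) + \epsilon < C$ works.

    from math import log2

    def h2(q):                       # binary entropy in bits
        return -q * log2(q) - (1 - q) * log2(1 - q)

    H_V = h2(0.1)                    # entropy of a Bernoulli(0.1) source (assumed example)
    C = 1 - h2(0.05)                 # capacity of a BSC with crossover 0.05 (assumed example)
    eps = (C - H_V) / 2              # any eps with H(V) + eps < C works
    R = H_V + eps
    print(f"H(V) = {H_V:.3f} bits/symbol, C = {C:.3f} bits/use")
    print(f"choose eps = {eps:.3f}: R = H(V) + eps = {R:.3f} < C is {R < C}")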

Joint source channel coding

The probability of error is upper bounded by the probability that the source sequence is not in the typical set, plus the probability that it is in the typical set but a decoding error occurs:
$$P_e^n \le \Pr\left(V^n \notin A_\epsilon^{(n)}\right) + \Pr\left(\text{decoding error} \mid V^n \in A_\epsilon^{(n)}\right),$$
and each term can be made arbitrarily small. Thus, for any upper bound $\epsilon$ on the probability of error, there exists $n$ large enough such that the probability of error is upper bounded by $\epsilon$. Note: the probability of error from not being in the typical set may be higher or lower than that from incorrect decoding.

Thus, we have shown the forward part of the theorem: for any $V_1, \ldots, V_n$ that satisfies the AEP, there exists a source code and a channel code such that $P_e^n \to 0$ as $n \to \infty$. Moreover, the source coding and the channel coding may be done independently.

Converse to the joint source channel coding theorem

We need to show that a probability of error that converges to 0 as $n \to \infty$ implies $H(V) \le C$.

Let us use Fano's inequality:
$$H(V^n \mid \hat{V}^n) \le 1 + n P_e^n \log(|\mathcal{V}|).$$

Let us also use the definition of entropy rate: the entropy rate of a stochastic process $V$ is $\lim_{n \to \infty} \frac{1}{n} H(V^n)$, if it exists. Equivalently, we can decompose
$$\frac{1}{n} H(V^n) = \frac{1}{n} H(V^n \mid \hat{V}^n) + \frac{1}{n} I(V^n; \hat{V}^n).$$
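As a small numerical sanity check of Fano's inequality in the weakened form used here, the sketch below (a toy joint pmf of my own, not from the lecture) draws a random joint distribution of $(V, \hat{V})$ on an alphabet of size 4, computes $H(V \mid \hat{V})$ and $P_e = P(V \ne \hat{V})$, and verifies $H(V \mid \hat{V}) \le 1 + P_e \log_2 |\mathcal{V}|$.

    import numpy as np

    rng = np.random.default_rng(1)
    K = 4                                          # alphabet size |V| (assumption)
    J = rng.random((K, K))                         # unnormalized joint pmf of (V, V_hat)
    J /= J.sum()

    P_vhat = J.sum(axis=0)                         # marginal pmf of V_hat
    H_cond = -np.sum(J * np.log2(J / P_vhat))      # H(V | V_hat)
    P_e = 1.0 - np.trace(J)                        # P(V != V_hat)
    print(f"H(V|V_hat) = {H_cond:.3f} <= 1 + P_e*log2|V| = {1 + P_e * np.log2(K):.3f}")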

Converse to the joint source channel coding theorem

Hence, we have that
$$H(V) \le \frac{1}{n} H(V^n) = \frac{1}{n} H(V^n \mid \hat{V}^n) + \frac{1}{n} I(V^n; \hat{V}^n)$$
$$\le \frac{1}{n}\left(1 + n P_e^n \log(|\mathcal{V}|)\right) + \frac{1}{n} I(V^n; \hat{V}^n)$$
$$= \frac{1}{n} + P_e^n \log(|\mathcal{V}|) + \frac{1}{n} I(V^n; \hat{V}^n)$$
$$\le \frac{1}{n} + P_e^n \log(|\mathcal{V}|) + \frac{1}{n} I(X; Y) \qquad \text{(data processing theorem)}$$
$$\le \frac{1}{n} + P_e^n \log(|\mathcal{V}|) + C.$$
Since $P_e^n \to 0$, it must be that $H(V) \le C$ for the above to hold for all $n$.

Robustness of joint source channel coding theorem and its converse

The joint source channel coding theorem and its converse hold under very general conditions:
- memory in the input
- memory in the channel state
- multiple access
- but not under broadcast conditions

A brain teaser

MIT OpenCourseWare
http://ocw.mit.edu

6.441 Information Theory
Spring 2010

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.