ECE 74 - Advanced Communication Theory, Spring 2009 Homework #1 (INCOMPLETE)

1. A Huffman code finds the optimal codewords to assign to a given block of source symbols.

(a) Show that the code ___ cannot be a Huffman code for any source distribution in which every string to be coded has non-zero probability.

(b) For a source producing an IID sequence of discrete random variables, each drawn from source alphabet X, it has been found that a Huffman code on blocks of length 2 (i.e., 2 source symbols are taken at a time) has rate ___ bits/symbol, and a Huffman code on blocks of length 3 (i.e., 3 source symbols are taken at a time) has rate ___ bits/symbol. Find upper and lower bounds on the first-order entropy of the source. Find upper and lower bounds on the entropy rate of the source. What can be deduced about the size of the source alphabet?

(c) Does there exist a source producing an IID sequence of discrete random variables, and integers n and n' > n, such that a Huffman code on blocks of length n has a (strictly) smaller rate (in bits/symbol) than a Huffman code on blocks of length n'?

2. An independent and identically distributed (IID) binary source has P(X = 0) = ___ and P(X = 1) = ___ .

(a) Suppose that I take blocks of 3 bits at a time from this source as the input to my source coder. Find a Huffman code for this situation (i.e., a Huffman code that assigns to each 3-bit input sequence a variable-length output bit sequence). Find the rate of your code in average output bits per input bit. Also (do not forget this part), use the entropy of the source to show that the rate of your Huffman code falls between some easily obtained (but non-trivial) bounds. Your boss has read a little information theory and has fallen in love with Shannon-Fano coding. Find a Shannon-Fano code for the source (i.e., a Shannon-Fano code that assigns to each 3-bit input sequence a variable-length output bit sequence) and verify that its rate is consistent with your earlier answers. Hint: the Shannon-Fano definition is non-constructive, but you should be able to find a code that meets the definition.

(b) Now your boss comes to you with two pieces of good news: (i) there are no complexity limits (i.e., you can let your block length be as big as you want), and (ii) the source, which is stationary, is correlated. In fact, he tells you that, for any n, the conditional probability of the n-th symbol depends only on the previous symbol; that is,

P(X_n = x_n | X_{n-1} = x_{n-1}, X_{n-2} = x_{n-2}, ..., X_1 = x_1) = P(X_n = x_n | X_{n-1} = x_{n-1}),

with the conditional probabilities P(X_n = x_n | X_{n-1} = x_{n-1}), for x_{n-1}, x_n in {0, 1}, given by a 2 x 2 table. [Table entries lost in transcription.] Using stationarity and this conditional probability, you can show that the marginal probabilities are still P(X_n = 0) = ___ and P(X_n = 1) = ___ . Find the rate that I tell my boss to expect out of the source coder and compare it to your answer from (a).
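As a sanity check for problem 2(a), the Python sketch below builds a binary Huffman code on 3-bit blocks of an IID binary source and compares the resulting rate with the per-input-bit bounds H(X) <= rate < H(X) + 1/3 that follow from applying the standard Huffman bounds to each 3-bit block. Since the source probabilities in the problem did not survive transcription, the value P(X = 1) = 0.1 is only an assumed placeholder.

```python
import heapq
from itertools import product
from math import log2

p1 = 0.1                      # assumed P(X = 1); substitute the value from the problem
p0 = 1.0 - p1

# Probability of each 3-bit input block under the IID model.
block_probs = {
    "".join(bits): (p0 ** bits.count("0")) * (p1 ** bits.count("1"))
    for bits in product("01", repeat=3)
}

def huffman_lengths(probs):
    """Return {symbol: codeword length} for an optimal binary Huffman code."""
    heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    lengths = {s: 0 for s in probs}
    next_id = len(heap)
    while len(heap) > 1:
        pa, _, syms_a = heapq.heappop(heap)
        pb, _, syms_b = heapq.heappop(heap)
        for s in syms_a + syms_b:     # every merge adds one bit to these codewords
            lengths[s] += 1
        heapq.heappush(heap, (pa + pb, next_id, syms_a + syms_b))
        next_id += 1
    return lengths

lengths = huffman_lengths(block_probs)
avg_len = sum(block_probs[s] * lengths[s] for s in block_probs)   # output bits per block
rate = avg_len / 3                                                # output bits per input bit

H = -(p0 * log2(p0) + p1 * log2(p1))                              # entropy of one source bit
print(f"Huffman rate on 3-bit blocks: {rate:.4f} output bits per input bit")
print(f"Bounds: H(X) = {H:.4f} <= rate < H(X) + 1/3 = {H + 1/3:.4f}")
```

For problem 2(b), the rate to report to the boss is the entropy rate H(X_n | X_{n-1}) of the stationary Markov source. The transition probabilities in the problem were lost in transcription, so the matrix below is an assumed example; the computation itself (stationary distribution, then the probability-weighted row entropies) is the point of the sketch.

```python
from math import log2

# Assumed transition matrix: row i is the distribution of X_n given X_{n-1} = i.
P = [[0.9, 0.1],
     [0.1, 0.9]]

# Stationary distribution of a two-state chain: pi(0) = P[1][0] / (P[0][1] + P[1][0]).
pi0 = P[1][0] / (P[0][1] + P[1][0])
pi = [pi0, 1.0 - pi0]

# Entropy rate H(X_n | X_{n-1}) = sum_i pi(i) * H(row i), in bits per source symbol.
H_rate = -sum(pi[i] * P[i][j] * log2(P[i][j])
              for i in range(2) for j in range(2) if P[i][j] > 0)
print(f"entropy rate = {H_rate:.4f} bits/source symbol")
```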

3. Consider discrete random variables X, Y, and Z. Suppose that X and Z are conditionally independent given Y, i.e., X → Y → Z form a Markov chain. Are each of the following statements true or false? ("True" means the statement holds for all X, Y, Z such that X and Z are conditionally independent given Y. "False" means there exist X, Y, Z, with X and Z conditionally independent given Y, for which the statement does not hold.) Be sure to give justification for your answers.

(a)-(f) [Six statements relating the entropies and mutual informations of X, Y, and Z; the expressions were lost in transcription.]

4. A discrete-valued source outputs an independent and identically distributed (IID) sequence of random variables X_1, X_2, ..., each drawn from a finite source alphabet X. I have not been able to find the probability mass function p_X(x); however, I have been able to show that for any n there is a set A_n of length-n source strings with the following three properties: (i) A_n is a subset of X^n; (ii) P((X_1, X_2, ..., X_n) ∈ A_n) → 1 as n → ∞; (iii) |A_n| ≤ ___ . What can be said about the entropy of the source?

5. The following is a formal restatement of our theorem for almost-lossless fixed-length source codes. Let X_1, X_2, ... be an IID sequence of discrete random variables with common distribution p_X(x) and first-order entropy H(X).

(1) For any R > H(X) and any ε > 0, there exists a fixed-length source code with rate less than R and probability of decoding error P_e < ε.

(2) Conversely, for any R < H(X) and any ε > 0, there is a (large) integer n_0 such that for all n ≥ n_0, a fixed-length source code that operates on n symbols at a time has probability of decoding error bounded as P_e ≥ ___ .

By applying the Typical Sequences Theorem, show:

(a) the positive statement (statement (1));

(b) the converse (statement (2));

(c) that for R < H(X), P_e → 1 as n → ∞.
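Problems 4 and 5 both lean on the Typical Sequences Theorem. The sketch below makes the theorem concrete for a Bernoulli source with an assumed parameter p = 0.2: it groups length-n sequences by their number of ones, measures how much probability the weakly typical set captures as n grows, and checks that the set contains roughly 2^(n H(X)) sequences.

```python
from math import comb, log2

p = 0.2                                       # assumed P(X = 1)
H = -(p * log2(p) + (1 - p) * log2(1 - p))    # entropy in bits per symbol
eps = 0.05                                    # typicality tolerance

for n in (50, 200, 800):
    prob_typical = 0.0
    count = 0.0
    for k in range(n + 1):                    # k = number of ones in the sequence
        per_symbol_logprob = (k * log2(p) + (n - k) * log2(1 - p)) / n
        if abs(-per_symbol_logprob - H) <= eps:          # weak typicality test
            prob_typical += comb(n, k) * (p ** k) * ((1 - p) ** (n - k))
            count += comb(n, k)
    print(f"n={n:4d}: P(typical) = {prob_typical:.4f}, "
          f"log2|A_n| / n = {log2(count) / n:.4f}  (H = {H:.4f})")
```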

6. A source produces an IID sequence of discrete random variables, each drawn from a source alphabet whose letters occur with respective probabilities ___ . We want to encode this source with a fixed-length almost-lossless source code. Assume that a (very large) number n of source symbols are taken at a time and mapped into a block of binary digits.

(a) Assuming n can be chosen arbitrarily large, find the infimum of the rates for which the decoding error probability can be made arbitrarily small.

(b) Now suppose that I restrict each of the output sequences of binary digits to have at most ___ 1's. Assuming n can be chosen arbitrarily large, estimate the infimum of the rates for which the decoding error probability can be made arbitrarily small. (Yes, there is a nice closed-form solution.)

7. Consider two discrete memoryless channels, channel 1 and channel 2, with capacities C_1 and C_2, respectively. Assume that the output alphabet of channel 1 is equal to the input alphabet of channel 2 and that they are connected in cascade, X → channel 1 → Y → channel 2 → Z. Show that the capacity of the cascade of channels (i.e., the channel with input X and output Z) is less than or equal to the minimum of C_1 and C_2.

8. Find the capacity of the channel shown below. [Channel transition diagram lost in transcription.]
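The general argument for problem 7 goes through the data processing inequality, but a concrete case is easy to check numerically: two binary symmetric channels in cascade form another BSC whose crossover probability is the chance that exactly one of the two channels flips the bit. The crossover probabilities below are assumed example values, not taken from the problem.

```python
from math import log2

def Hb(q):
    """Binary entropy function in bits."""
    return 0.0 if q in (0.0, 1.0) else -(q * log2(q) + (1 - q) * log2(1 - q))

def bsc_capacity(q):
    return 1.0 - Hb(q)

q1, q2 = 0.1, 0.2                              # assumed crossover probabilities
q_cascade = q1 * (1 - q2) + q2 * (1 - q1)      # X -> Y -> Z is flipped iff exactly one channel flips

C1, C2, C12 = bsc_capacity(q1), bsc_capacity(q2), bsc_capacity(q_cascade)
print(f"C1 = {C1:.4f}, C2 = {C2:.4f}, cascade capacity = {C12:.4f}")
assert C12 <= min(C1, C2) + 1e-12              # the cascade never beats either channel alone
```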

9. This is an old exam problem:

(a) Find the capacity of the discrete memoryless channel (DMC) denoted by: [transition diagram lost in transcription].

(b) Find the capacity of the discrete memoryless channel (DMC) denoted by: [transition diagram lost in transcription].

(c) Find the capacity of the discrete memoryless channel (DMC) denoted by: [transition diagram lost in transcription], where each of the transitions shown has probability ___ .
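The transition diagrams for problems 8 and 9 did not survive the transcription, but once a channel's transition matrix W(y|x) is written down, its capacity can be computed numerically with the Blahut-Arimoto algorithm and compared against whatever closed form the diagram suggests. The sketch below uses a binary symmetric channel with an assumed crossover probability of 0.1 purely to exercise the routine; the closed form 1 - Hb(0.1) is the check.

```python
import numpy as np

def blahut_arimoto(W, iters=500):
    """Capacity (bits/channel use) of the DMC with row-stochastic matrix W[x, y] = P(Y=y | X=x)."""
    W = np.asarray(W, dtype=float)
    p = np.full(W.shape[0], 1.0 / W.shape[0])          # input distribution, refined each pass
    for _ in range(iters):
        q = p @ W                                       # induced output distribution
        with np.errstate(divide="ignore", invalid="ignore"):
            d = np.where(W > 0, W * np.log2(W / q), 0.0).sum(axis=1)   # D(W(.|x) || q) in bits
        p = p * np.exp2(d)                              # Blahut-Arimoto update
        p /= p.sum()
    q = p @ W
    with np.errstate(divide="ignore", invalid="ignore"):
        d = np.where(W > 0, W * np.log2(W / q), 0.0).sum(axis=1)
    return float(p @ d)                                 # = I(X; Y) at the final input distribution

eps = 0.1                                               # assumed crossover probability
W_bsc = [[1 - eps, eps], [eps, 1 - eps]]
print(f"Blahut-Arimoto capacity : {blahut_arimoto(W_bsc):.6f} bits/use")
print(f"Closed form 1 - Hb(eps) : {1 + eps * np.log2(eps) + (1 - eps) * np.log2(1 - eps):.6f} bits/use")
```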

10. Suppose that we are trying to guess X from an observation of Y with the decision rule x̂(y). Let P_e = P(x̂(Y) ≠ X).

(a) Show that an optimum decision rule can never have P_e greater than (|X| − 1)/|X|.

(b) Show that Fano's function is maximized at (|X| − 1)/|X|.

(c) Suppose that ___ . Show that an optimal decision rule has P_e = (|X| − 1)/|X|.
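A small numerical check for problem 10: for a randomly drawn joint pmf, the MAP rule x̂(y) = argmax_x P(x | y) is an optimum decision rule, and its error probability never exceeds (|X| − 1)/|X|. The second half of the sketch assumes that the course's "Fano function" is the usual g(p) = Hb(p) + p log2(|X| − 1) from Fano's inequality and verifies the maximizer claimed in part (b); the alphabet sizes are arbitrary choices, not from the problem.

```python
import random
from math import log2

random.seed(1)
nx, ny = 4, 3                                        # assumed alphabet sizes |X|, |Y|
joint = [[random.random() for _ in range(ny)] for _ in range(nx)]
total = sum(sum(row) for row in joint)
joint = [[v / total for v in row] for row in joint]  # normalize to a joint pmf P(X=x, Y=y)

# MAP rule: for each y, guess the x maximizing P(X=x, Y=y) (same argmax as P(x | y)).
pe = 0.0
for y in range(ny):
    col = [joint[x][y] for x in range(nx)]
    pe += sum(col) - max(col)                        # error probability contributed by this y
print(f"P_e(MAP) = {pe:.4f}  <=  (|X|-1)/|X| = {(nx - 1) / nx:.4f}")

# Part (b): g(p) = Hb(p) + p*log2(|X|-1) peaks at p = (|X|-1)/|X|, where it equals log2|X|.
def g(p):
    hb = 0.0 if p in (0.0, 1.0) else -(p * log2(p) + (1 - p) * log2(1 - p))
    return hb + p * log2(nx - 1)

p_star = (nx - 1) / nx
grid_max = max(g(i / 1000) for i in range(1001))
print(f"max over grid = {grid_max:.4f}, g(p*) = {g(p_star):.4f}, log2|X| = {log2(nx):.4f}")
```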