ECS 452: Digital Communication Systems 2015/2. HW 1 Due: Feb 5


ECS 452: Digital Communication Systems 2015/2 HW 1 (Due: Feb 5)
Lecturer: Asst. Prof. Dr. Prapun Suksompong

Instructions
(a) You must solve all non-optional problems. (5 pt)
    (i) Write your first name and the last three digits of your student ID on the upper-right corner of every submitted page.
    (ii) For each part, write your explanation/derivation and answer in the space provided.
(b) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.
(c) Late submissions will be rejected.
(d) Write down all the steps that you took to obtain your answers. You may not get full credit, even when your answer is correct, if you do not show how you obtained it.

Problem 1 (HW ). Consider the code {0, 01}.
(a) Is it nonsingular?
(b) Is it uniquely decodable?
(c) Is it prefix-free?
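For part (c), a by-hand argument suffices, but the property can also be checked programmatically. The following base-MATLAB sketch (not part of the original assignment) tests whether any codeword is a prefix of another codeword.

% Prefix-free check for the code of Problem 1 (illustrative sketch only)
C = {'0', '01'};                        % candidate codewords
prefixFree = true;
for i = 1:numel(C)
    for j = 1:numel(C)
        if i ~= j && strncmp(C{i}, C{j}, length(C{i}))
            prefixFree = false;         % C{i} is a prefix of C{j}
        end
    end
end
prefixFree                              % display the result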

Problem 2 (HW ). Consider the random variable X whose support S_X contains seven values: S_X = {x_1, x_2, ..., x_7}. Their corresponding probabilities are given by

x:        x_1   x_2   x_3   x_4   x_5   x_6   x_7
p_X(x):   [the numerical values were lost in this transcription]

(a) Find the entropy H(X).
(b) Find a binary Huffman code for X.
(c) Find the expected codeword length for the encoding in part (b).

Problem 3 (HW ). Find the entropy and the binary Huffman code for the random variable X with pmf

p_X(x) = x/21 for x = 1, 2, ..., 6, and p_X(x) = 0 otherwise.

Also calculate E[l(X)] when the Huffman code is used. (A numerical check is sketched after Problem 4 below.)

Problem 4 (HW ). These codes cannot be Huffman codes. Why?
(a) {00, 01, 10, 110}
(b) {01, 10}
(c) {0, 01}
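The pmf in Problem 3 is fully specified, so H(X) and the expected Huffman codeword length can be checked numerically. Below is a minimal base-MATLAB sketch (not part of the original assignment); it uses the fact that, for a binary Huffman code, E[l(X)] equals the sum of the probabilities of all merged nodes created during the Huffman construction.

p = (1:6)/21;                       % pmf of Problem 3: p_X(x) = x/21 for x = 1,...,6
H = -sum(p .* log2(p));             % entropy H(X) in bits
q = sort(p);                        % working copy for the Huffman merging
EL = 0;                             % accumulates E[l(X)]
while numel(q) > 1
    m = q(1) + q(2);                % merge the two least likely nodes
    EL = EL + m;                    % each merge contributes its mass to E[l(X)]
    q = sort([m, q(3:end)]);
end
fprintf('H(X) = %.4f bits, E[l(X)] = %.4f bits\n', H, EL);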

Problem 5 (HW ). A memoryless source emits two possible messages, Y(es) and N(o), with probabilities 0.9 and 0.1, respectively.
(a) Determine the entropy (per source symbol) of this source.
(b) Find the expected codeword length per symbol of the Huffman binary code for the third-order extension of this source.
(c) Use MATLAB to find the expected codeword length per (source) symbol of the Huffman binary code for the fourth-order extension of this source.
    (i) Put your answer here.
    (ii) Don't forget to attach the printout of your MATLAB script (highlighting the modified parts if you start from the provided class example) and the expressions/results displayed in the command window.
(d) Use MATLAB to plot the expected codeword length per (source) symbol of the Huffman binary code for the nth-order extension of this source for n = 1, 2, ..., 8. Attach the printout of your plot.
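One way to attack parts (b)-(d) without a toolbox Huffman routine is sketched below in base MATLAB; it is offered only as an illustration and may differ from the provided class example. The pmf of the nth-order extension is the n-fold Kronecker product of [0.9, 0.1] with itself, and E[l] is again read off from the Huffman merge sums.

p1 = [0.9 0.1];                     % per-symbol pmf: P(Y) = 0.9, P(N) = 0.1
Lper = zeros(1, 8);                 % expected codeword length per source symbol
for n = 1:8
    p = 1;
    for k = 1:n
        p = kron(p, p1);            % pmf of the nth-order extension (2^n entries)
    end
    q = sort(p); EL = 0;
    while numel(q) > 1
        m = q(1) + q(2); EL = EL + m;   % Huffman merge; accumulate E[l]
        q = sort([m, q(3:end)]);
    end
    Lper(n) = EL / n;
end
plot(1:8, Lper, 'o-'); xlabel('n'); ylabel('E[l]/n (bits per source symbol)');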


Problem 6 (HW ). (Optional) The following claim is sometimes found in the literature: "It can be shown that the length l(x) of the Huffman codeword of a symbol x with probability p_X(x) is always less than or equal to ⌈log_2(1/p_X(x))⌉." Even though it is correct in many cases, this claim is not true in general. Find an example where the length l(x) of the Huffman codeword of a symbol x is greater than ⌈log_2(1/p_X(x))⌉.
Hint: Consider a pmf that has the following four probability values: {0.01, 0.30, 0.34, 0.35}. (A numerical check is sketched after Problem 7 below.)

Problem 7 (HW ). (Optional) Construct a random variable X (by specifying its pmf) whose corresponding Huffman code is {0, 10, 11}.
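For Problem 6, the hint pmf is small enough that the individual Huffman codeword lengths can be tabulated by tracking which symbols sit under each merged node. The base-MATLAB sketch below (an illustration, not the intended written argument) prints the Huffman lengths next to ⌈log_2(1/p_X(x))⌉ so the two can be compared.

p = [0.01 0.30 0.34 0.35];          % hint pmf from Problem 6
len = zeros(size(p));               % Huffman codeword length of each symbol
groups = num2cell(1:numel(p));      % original symbols contained in each active node
w = p;                              % weights of the active nodes
while numel(w) > 1
    [w, order] = sort(w);
    groups = groups(order);
    merged = [groups{1}, groups{2}];    % symbols under the new merged node
    len(merged) = len(merged) + 1;      % each such symbol gains one bit of depth
    w = [w(1) + w(2), w(3:end)];
    groups = [{merged}, groups(3:end)];
end
disp([len; ceil(log2(1 ./ p))])     % row 1: Huffman lengths; row 2: ceil(log_2(1/p))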

ECS 452: Digital Communication Systems 2015/2 HW 2 (Due: Feb 19)
Lecturer: Asst. Prof. Dr. Prapun Suksompong

Instructions
(a) You must solve all non-optional problems. (5 pt)
    (i) Write your first name and the last three digits of your student ID on the upper-right corner of every submitted page.
    (ii) For each part, write your explanation/derivation and answer in the space provided.
(b) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.
(c) Late submissions will be rejected.
(d) Write down all the steps that you took to obtain your answers. You may not get full credit, even when your answer is correct, if you do not show how you obtained it.

Problem 1. In this question, each output string from a DMS is encoded by the following source code:

x    Codeword c(x)
a    1
d    01
e    0000
i    001
o    [lost in transcription]
u    [lost in transcription]

(a) Is the code prefix-free?
(b) Is the code uniquely decodable?

(c) Suppose the DMS produces the string audio. Find the output of the source encoder.
(d) Suppose the output of the source encoder is [the binary string was lost in this transcription]. Find the corresponding source string produced by the DMS. Use / to indicate the locations where the string above is split into codewords.

Problem 2. A DMC has X = {0, 1} and Y = {1, 2, 3}. The following decoding table is used to decode the channel output:

y    x̂(y)
[table entries lost in transcription]

Suppose the channel output string is [lost in transcription].
(a) Find the corresponding decoded string.
(b) Suppose the channel input string is produced from an ASCII source encoder by the command dec2bin(sourcestring,7) in MATLAB. Assume that there is no channel decoding error. Find the corresponding source string.
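Part (b) of Problem 2 amounts to inverting dec2bin(sourcestring,7). A minimal base-MATLAB sketch of that inversion is shown below; the string 'Hi' is only a stand-in, since the actual channel data were not preserved in this transcription.

% Invert dec2bin(sourceString, 7): regroup the bit stream into 7-bit blocks
bits = dec2bin(double('Hi'), 7);        % stand-in forward encoding (a 2x7 char array)
bitstream = reshape(bits.', 1, []);     % flatten row-wise into one 0/1 character string
B = reshape(bitstream, 7, []).';        % regroup into rows of 7 bits each
recovered = char(bin2dec(B)).'          % convert each 7-bit block back to its ASCII character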

Problem 3 (HW ). Consider a BSC whose crossover probability for each bit is p = [value lost in transcription]. Suppose P[X = 0] = [value lost in transcription].
(a) Draw the channel diagram.
(b) Find the channel matrix Q.
(c) Find the joint pmf matrix P.
(d) Find the row vector q which contains the pmf of the channel output Y.
(e) We now analyze the performance of all four reasonable detectors for this binary channel. Complete the table below, in which each column corresponds to one detector x̂(·) and the rows are x̂(0), x̂(1), P[X̂ = 0 | X = 0], P[X̂ = 1 | X = 1], P(C), and P(E).
(f) Find the MAP detector and its error probability.
(g) Find the ML detector and its error probability.
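Because the numerical p and P[X = 0] for Problem 3 did not survive this transcription, the sketch below uses placeholder values purely to illustrate the mechanics: form the joint pmf matrix P = diag(p)*Q, read the MAP detector from the column-wise maxima of P and the ML detector from the column-wise maxima of Q, and then compute the error probabilities. This mirrors the approach of the posted solution script further down, but it is not itself the official solution.

% Placeholder numbers only; substitute the actual p and P[X = 0] from the handout.
p0 = 0.6;                           % assumed P[X = 0]
cp = 0.1;                           % assumed crossover probability
p  = [p0, 1 - p0];                  % prior on X = 0, 1
Q  = [1 - cp, cp; cp, 1 - cp];      % BSC transition matrix Q(y|x)
P  = diag(p) * Q;                   % joint pmf matrix P(x, y)
q  = p * Q;                         % output pmf of Y
[~, iMAP] = max(P);                 % MAP detector: argmax_x P(x, y) for each column y
[~, iML]  = max(Q);                 % ML detector:  argmax_x Q(y|x) for each column y
PE_MAP = 1 - sum(P(sub2ind(size(P), iMAP, 1:2)));   % 1 - P(correct) under the MAP rule
PE_ML  = 1 - sum(P(sub2ind(size(P), iML,  1:2)));   % 1 - P(correct) under the ML rule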

Problem 4 (HW ). Consider a DMC whose X = {1, 2, 3}, Y = {1, 2, 3}, and Q = [matrix entries lost in transcription]. Suppose the input probability vector is p = [0.2, 0.4, 0.4].
(a) Find the joint pmf matrix P.
(b) Find the row vector q which contains the pmf of the channel output Y.
(c) Find the error probability of the naive decoder.
(d) Find the error probability of the (DIY) decoder x̂(y) = 4 - y.
(e) Find the MAP detector and its error probability.
(f) Find the ML detector and its error probability.
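For parts (c) and (d) of Problem 4, the error probability of any fixed decoder table x̂(·) is one minus the probability of a correct decision, summed over the outputs. A small base-MATLAB sketch is given below; since the entries of Q were lost in this transcription, the matrix used here is only a placeholder chosen so the code runs.

p = [0.2, 0.4, 0.4];                % given input pmf
Q = [0.5, 0.3, 0.2;                 % placeholder transition matrix (rows sum to 1);
     0.2, 0.5, 0.3;                 % replace with the Q from the handout
     0.3, 0.2, 0.5];
P = diag(p) * Q;                    % joint pmf matrix P(x, y)
decoders = {[1, 2, 3], [3, 2, 1]};  % naive decoder x_hat(y) = y, and DIY decoder x_hat(y) = 4 - y
PE = zeros(1, 2);
for d = 1:2
    dec = decoders{d};
    PC = 0;
    for y = 1:3
        PC = PC + P(dec(y), y);     % probability of a correct decision via output y
    end
    PE(d) = 1 - PC;
end
PE                                  % [P(E) for the naive decoder, P(E) for the DIY decoder]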

Problem 5 (HW ). Consider a BAC whose Q(1|0) = 0.35 and Q(0|1) = [value lost in transcription]. Suppose P[X = 0] = 0.4.
(a) Draw the channel diagram.
(b) Find the joint pmf matrix P.

(c) Find the row vector q which contains the pmf of the channel output Y.
(d) We now analyze the performance of all four reasonable detectors for this binary channel. Complete the table below, in which each column corresponds to one detector x̂(·) and the rows are x̂(0), x̂(1), P[X̂ = 0 | X = 0], P[X̂ = 1 | X = 1], P(C), and P(E).

(e) Find the MAP detector and its error probability.
(f) Find the ML detector and its error probability.

Problem 6 (HW ). (Optional) Consider a DMC whose samples of input X and output Y are recorded as row vectors x and y in the file HW_DMC_Channel_Data.mat. Write a MATLAB script which uses the recorded information to estimate the quantities below. Note that most of these can be solved by appropriate parts of the m-files posted on the course web site.
(a) The support S_X of X.
(b) The support S_Y of Y.
(c) The row vector p which contains the pmf of X.
(d) The Q matrix.
(e) The row vector q which contains the pmf of Y. Do this using two methods:
    (i) Count directly from the observed values of Y.
    (ii) Use the estimated values of p and Q.

(f) The error probability when the naive decoder is used. Do this using two methods:
    (i) Directly construct x̂ from y. Then, compare x̂ and x.
    (ii) Use the estimated values of p and Q.
(g) The error probability when the MAP decoder is used. Do this using two methods:
    (i) First find the MAP decoder table using the estimated values of p and Q. Then, construct x̂ from y according to the decoder table. Finally, compare x̂ and x.
    (ii) Use the estimated values of p and Q to directly calculate the error probability.
(h) The error probability when the ML decoder is used. Do this using two methods:
    (i) First find the ML decoder table using the estimated value of Q. Then, construct x̂ from y according to the decoder table. Finally, compare x̂ and x.
    (ii) Use the estimated values of p and Q to directly calculate the error probability.

ECS452_HW_2015_2_Sol: Q6 MATLAB Simulation of the system in Q4

% MATLAB Script for Q6 of HW2 for ECS 452
% By Asst. Prof. Dr. Prapun Suksompong
close all; clear all; tic

load HW_DMC_Channel_Data

% (a)
S_X = unique(x)
% (b)
S_Y = unique(y)

%% Statistical Analysis
n = length(x);

% (c) The probability values for the channel inputs
p_x_sim = hist(x,S_X)/n     % Relative frequencies from the simulation

% (d) The channel transition probabilities from the simulation
Q_sim = [];
for k = 1:length(S_X)
    I = find(x==S_X(k));
    LI = length(I);
    rel_freq_xk = LI/n;
    yc = y(I);
    cond_rel_freq = hist(yc,S_Y)/LI;
    Q_sim = [Q_sim; cond_rel_freq];
end
Q_sim                       % Relative frequencies from the simulation

% (e.i)
p_y_sim = hist(y,S_Y)/n     % Relative frequencies from the simulation
% (e.ii)
p_y_sim2 = p_x_sim*Q_sim

%%
p_x = p_x_sim; Q = Q_sim;

% (f)
%% Naive Decoder
x_hat = y;

% (f.i) Error Probability
PE_sim_Naive = 1-sum(x==x_hat)/n    % Error probability from the simulation

% (f.ii) Calculation of the theoretical error probability
PC = 0;
for k = 1:length(S_X)
    t = S_X(k);
    i = find(S_Y == t);
    if length(i) == 1
        PC = PC + p_x(k)*Q(k,i);
    end
end
PE_theoretical_Naive = 1-PC

% (g)
%% MAP Decoder
P = diag(p_x)*Q;            % Weight the channel transition probability by the
                            % corresponding prior probability.
[V I] = max(P);             % For I, the default MATLAB behavior is that when there
                            % are multiple max, the index of the first one is returned.
Decoder_Table_MAP = S_X(I)  % The decoded values corresponding to the received Y

% (g.i) Decode according to the decoder table
x_hat = y;                  % preallocation
for k = 1:length(S_Y)
    I = (y==S_Y(k));
    x_hat(I) = Decoder_Table_MAP(k);
end
PE_sim_MAP = 1-sum(x==x_hat)/n      % Error probability from the simulation

% (g.ii) Calculation of the theoretical error probability
Decoder_Table = Decoder_Table_MAP;
PC = 0;
for k = 1:length(S_X)
    I = (Decoder_Table == S_X(k));
    Q_row = Q(k,:);
    PC = PC + p_x(k)*sum(Q_row(I));
end
PE_theoretical_MAP = 1-PC

% (h)
%% ML Decoder
[V I] = max(Q);             % For I, the default MATLAB behavior is that when there
                            % are multiple max, the index of the first one is returned.
Decoder_Table_ML = S_X(I)   % The decoded values corresponding to the received Y

% Decode according to the decoder table
x_hat = y;                  % preallocation
for k = 1:length(S_Y)
    I = (y==S_Y(k));
    x_hat(I) = Decoder_Table_ML(k);
end
% (h.i)
PE_sim_ML = 1-sum(x==x_hat)/n       % Error probability from the simulation

% (h.ii) Calculation of the theoretical error probability
Decoder_Table = Decoder_Table_ML;
PC = 0;
for k = 1:length(S_X)
    I = (Decoder_Table == S_X(k));
    Q_row = Q(k,:);
    PC = PC + p_x(k)*sum(Q_row(I));
end
PE_theoretical_ML = 1-PC
toc

Results in the command window:
>> HW_DMC_Channel_Estimation_2
(The script displayed S_X, S_Y, p_x_sim, Q_sim, p_y_sim, p_y_sim2, PE_sim_Naive, PE_theoretical_Naive, Decoder_Table_MAP, PE_sim_MAP, PE_theoretical_MAP, Decoder_Table_ML, PE_sim_ML, PE_theoretical_ML, and the elapsed time; the numerical values were not preserved in this transcription.)


ECS 452: Digital Communication Systems 2015/2 HW 3 (Due: Feb 26)
Lecturer: Asst. Prof. Dr. Prapun Suksompong

Instructions
(a) Solve all non-optional problems. (5 pt)
    (i) Write your first name and the last three digits of your student ID on the upper-right corner of every submitted page.
    (ii) For each part, write your explanation/derivation and answer in the space provided.
(b) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.
(c) Late submissions will be rejected.
(d) Write down all the steps that you took to obtain your answers. You may not get full credit, even when your answer is correct, if you do not show how you obtained it.

Problem 1 (HW ). Consider a repetition code with a code rate of 1/5. Assume that the code is used with a BSC with crossover probability p = 0.4.
(a) Find the ML detector and its error probability.

(b) Suppose the info-bit S is generated with P[S = 0] = 1 - P[S = 1] = 0.4. Find the MAP detector and its error probability.
(c) Assume the info-bit S is generated with P[S = 0] = 1 - P[S = 1] = 0.45. Suppose the receiver observes 01001.
    (i) What is the probability that 0 was transmitted? (Do not forget that this is a conditional probability. The answer is not 0.45, because we have some extra information from the observed bits at the receiver.)

    (ii) What is the probability that 1 was transmitted?
    (iii) Given the observed 01001, which event is more likely: that S = 1 was transmitted or that S = 0 was transmitted? Does your answer agree with the majority voting rule for decoding?
(d) Assume that the source produces the source bit S with P[S = 0] = 1 - P[S = 1] = p_0. Suppose the receiver observes 01001.
    (i) What is the probability that 0 was transmitted?
    (ii) What is the probability that 1 was transmitted?

    (iii) Given the observed 01001, which event is more likely: that S = 1 was transmitted or that S = 0 was transmitted? Your answer may depend on the value of p_0. Does your answer agree with the majority voting rule for decoding?

Problem 2 (HW ). A channel encoder maps blocks of two bits to five-bit (channel) codewords. The four possible codewords are 00000, 01000, 10001, and [the fourth codeword was lost in this transcription]. A codeword is transmitted over the BSC with crossover probability p = 0.1.
(a) What is the minimum (Hamming) distance d_min among the codewords?
(b) Suppose the codeword x = [lost in transcription] was transmitted. What is the probability that the receiver observes y = [lost in transcription] at the output of the BSC?

(c) Suppose the receiver observes [word lost in transcription] at the output of the BSC.
    (i) Assume that all four codewords are equally likely to be transmitted. Given the observation at the receiver, what is the most likely codeword that was transmitted?
    (ii) Assume that the four codewords are not equally likely. Suppose one codeword [which one was lost in transcription] is transmitted more frequently, with probability 0.7, while the other three codewords are transmitted with probability 0.1 each. Given the observation at the receiver, what is the most likely codeword that was transmitted?
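The computations in Problem 2 (and in the repetition code of Problem 1, which is the special case with codewords 00000 and 11111) reduce to Hamming distances and BSC likelihoods of the form p^d (1-p)^(5-d). The base-MATLAB sketch below illustrates this; the fourth codeword, the received word, and the prior ordering are placeholders, since those specifics were lost in this transcription.

% Placeholder data: replace C(4,:), y, and the prior with the values from the handout.
C = ['00000'; '01000'; '10001'; '11111'] - '0';  % codewords as rows of 0/1 (the 4th row is a guess)
y = '01001' - '0';                               % placeholder received word
p = 0.1;                                         % BSC crossover probability
D = sum(bsxfun(@ne, C, y), 2);                   % Hamming distance from y to each codeword
dmin = inf;                                      % minimum distance among the codewords
for i = 1:size(C,1)
    for j = i+1:size(C,1)
        dmin = min(dmin, sum(C(i,:) ~= C(j,:)));
    end
end
lik = (p .^ D) .* ((1 - p) .^ (5 - D));          % P(y | codeword) for the BSC
[~, iML] = max(lik);                             % ML codeword = closest in Hamming distance (since p < 1/2)
prior = [0.7; 0.1; 0.1; 0.1];                    % placeholder prior (which codeword gets 0.7 is unknown)
[~, iMAP] = max(prior .* lik);                   % MAP codeword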

Problem 3 (HW , Optional). Optimal code lengths that require one bit above entropy: The source coding theorem says that the Huffman code for a random variable X has an expected length strictly less than H(X) + 1. Give an example of a random variable for which the expected length of the Huffman code (without any source extension) is very close to H(X) + 1.


ECS 452: Digital Communication Systems 2015/2 HW 4 (Due: Not Due)
Lecturer: Asst. Prof. Dr. Prapun Suksompong

Problem 1 (HW , Free). In each row of the table below, compare the entropy H(X) of the random variable X in the first column with the entropy H(Y) of the random variable Y in the third column by writing >, =, or < in the second column. Watch out for approximation and round-off error.

H(X) when p = [0.3, 0.7]   vs.   H(Y) when q = [0.8, 0.2].
H(X) when p = [0.3, 0.3, 0.4]   vs.   H(Y) when q = [0.4, 0.3, 0.3].
H(X) when p_X(x) = 0.3 for x ∈ {1, 2}, 0.2 for x ∈ {3, 4}, and 0 otherwise   vs.   H(Y) when q = [0.4, 0.3, 0.3].

Problem 2 (HW , Free). Consider random variables X and Y whose joint pmf is given by

p_{X,Y}(x, y) = c(x + y) for x ∈ {1, 3} and y ∈ {2, 4}, and 0 otherwise.

Evaluate the following quantities.
(a) c
(b) H(X, Y)
(c) H(X)
(d) H(Y)
(e) H(X|Y)
(f) H(Y|X)
(g) I(X; Y)
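The quantities in Problem 2 can be cross-checked numerically once the joint pmf matrix is written down. The following base-MATLAB sketch (offered only as a check, not as the expected hand derivation) builds p_{X,Y}, normalizes it to find c, and computes the entropies and the mutual information from their definitions.

xs = [1 3]; ys = [2 4];                      % supports of X and Y
Pxy = bsxfun(@plus, xs', ys);                % matrix of (x + y) over the support
c = 1 / sum(Pxy(:));                         % normalization: the joint pmf must sum to 1
Pxy = c * Pxy;                               % joint pmf p_{X,Y}(x, y)
px = sum(Pxy, 2); py = sum(Pxy, 1);          % marginal pmfs of X and Y
H = @(p) -sum(p(p > 0) .* log2(p(p > 0)));   % entropy of a pmf (vector or matrix form)
HXY = H(Pxy); HX = H(px); HY = H(py);
HX_given_Y = HXY - HY;                       % H(X|Y) = H(X,Y) - H(Y)
HY_given_X = HXY - HX;                       % H(Y|X) = H(X,Y) - H(X)
IXY = HX + HY - HXY;                         % I(X;Y) = H(X) + H(Y) - H(X,Y)
fprintf('c = %.4f, H(X,Y) = %.4f, I(X;Y) = %.4f bits\n', c, HXY, IXY);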

Problem 3 (HW , Free). Consider a pair of random variables X and Y whose joint pmf is given by

p_{X,Y}(x, y) = 1/15 for (x, y) = (3, 1); 2/15 for (x, y) = (4, 1); 4/15 for (x, y) = (3, 3); β for (x, y) = (4, 3); and 0 otherwise.

(a) Find the value of the constant β.
(b) Are X and Y independent?
(c) Evaluate the following quantities.
    (i) H(X)
    (ii) H(Y)
    (iii) H(X, Y)
    (iv) H(X|Y)
    (v) H(Y|X)
    (vi) I(X; Y)
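For Problem 3, the same approach as in the previous sketch applies once β is fixed by normalization. Below is a brief base-MATLAB check (again only an illustration) that solves for β and tests independence by comparing the joint pmf with the product of its marginals.

b = 1 - (1/15 + 2/15 + 4/15);                % beta: the joint pmf must sum to 1
Pxy = [1/15, 4/15;                           % rows:    x = 3, 4
       2/15, b];                             % columns: y = 1, 3
px = sum(Pxy, 2); py = sum(Pxy, 1);          % marginal pmfs of X and Y
indep = all(abs(Pxy(:) - reshape(px * py, [], 1)) < 1e-12)   % true iff p(x,y) = p(x)p(y) everywhere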
